
Red Hat 3Scale API Management

Developed with
Red Hat

Contents

  • Agile Integration Technical Overview
  • Red Hat AMQ
  • Combining It All Together

This course is part of the learning path: Applying AGILE Techniques to Build a DevOps Practice
Overview

Difficulty: Intermediate
Duration: 2h 6m
Students: 196
Rating: 3.4/5

Description

In this course, you will learn about the technical platforms that Red Hat offers for integration and messaging. The course begins with a comprehensive look at the OpenShift Container Platform and then dives into Red Hat AMQ, which allows you to connect various application platforms asynchronously and exchange information reliably between them. Moving on to Red Hat Fuse, you will learn how to connect disparate systems through technologies such as Apache Camel. The course also looks at Red Hat's 3scale API Management Platform, a highly versatile system for controlling APIs. Finally, a demonstration shows how these technologies can be used together through an example that implements a Camel route to follow a Twitter account and then translates the Twitter feed into a particular language.

Learning Objectives

  • Gain an in-depth knowledge of the OpenShift Container Platform
  • Learn about Red Hat's technical platforms and how they can be used

Intended Audience

This course is intended for:

  • System administrators, architects, developers, and application administrators

Transcript

In this video, we're going to have a look at the features of the Red Hat 3scale API Management platform, a very versatile platform that allows you to control your APIs. Many applications in a modern microservices architecture communicate using all sorts of APIs, not just RESTful ones, with HTTP being the ubiquitous protocol that most of those applications use. It's becoming quite demanding, first of all, to provide a unified entry point for those APIs; and once a company has developed a number of applications based around certain APIs, it also becomes a compelling option to start offering those APIs to customers and end users.

Now sometimes, obviously, those APIs are a core feature that a company provides, and they may well be given away for free. But for certain more powerful functions, it's very lucrative to be able to create application plans that may include rate limiting, subscription fees, and authentication as well. This is an approach that has been widely adopted by enterprises the world over.

So the 3scale API Management Platform is, again, a set of modular components that can be deployed in various topologies, from being hosted in the public cloud to being deployed on premises. Companies can use this product to bring their APIs under some form of governance that they implement using the 3scale platform.

With the platform, it's possible not just to implement traffic control, meaning filtering of which endpoints are available to which customers; it's also possible to monitor the use of the API: analytics, access-control metrics, and performance metrics. You can implement monetization and support developers, including your customers' developers, with automatic publishing of documentation and similar features. And, of course, since the entire API platform is built on OpenShift, it provides excellent whole-project integration all the way.

It's possible to easily integrate the 3scale API Management platform with all sorts of back-end APIs and use it to provide comprehensive security as a separate layer, rather than complicating your applications by including all sorts of non-functional concerns in the development process. If we have a quick look at the 3scale API Management web interface, we can see that in terms of managing APIs, we can have a number of application plans deployed to the platform, whereby we can control things like trial periods, all sorts of fees, and even automatic approval. In terms of the audiences that we allow to use the API platform, we can also create certain usage rules, such as Account Plans and Service Plans.

So in terms of Account Plans, it's possible to create various accounts for end users, who can even apply for accounts on their own. If we drill down into a sample application plan, we can see that it's also possible to set up pricing rules per usage, even with certain limits, so different prices for different frequencies of use, and rate limits. Usage Limits are defined per period and may have a certain maximum value associated with them, such as, for example, a hundred requests per hour. We'll be looking at some of these features in more detail in the upcoming demonstration, where we deploy a simple REST API and use the 3scale API Management Platform to expose it to different users, provide them with different rate limits, and even filter certain application URLs.
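The "per period, with a maximum value" shape of a Usage Limit can be sketched in a few lines of Python. This is purely an illustrative model (the class name and fixed-window bookkeeping are my own, not 3scale's implementation), using the hundred-requests-per-hour example from above:

```python
from collections import defaultdict

class UsageLimit:
    """Illustrative fixed-window usage limit, e.g. 100 requests per hour.

    A simplified model of the 'period + max value' limits shown in the
    3scale admin UI, not the platform's real implementation.
    """

    def __init__(self, max_hits, period_seconds):
        self.max_hits = max_hits
        self.period = period_seconds
        self.windows = defaultdict(int)  # (user, window index) -> hit count

    def allow(self, user_key, now):
        # Bucket the timestamp into a fixed window for this user.
        window = (user_key, int(now // self.period))
        if self.windows[window] >= self.max_hits:
            return False  # limit exceeded; the gateway would reject the call
        self.windows[window] += 1
        return True

# A 100-requests-per-hour limit, as in the example from the transcript.
limit = UsageLimit(max_hits=100, period_seconds=3600)
results = [limit.allow("user-1", now=10.0) for _ in range(101)]
print(results.count(True))   # 100 requests allowed
print(results[-1])           # False: the 101st request is throttled
```

The limit is tracked per user key, so a second application with its own credential starts with a fresh counter.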

An important feature of the 3scale API platform is the Developer Portal, where we can automatically publish the documentation for our API and even provide self-signup abilities for customers who want to sign up to use our APIs. Let's have a look at how to configure the 3scale API Management Platform to allow access to the back-end API. First, let's have a look at how it's deployed.

So I'm going to open the OpenShift web console and have a look at the 3scale project. Over here we have several application components and, of course, Routes that give us access to the various 3scale modules. The one I'm going to use is the 3scale Administration Console, which I've already opened in another tab. I'm going to log into it, and we'll be presented with the default settings. If I have a look at the API configuration in the Dashboard, we can see that some applications have already been pre-created. Applications, in this context, mean the ability for end users to consume our API. But if I look at the integration settings, I can see that the APIcast configuration is actually using the default Echo API. I want to change that to use my own back-end application, so let's have a look at where it is deployed. It's running in a different OpenShift cluster with a different back-end URL, which I do not want to share with my end users. If we look at the Routes for my application, it's the Twitter API presenter. If I open this in another window and issue an API request such as a health check, we can see the application is responding; let's issue a simple tweet request. There it is.

So I want to expose this application of mine through the API Management Platform, which means I need to copy the base URL of my application and configure the Private Base URL to point at my application's location. The API Gateway is going to expose the application I've just configured on two URLs. One is called the Staging URL, which allows me to test my settings, and the other is the Production URL, which is the external, public access point for API users.

I'll leave the Mapping Rules as they are for now, confirm the settings, and issue a test request just to see if that works. The green bars at the left-hand side confirm that the settings I've entered are valid, and I've already got a simple curl command that I can use to see whether my application is responding through the API gateway. So I'm just going to issue a test request. Note that I haven't actually published valid certificates, so I'm going to tell curl not to complain about certificate trust. And there we are, this is my health response. Let's do another API request, the English tweet endpoint. Here it is. Note that the request has a user key appended to it. This is a key belonging to the default pre-created user in the API platform.
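The requests just shown are ordinary HTTPS calls to the staging route with the application's user_key appended as a query parameter. A small Python sketch makes the URL shape explicit; the hostname and key below are placeholders, not values from the demo:

```python
from urllib.parse import urlencode

# Hypothetical staging route and application credential; substitute the
# values from your own 3scale integration page and API credentials.
STAGING_BASE = "https://api-staging.example.com"
USER_KEY = "0123456789abcdef"

def staging_url(path, user_key=USER_KEY):
    """Build the kind of URL curl was given above: the APIcast staging
    endpoint with the application's user_key as a query parameter."""
    return f"{STAGING_BASE}{path}?{urlencode({'user_key': user_key})}"

print(staging_url("/health"))
print(staging_url("/api/tweet/en"))
```

The `-k` flag passed to curl in the demo is the equivalent of disabling certificate verification here, which is only acceptable against a staging endpoint with self-signed certificates.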

So if I have a look at the Audience settings, we can see that there already is a Developer user owning one application. If I have a look at that developer's application, it's been assigned an API credential, which was used in the request I was just making to test my API settings. What I'm going to do next is apply some limits to this Developer's application. You can see that this Developer is using the API service under the Basic Application Plan. The Basic Application Plan already has a limit of ten hits per minute. But this limit is indiscriminate: if I switch back to the API settings for a moment and look at the Integration Configuration in the APIcast configuration, you can see there's a mapping rule that says any GET request to any URL is mapped to a hit.

Now back to the Application Plans. If I look at the Basic Application Plan settings, over here in Metrics, Methods, Limits & Pricing Rules, we see that the Hits metric has a limit imposed, which is set to ten hits per minute. This is indiscriminate, which means that absolutely any request to my application will be throttled at ten requests per minute. What I want to add is a limit of no more than two requests per minute across the application languages exposed: English, Spanish, and French.

What I have to do for this is create a New Method that I'll be counting against, and then, in the configuration for my API integration, define some new mapping rules. So I'm going to add a new Mapping Rule for /api/tweet/en and map it to agile-api-get, another Mapping Rule for /api/tweet/es, and yet another for the French endpoint, also mapped to the same method. These three mappings will cause each request to any of those three URLs to count as one invocation of the agile-api-get method.
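The effect of these mapping rules can be modeled as simple prefix matching. The sketch below is an illustration of the behavior described above, not 3scale's actual matcher; note that because rules match on prefixes, a request to one of the tweet URLs increments both the method and the catch-all Hits metric:

```python
# Illustrative model of the mapping rules added above: any GET to one of
# these paths counts as one call on the agile-api-get method, alongside
# the default catch-all rule that maps any GET to a hit.
MAPPING_RULES = [
    ("GET", "/api/tweet/en", "agile-api-get"),
    ("GET", "/api/tweet/es", "agile-api-get"),
    ("GET", "/api/tweet/fr", "agile-api-get"),
    ("GET", "/", "hits"),  # the pre-existing rule: any GET maps to a hit
]

def matched_metrics(verb, path):
    """Return every metric/method a request increments; mapping rules
    are prefix matches, so /api/tweet/en also matches the '/' rule."""
    return [metric for v, prefix, metric in MAPPING_RULES
            if v == verb and path.startswith(prefix)]

print(matched_metrics("GET", "/api/tweet/en"))  # ['agile-api-get', 'hits']
print(matched_metrics("GET", "/health"))        # ['hits']
```

This is why the health endpoint, matched only by the catch-all rule, stays outside the new method's limit later in the demo.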

I'll just update the configuration; there we go. Let's check that the settings are saved, and now I'm going to update the Basic Application Plan and add a new limit. You can see that the method has already automatically appeared under Metrics, Methods, Limits & Pricing Rules. Here we go. I'm just going to add a New Usage Limit, which says, as we discussed, two requests to any of those API methods per minute. There we go. Now to update the application plan, and let's see how this works.

So note that I'm invoking the /api/tweet URL. Let's do another one, Spanish this time, and let's try the French one. Now, this one should exceed the imposed limits... not yet, so let's do another one. There we go. We've now exceeded the limits, and we'll have to wait for a minute or so before we can issue any more requests.
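The throttling just demonstrated can be sketched as a two-calls-per-minute window shared across the three language URLs. Again, this is a toy model of the observed behavior (here a sliding window; 3scale's internal accounting may differ):

```python
from collections import deque

class MethodLimit:
    """Toy sliding-window limit: at most max_hits calls per period,
    shared across every URL mapped to the same method."""

    def __init__(self, max_hits, period_seconds):
        self.max_hits = max_hits
        self.period = period_seconds
        self.hits = deque()  # timestamps of recently accepted calls

    def allow(self, now):
        # Drop hits that have fallen out of the window.
        while self.hits and now - self.hits[0] >= self.period:
            self.hits.popleft()
        if len(self.hits) >= self.max_hits:
            return False  # the gateway rejects the call as over-limit
        self.hits.append(now)
        return True

limit = MethodLimit(max_hits=2, period_seconds=60)
print(limit.allow(now=0))    # True  -> English request accepted
print(limit.allow(now=1))    # True  -> Spanish request accepted
print(limit.allow(now=2))    # False -> French request is throttled
print(limit.allow(now=61))   # True  -> about a minute later, allowed again
```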

Notice that the health endpoint is not included in the API GET method, so health is not restricted by the new limit. It is restricted by the other limit, which is ten requests of any kind per minute, so we've got a way to go before we're notified that a limit has been exceeded.

In addition to equipping the application plan with limits, we can also define Pricing Rules. If the user wants to exceed a certain volume, we can impose a Pricing Rule saying, for example, that anything above ten hits and up to a hundred will cost one cent. Once the account using this plan has a pricing rule associated with it, signing up to use the application can be configured so that users have to provide their credit card information, and then in the platform configuration we can enable certain payment gateways and configure automatic billing for the customers.
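A tiered pricing rule like the one described is easy to model. The tier boundaries and prices below are the hypothetical example from the transcript (hits one through ten free, eleven through a hundred at one cent each), not values from a real plan:

```python
# Illustrative pricing tiers: (from_hit, to_hit, price_per_hit_in_dollars).
TIERS = [
    (1, 10, 0.00),    # the first ten hits are free
    (11, 100, 0.01),  # anything above ten and up to a hundred: one cent each
]

def billing_amount(hits):
    """Sum the cost of 'hits' calls across the pricing tiers."""
    total = 0.0
    for lo, hi, price in TIERS:
        if hits >= lo:
            total += (min(hits, hi) - lo + 1) * price
    return round(total, 2)

print(billing_amount(10))   # 0.0  -> entirely within the free tier
print(billing_amount(50))   # 0.4  -> forty billable hits at one cent each
```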

So this was a quick overview of how to configure 3scale API Management with a custom API. Join us in the next video, where we'll have a look at how to bring all these technologies together in a fun and entertaining way.

About the Author

Students: 41,438
Labs: 34
Courses: 94
Learning paths: 28

Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.

Jeremy holds professional certifications for both the AWS and GCP cloud platforms.