What is the AWS Data Provider for SAP?
In this section of the AWS Certified: SAP on AWS Specialty learning path, we introduce you to strategies for operating and monitoring SAP workloads on AWS.
- Understand how to use Amazon CloudWatch, AWS CloudTrail, and AWS Config to manage and monitor SAP infrastructure on AWS
- Describe various AWS cost management tools including Cost Explorer, AWS Cost and Usage Reports, and AWS Budgets
- Understand how to automate patch and state operations for your SAP instances using AWS Systems Manager
- Explain how the AWS Data Provider for SAP is used to help gather performance-related data across AWS services
The AWS Certified: SAP on AWS Specialty certification has been designed for anyone who has experience managing and operating SAP workloads. Ideally you’ll also have some exposure to the design and implementation of SAP workloads on AWS, including migrating these workloads from on-premises environments. Many exam questions require solutions-architect-level knowledge across a broad range of AWS services. All of the AWS Cloud concepts introduced in this course will be explained and reinforced from the ground up.
In this lecture, I want to discuss how the AWS Data Provider for SAP collects metrics and data from Amazon EC2 instances and Amazon CloudWatch. As you’re probably aware, information like this is not freely available to any agent that requests it, so the Data Provider agent needs to be granted the appropriate permissions to read and retrieve this data. This access is granted through the use of IAM roles.
If you are unfamiliar with IAM roles, then please see our existing course here for more information. When creating your role, select the AWS service role type with EC2 as the chosen service, and then create a new permissions policy. It’s these permissions that define what the AWS Data Provider for SAP can access within your environment. AWS recommends using the policy shown here:
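Based on the statements described below, the recommended policy looks like the following sketch. The S3 object ARN is the AWS-managed location of config.properties; treat the exact bucket name as an assumption and confirm it against the current AWS documentation:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "ec2:DescribeInstances",
                "cloudwatch:GetMetricStatistics",
                "ec2:DescribeVolumes"
            ],
            "Resource": "*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::aws-data-provider/config.properties"
        }
    ]
}
```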
Let’s take a closer look at what is actually happening here. Firstly, we have two statement IDs (SIDs): VisualEditor0 and VisualEditor1. Under the SID of VisualEditor0, we can see that it allows three actions to be performed on any resource, these being:
- ec2:DescribeInstances - This allows a great deal of information about your EC2 instances to be returned to the Data Provider; more information on this API can be found here.
- cloudwatch:GetMetricStatistics - This allows the retrieval of statistics for specified metrics, and so the Data Provider uses this to pull metric data back from CloudWatch.
- ec2:DescribeVolumes - Finally, this enables the Data Provider to see information and data relating to your EBS volumes.
The second SID, VisualEditor1, allows the s3:GetObject action on a particular resource residing in Amazon S3. You might be wondering what this resource is, and why it’s required as part of the permissions policy. Well, the AWS Data Provider for SAP has some elements and components that you can customize within a configuration file known as config.properties, such as the known EC2 instance types or EBS volume types. When a new EC2 instance type or EBS volume type is released, it can be added to this config.properties file to make the Data Provider aware of its existence.
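As its name suggests, config.properties follows the usual key = value properties format. The entries below are purely illustrative of that style — the actual key names and values are defined by the Data Provider itself, not by this course:

```properties
# Hypothetical entries only -- real key names come from the Data Provider
known.instance.types = m5.large, m5.xlarge, r5.2xlarge
known.volume.types = gp2, gp3, io1, io2
```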
When the Data Provider is running, it builds its configuration database by reading the available config.properties file, which can be read from a number of different sources, in a particular order:
- Firstly, it will try to read the config.properties file directly from the Data Provider application, via a JAR file
- Next, it will try to read the config.properties file from the S3 resource location we saw in the policy, which is why we have this permission in the policy! This is a file that AWS manages and updates
- And lastly, the file is read from default locations on Linux and Windows instances as shown here.
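The read order above can be sketched as a simple fallback loop. Everything here — the reader functions, the simulated failures, and the sample key — is illustrative, not the Data Provider's actual implementation:

```python
def load_config_properties(sources):
    """Return parsed key/value pairs from the first readable source.

    `sources` is an ordered list of callables, mirroring the agent's
    read order: bundled JAR copy first, then S3, then a default path.
    """
    for read in sources:
        try:
            text = read()
        except OSError:
            # This source is unavailable; fall back to the next one.
            continue
        props = {}
        for line in text.splitlines():
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blanks and comments
            key, _, value = line.partition("=")
            props[key.strip()] = value.strip()
        return props
    raise FileNotFoundError("no config.properties source was readable")

# Simulated sources: only the "default location" succeeds here.
def read_from_jar():
    raise FileNotFoundError  # pretend no copy is bundled with the JAR

def read_from_s3():
    raise OSError  # pretend s3:GetObject was denied or unreachable

def read_from_default_path():
    return "# hypothetical entry\nknown.instance.types = m5.large,r5.xlarge\n"

props = load_config_properties([read_from_jar, read_from_s3, read_from_default_path])
print(props["known.instance.types"])  # → m5.large,r5.xlarge
```

The key design point the sketch illustrates is that the first readable source wins — later sources are never consulted once one succeeds.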
Once the role with these permissions has been created, it can then be attached to the EC2 instance running the AWS Data Provider for SAP.