Chapter 2. Creating your Amazon Web Services integration: Advanced
If you created an AWS integration by using the basic path, do not complete the following steps. Your AWS integration is already complete.
If you are using RHEL metering, after you integrate your data with cost management, go to Adding RHEL metering to an AWS integration to finish configuring your integration for RHEL metering.
To share a subset of your billing data with Red Hat, you can configure a function script in AWS. This script will filter your billing data and export it to object storage so that cost management can then access and read the filtered data. Add your AWS integration to cost management from the Integrations page.
AWS is a third-party product and its UI and documentation can change. The instructions for configuring third-party integrations are correct at the time of publishing. For the most up-to-date information, see the AWS documentation.
Prerequisites
- You must have a Red Hat Hybrid Cloud Console service account.
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
2.1. Adding an AWS account as an integration
Add an AWS integration so cost management can process the Cost and Usage Reports from your AWS account. You can add an AWS integration automatically by providing your AWS account credentials, or you can configure cost management to filter the data that you send to Red Hat.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- On the Select integration type step in the Add a cloud integration wizard, select Amazon Web Services. Click Next.
- Enter a name for the integration and click Next.
On the Select configuration step, select how you want to connect to your AWS integration.
- Select Manual configuration to customize your integration. If you are using cost management to meter your RHEL subscription, you must select Manual configuration. Click Next.
- In the Select application step, select Cost management. Click Next.
2.2. Creating an AWS S3 bucket to store your Athena billing data
Create an Amazon S3 bucket with permissions configured to store Athena billing reports.
Procedure
- Log in to your AWS account.
In the AWS Billing console, create a data export that will be delivered to your S3 bucket (a scripted alternative is sketched after this list). Specify the following values and accept the defaults for any other values:
- Export type: Legacy CUR export
- Report name: <rh_cost_report> (note this name as you will use it later)
- Additional report details: Include resource IDs
- S3 bucket: Select an S3 bucket you configured previously or create a bucket and accept the default settings.
- Time granularity: Hourly
- Enable report data integration for: Amazon Athena, which is required for the Lambda queries
- Compression type: Parquet
- Report path prefix: cost
Note: For more details on configuration, see the AWS Billing and Cost Management documentation.
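Alternatively, you can create the same legacy CUR export programmatically. The following is a minimal sketch using boto3; the bucket name, prefix, and report name are placeholders and must match the values that you chose above.

```
# Hypothetical sketch: create the legacy CUR export with boto3 instead of
# the console. Report name, bucket, prefix, and region are placeholders.
import boto3

# The Cost and Usage Reports API is only available in us-east-1
cur = boto3.client("cur", region_name="us-east-1")

cur.put_report_definition(
    ReportDefinition={
        "ReportName": "rh_cost_report",             # note this name for later
        "TimeUnit": "HOURLY",
        "Format": "Parquet",
        "Compression": "Parquet",
        "AdditionalSchemaElements": ["RESOURCES"],  # include resource IDs
        "S3Bucket": "<your_bucket_name>",
        "S3Prefix": "cost",                         # report path prefix
        "S3Region": "us-east-1",
        "AdditionalArtifacts": ["ATHENA"],          # enable Athena integration
        "RefreshClosedReports": True,
        "ReportVersioning": "OVERWRITE_REPORT",     # required for Athena
    }
)
```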
2.3. Creating a bucket to store filtered data reporting
To share your filtered data with Red Hat, you must create a second bucket to store the data.
In your AWS account:
- Log in to your AWS account.
- From Configure S3 Bucket, create a bucket and apply the default policy.
- Click Create bucket.
In cost management:
- On the Create storage step, paste the name of your S3 bucket, select the region that it was created in, and click Next.
- On the Create cost and usage report step in the Add a cloud integration wizard, select I wish to manually customize the CUR sent to Cost Management.
- Click Next.
2.4. Activating AWS tags
To use tags to organize your AWS resources in the cost management application, activate your tags in AWS to allow them to be imported automatically.
Procedure
In the AWS Billing console:
- Open the Cost Allocation Tags section.
- Select the tags you want to use in the cost management application, and click Activate. (A scripted alternative is sketched after this procedure.)
- If your organization is converting systems from CentOS 7 to RHEL and is using hourly billing, activate the com_redhat_rhel tag for your systems in the Cost Allocation Tags section of the AWS console.
- After tagging the RHEL instances that you want to meter in AWS, select Activate.
- In the Red Hat Hybrid Cloud Console Integrations wizard, select Include RHEL usage.
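If you prefer to activate tags programmatically, the Cost Explorer API exposes cost allocation tag status. A minimal sketch, assuming the com_redhat_rhel tag key already appears in your billing data:

```
# Hypothetical sketch: activate a cost allocation tag with boto3.
# Tags become available for activation only after they appear in billing data.
import boto3

ce = boto3.client("ce", region_name="us-east-1")

ce.update_cost_allocation_tags_status(
    CostAllocationTagsStatus=[
        {"TagKey": "com_redhat_rhel", "Status": "Active"},
    ]
)
```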
Additional resources
For more information about tagging, see Adding tags to an AWS resource.
2.5. Configuring an IAM policy to enable account access for Cost and Usage Reports
Cost management needs Cost and Usage Reports produced by AWS to display data. To provide the correct access, create an IAM policy and role in AWS, which provides access only to the stored information.
Cost management can also display additional data. For example:
- Include the Action iam:ListAccountAliases to display an AWS account alias rather than an account number.
- If you are using consolidated billing rather than the account ID, include the Actions organizations:List* and organizations:Describe* to find the display names of AWS member accounts.
In cost management:
- In the Add a cloud integration wizard, select the additional data points you want to be included.
- Click Next.
- Copy the JSON output that is generated based on your selections.
In the AWS Identity and Access Management console:
From the AWS Identity and Access Management (IAM) console, create a new IAM policy for the S3 bucket that you configured before.
Select the JSON tab and paste the JSON policy which you copied from the Red Hat Hybrid Cloud Console Add a cloud integration wizard:
{ "Version": "2012-10-17", "Statement": [ { "Sid": "VisualEditor0", "Effect": "Allow", "Action": [ "s3:Get*", "s3:List*" ], "Resource": [ "arn:aws:s3:::<your_bucket_name>", 1 "arn:aws:s3:::<your_bucket_name>/*" ] }, { "Sid": "VisualEditor1", "Effect": "Allow", "Action": [ "s3:ListBucket", "cur:DescribeReportDefinitions" ], "Resource": "*" } ] }
- Enter a name for the policy and create the policy. Do not close the AWS IAM console. You will use it in the following steps.
In cost management:
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, click Next.
In the AWS Identity and Access Management console:
In the AWS IAM console, create a new IAM role:
- Select Another AWS account as the type of trusted entity.
- Enter 589173575009 as the Account ID to give Red Hat Hybrid Cloud Console read access to the AWS account cost data.
In cost management:
- Copy your external ID from the Create IAM role step in the wizard.
In the AWS Identity and Access Management console:
- Enter your external ID in the External ID field.
- Attach the IAM policy you just configured.
- Enter a role name and description.
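For reference, selecting Another AWS account with an external ID produces a trust policy equivalent to the one in the following boto3 sketch; the role name and external ID value are placeholders:

```
# Hypothetical sketch: create the cross-account role with the external ID
# condition via boto3. The role name and external ID are placeholders.
import json

import boto3

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            # Red Hat Hybrid Cloud Console account (from the procedure above)
            "Principal": {"AWS": "arn:aws:iam::589173575009:root"},
            "Action": "sts:AssumeRole",
            # Restrict assumption to callers that present your external ID
            "Condition": {
                "StringEquals": {"sts:ExternalId": "<your_external_id>"}
            },
        }
    ],
}

iam = boto3.client("iam")
iam.create_role(
    RoleName="cost-management-role",  # placeholder role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
    Description="Read access for Red Hat cost management",
)
```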
In cost management:
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, click Next.
In the AWS Identity and Access Management console:
In the AWS IAM console, in the Roles section, open the summary screen for the role you just created.
- Copy the Role ARN, which is a string beginning with arn:aws:.
In cost management:
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, paste your Role ARN and click Next.
- Review the details of your cloud integration and click Add.
Next steps
Return to AWS to customize your AWS data export by configuring Athena and Lambda to filter your reports.
2.6. Enabling account access for Athena
Create an IAM policy and role for cost management to use. This configuration provides access to the stored information and nothing else.
Procedure
From the AWS Identity and Access Management (IAM) console, create an IAM policy for the Athena Lambda functions you will configure.
Select the JSON tab and paste the following content in the JSON policy text box:
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "athena:*" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": [ "glue:CreateDatabase", "glue:DeleteDatabase", "glue:GetDatabase", "glue:GetDatabases", "glue:UpdateDatabase", "glue:CreateTable", "glue:DeleteTable", "glue:BatchDeleteTable", "glue:UpdateTable", "glue:GetTable", "glue:GetTables", "glue:BatchCreatePartition", "glue:CreatePartition", "glue:DeletePartition", "glue:BatchDeletePartition", "glue:UpdatePartition", "glue:GetPartition", "glue:GetPartitions", "glue:BatchGetPartition" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": [ "s3:GetBucketLocation", "s3:GetObject", "s3:ListBucket", "s3:ListBucketMultipartUploads", "s3:ListMultipartUploadParts", "s3:AbortMultipartUpload", "s3:CreateBucket", "s3:PutObject", "s3:PutBucketPublicAccessBlock" ], "Resource": [ "arn:aws:s3:::CHANGE-ME*"1 ] }, { "Effect": "Allow", "Action": [ "s3:GetObject", "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::CHANGE-ME*"2 ] }, { "Effect": "Allow", "Action": [ "s3:ListBucket", "s3:GetBucketLocation", "s3:ListAllMyBuckets" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": [ "sns:ListTopics", "sns:GetTopicAttributes" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": [ "cloudwatch:PutMetricAlarm", "cloudwatch:DescribeAlarms", "cloudwatch:DeleteAlarms", "cloudwatch:GetMetricData" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": [ "lakeformation:GetDataAccess" ], "Resource": [ "*" ] }, { "Effect": "Allow", "Action": [ "logs:*" ], "Resource": "*" } ] }
- Name the policy and complete the creation of the policy. Keep the AWS IAM console open because you will need it for the next step.
In the AWS IAM console, create a new IAM role:
- For the type of trusted entity, select AWS service.
- Select Lambda.
- Attach the IAM policy you just configured.
- Enter a role name and description and finish creating the role.
Store your login information in AWS Secrets Manager and add it to the role you created.
- Select Secret type: Other type of secret.
- Create a key for your Red Hat Hybrid Cloud Console client_id.
- Create a key for your Red Hat Hybrid Cloud Console client_secret.
- Add the values of your service account client ID and secret to the appropriate keys.
- Click Next, then name and store your secret. (A scripted alternative is sketched below.)
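You can also create the secret with boto3. A minimal sketch, with placeholder secret name and credential values:

```
# Hypothetical sketch: store the service-account credentials as a secret.
import json

import boto3

secretsmanager = boto3.client("secretsmanager")

secretsmanager.create_secret(
    Name="cost-management-service-account",  # placeholder secret name
    SecretString=json.dumps(
        {
            "client_id": "<your_client_id>",
            "client_secret": "<your_client_secret>",
        }
    ),
)
```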
Update the role you created for your Lambda functions. Include the following code to reference the secret stored in AWS Secrets Manager:
{ "Sid": "VisualEditor3", "Effect": "Allow", "Action": [ "secretsmanager:GetSecretValue", "secretsmanager:DescribeSecret" ], "Resource": "*" }
2.6.1. Configuring Athena for report generation
Configure Athena to provide a filtered data export for cost management. The following configuration provides access only to the stored information; it does not provide access to anything else.
Procedure
- In the AWS S3 console, go to the S3 bucket that you configured in Section 2.2. Then go to the crawler-cfn.yml file, which is in the path created by the data export you configured, for example, {bucket-name}/{S3_path_prefix}/{export_name}/crawler-cfn.yml. Copy the Object URL for the crawler-cfn.yml file.
From CloudFormation in the AWS console, create a stack with new resources (a scripted alternative is sketched after these steps):
- Choose an existing template.
- Select Specify Template.
- Select Template Source: Amazon S3 URL.
- Paste the object URL you copied before.
- Enter a name and click Next.
- Click Next, and then click Submit.
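The same stack can also be created programmatically from the Object URL you copied. A sketch, assuming a placeholder stack name; because the crawler template creates IAM resources, the IAM capability must be acknowledged:

```
# Hypothetical sketch: create the crawler stack from the copied Object URL.
import boto3

cloudformation = boto3.client("cloudformation")

cloudformation.create_stack(
    StackName="athena-cur-crawler",  # placeholder stack name
    TemplateURL="https://<your_bucket>.s3.amazonaws.com/cost/<export_name>/crawler-cfn.yml",
    Capabilities=["CAPABILITY_IAM"],  # the template creates IAM roles
)
```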
2.6.2. Building an Athena query
Create an Athena query that queries the data export for your Red Hat expenses and creates a report of your filtered expenses.
If you are only filtering for Red Hat spending, the query included with the example script might be all you need. If you need something more advanced, create a custom query. If you are using RHEL metering, you must adjust the query to return data that is specific to your RHEL subscriptions. The following steps guide you through creating a RHEL subscription query.
Example Athena query for Red Hat spend
```
SELECT * FROM <your_export_name>
WHERE (
       (bill_billing_entity = 'AWS Marketplace' AND line_item_legal_entity like '%Red Hat%')
    OR (line_item_legal_entity like '%Amazon Web Services%' AND line_item_line_item_description like '%Red Hat%')
    OR (line_item_legal_entity like '%Amazon Web Services%' AND line_item_line_item_description like '%RHEL%')
    OR (line_item_legal_entity like '%AWS%' AND line_item_line_item_description like '%Red Hat%')
    OR (line_item_legal_entity like '%AWS%' AND line_item_line_item_description like '%RHEL%')
    OR (line_item_legal_entity like '%AWS%' AND product_product_name like '%Red Hat%')
    OR (line_item_legal_entity like '%Amazon Web Services%' AND product_product_name like '%Red Hat%')
)
AND year = '2024'
AND month = '07'
```
In your AWS account:
- Go to Amazon Athena and open the Editor tab.
- From the Data source menu, select AwsDataCatalog.
- From the Database menu, select your data export database. The database name is athenacurcfn_ followed by your data export name, for example, athenacurcfn_{your_export_name}.
Paste the following example query into the Query field. Replace <your_export_name> with your data export name:

```
SELECT column_name FROM information_schema.columns
WHERE table_name = '<your_export_name>'
AND column_name LIKE 'resource_tags_%';
```
- Click Run. This query returns all of the tag-related columns in your data set.
- Copy the tag column that matches the column used for your RHEL tags.
Paste in the following example query. Replace <your_export_name>, the tag column that you copied in the previous step, and the year and month that you want to query. The result returns the EC2 instances that are tagged for RHEL subscriptions. Copy and save this query for use in the Lambda function later:

```
SELECT * FROM <your_export_name>
WHERE (
    line_item_product_code = 'AmazonEC2'
    AND strpos(lower(<rhel_tag_column_name>), 'com_redhat_rhel') > 0
)
AND year = '<year>'
AND month = '<month>'
```
2.6.3. Creating a Lambda function for Athena
You must create a Lambda function that queries the data export for your Red Hat related expenses and creates a report of your filtered expenses.
Procedure
- In the AWS console, go to Lambda and click Create function.
- Select Author from scratch.
- Enter a name for your function.
- From the Runtime menu, select the latest version of Python available.
- From the Architecture menu, select x86_64.
- Under Permissions, select the Athena role you created.
- To add the query you built as part of the Lambda function, click Create function to save your progress.
From the function Code tab, paste this script and update the following lines (an illustrative sketch of such a script appears after this procedure):
your_integration_external_id
- Enter the integration UUID you copied in the Enabling account access for cost and usage consumption step.
bucket
- Enter the name of the S3 bucket you created to store filtered reports during the Creating a bucket for storing filtered data reporting step.
database
- Enter the database name used in the Building your Athena query step.
export_name
- Enter the name of your data export from when you created an AWS S3 bucket for storing your cost data.
Update the default query with your custom one by replacing the WHERE clause, for example:

```
# Athena query
query = f"SELECT * FROM {database}.{export_name} WHERE (line_item_product_code = 'AmazonEC2' AND strpos(lower(<rhel_tag_column_name>), 'com_redhat_rhel') > 0) AND year = '{year}' AND month = '{month}'"
```
- Click Test to test the function.
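The script itself is provided by Red Hat and linked from the original documentation. As an illustration only, the following is a minimal sketch of what such an Athena-query function might look like; the variable names match the placeholders described above, and the output path layout is an assumption.

```
# Hypothetical sketch of the Athena-query Lambda function described above.
# The variable names follow the placeholders in this procedure; the actual
# Red Hat script may differ.
import datetime
import time

import boto3

your_integration_external_id = "<your_integration_external_id>"  # from the wizard
bucket = "<your_filtered_data_bucket>"
database = "athenacurcfn_<your_export_name>"
export_name = "<your_export_name>"


def lambda_handler(event, context):
    now = datetime.datetime.now()
    year = now.strftime("%Y")
    month = now.strftime("%m")

    # Athena query (replace the WHERE clause with your custom query and
    # substitute your real RHEL tag column name)
    query = (
        f"SELECT * FROM {database}.{export_name} "
        f"WHERE (line_item_product_code = 'AmazonEC2' "
        f"AND strpos(lower(<rhel_tag_column_name>), 'com_redhat_rhel') > 0) "
        f"AND year = '{year}' AND month = '{month}'"
    )

    athena = boto3.client("athena")
    execution = athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={
            # Athena writes the result files into the filtered-data bucket;
            # this path layout is an assumption
            "OutputLocation": f"s3://{bucket}/{your_integration_external_id}/{year}/{month}"
        },
    )

    # Poll until the query finishes; keep the Lambda timeout in mind
    execution_id = execution["QueryExecutionId"]
    while True:
        status = athena.get_query_execution(QueryExecutionId=execution_id)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(5)

    return {"query_execution_id": execution_id, "state": state}
```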
2.6.4. Creating a Lambda function to post the report files
You must create a second Lambda function to post your filtered reports in a bucket that Red Hat can access.
Procedure
- Go to Lambda in the AWS console and click Create function.
- Select Author from scratch.
- Enter a name for your function.
- From the Runtime menu, select the latest version of Python available.
- Select x86_64 as the Architecture.
- Under Permissions select the Athena role you created.
- Click Create function.
Paste this script into the function and replace the following lines (an illustrative sketch of such a script appears after this procedure):
secret_name = "CHANGEME"
- Enter your secret name.
bucket = "<your_S3_Bucket_Name>"
- Enter the name of the S3 bucket you created to store filtered reports during the Creating a bucket for storing filtered data reporting step.
- Click Test to test the function.
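Again, the actual posting script is linked from the original documentation. The following sketch only illustrates the general shape of such a function: it reads the service-account credentials from Secrets Manager and exchanges them for a token using Red Hat's documented service-account flow. The upload endpoint and content type shown are assumptions, not confirmed values, and the requests library must be packaged with the function or provided as a layer.

```
# Hypothetical sketch of the report-posting Lambda function described above.
# The console.redhat.com upload endpoint and content type are assumptions;
# check the actual Red Hat script for the exact values.
import json

import boto3
import requests  # package with the function or add as a Lambda layer

secret_name = "CHANGEME"                  # your AWS Secrets Manager secret
bucket = "<your_S3_Bucket_Name>"          # bucket holding the filtered reports


def lambda_handler(event, context):
    # Read the service-account credentials stored in AWS Secrets Manager
    secrets = boto3.client("secretsmanager")
    secret = json.loads(
        secrets.get_secret_value(SecretId=secret_name)["SecretString"]
    )

    # Exchange the client credentials for a bearer token (Red Hat SSO)
    token_response = requests.post(
        "https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token",
        data={
            "grant_type": "client_credentials",
            "client_id": secret["client_id"],
            "client_secret": secret["client_secret"],
            "scope": "api.console",
        },
        timeout=30,
    )
    token = token_response.json()["access_token"]

    # Download each filtered report from S3 and post it to the ingress API
    s3 = boto3.client("s3")
    for obj in s3.list_objects_v2(Bucket=bucket).get("Contents", []):
        body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
        requests.post(
            "https://console.redhat.com/api/ingress/v1/upload",
            headers={"Authorization": f"Bearer {token}"},
            # Content type is an assumption; the real script may bundle the
            # reports differently before uploading
            files={"file": (obj["Key"], body, "application/vnd.redhat.hccm.upload")},
            timeout=120,
        )
```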
2.7. Creating EventBridge schedules
You must trigger the Lambda functions you created by creating Amazon EventBridge schedules.
Procedure
Create two Amazon EventBridge schedules to trigger each of the functions that you created. You must trigger these functions at different times so that the Athena query completes before the reports are sent (a scripted alternative is sketched at the end of this procedure):
- Add a name and description.
- In the Group field, select Default.
- In the Occurrence field, select Recurring schedule.
- In the Type field, select Cron-based.
- Set the cron-based schedules 12 hours apart. For example, 0 9 * * ? * and 0 21 * * ? * trigger the function at 9 AM and 9 PM.
- Set a flexible time window.
- Click Next.
Set the Target detail to AWS Lambda invoke to associate this schedule with the Lambda function:
- Select the Lambda function you created before.
- Click Next.
Enable the schedule:
- Configure the retry logic.
- Ignore the encryption.
- Set the permissions to Create new role on the fly.
- Click Next.
- Review your selections and click Create schedule.
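The equivalent schedules can also be created with the EventBridge Scheduler API. A sketch, with placeholder function names, ARNs, and role:

```
# Hypothetical sketch: create the two 12-hours-apart schedules with boto3.
# Function names, account ID, region, and the scheduler role are placeholders.
import boto3

scheduler = boto3.client("scheduler")

for name, cron in [
    ("athena-query-function", "cron(0 9 * * ? *)"),   # 9 AM: run the query
    ("post-report-function", "cron(0 21 * * ? *)"),   # 9 PM: post the reports
]:
    scheduler.create_schedule(
        Name=f"{name}-schedule",
        ScheduleExpression=cron,
        FlexibleTimeWindow={"Mode": "FLEXIBLE", "MaximumWindowInMinutes": 15},
        Target={
            "Arn": f"arn:aws:lambda:<region>:<account_id>:function:{name}",
            "RoleArn": "arn:aws:iam::<account_id>:role/<scheduler_role>",
        },
    )
```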
2.8. Creating additional cloud functions to collect finalized data
AWS sends final reports for the previous month at the start of the following month. Send these finalized reports to cost management, which will analyze the extra information.
Procedure
Create the Athena query Lambda function:
- Create a function for querying Athena.
- Select Author from scratch.
- Select the Python runtime.
- Select the x86_64 architecture.
- Select the role created before for permissions.
- Click Create function.
Click the Code tab to add a script to collect the finalized data.
- Copy the Athena query function and add it to the query. Update <integration_uuid> with the integration_uuid from the integration you created on console.redhat.com, which you can find by going to the Integrations page and clicking your integration. Update the BUCKET and DATABASE variables with the bucket name and database you created. Then update export_name with the name of the data export Athena query you created before. Remove the comment from the following code:

```
# last_month = now.replace(day=1) - timedelta(days=1)
# year = last_month.strftime("%Y")
# month = last_month.strftime("%m")
# day = last_month.strftime("%d")
# file_name = 'finalized-data.json'
```

- Click Deploy. Then click Test to see the execution results.
Create a Lambda function to post the report files to cost management:
- Select Author from scratch.
- Name your function.
- Select the Python runtime.
- Select the x86_64 architecture.
- Select the role created before for permissions.
- Click Create function.
Click the Code tab to add a script to post the finalized data.
- Copy the post function and add it to the query. Update secret_name with the name of your secret in AWS Secrets Manager. Update bucket with the bucket name you created. Remove the comment from the following code:

```
# file_name = 'finalized_data.json'
```

- Click Deploy. Then click Test to see the execution results.
Create an EventBridge schedule to trigger the two functions. For more information, see Section 2.7, “Creating EventBridge schedules”.
- Set the EventBridge schedule to run once a month, on or after the 15th of the month, because your AWS bill for the earlier period is final by that date. For example, (0 9 15 * ? *) and (0 21 15 * ? *).
After completing these steps, cost management will begin collecting Cost and Usage data from your AWS account and any linked AWS accounts.
The data can take a few days to populate before it shows on the cost management dashboard.