Chapter 2. Creating your Amazon Web Services integration: Advanced

Important

If you created an AWS integration by using the basic path, do not complete the following steps. Your AWS integration is already complete.

If you are using RHEL metering, after you integrate your data with cost management, go to Adding RHEL metering to an AWS integration to finish configuring your integration for RHEL metering.

To share a subset of your billing data with Red Hat, you can configure a function script in AWS. This script will filter your billing data and export it to object storage so that cost management can then access and read the filtered data. Add your AWS integration to cost management from the Integrations page.

AWS is a third-party product and its UI and documentation can change. The instructions for configuring third-party integrations are correct at the time of publishing. For the most up-to-date information, see the AWS documentation.

2.1. Adding an AWS account as an integration

Add an AWS integration so cost management can process the Cost and Usage Reports from your AWS account. You can add an AWS integration automatically by providing your AWS account credentials, or you can configure cost management to filter the data that you send to Red Hat.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From Red Hat Hybrid Cloud Console, click the Settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. On the Select integration type step, in the Add a cloud integration wizard, select Amazon Web Services. Click Next.
  4. Enter a name for the integration and click Next.
  5. On the Select configuration step, select how you want to connect to your AWS integration.

    • Select Manual configuration to customize your integration. If you are using cost management to meter your RHEL subscription, you must select Manual configuration. Click Next.
  6. In the Select application step, select Cost management. Click Next.

2.2. Creating an AWS S3 bucket to store your Athena billing data

Create an Amazon S3 bucket with permissions configured to store Athena billing reports.

Procedure

  1. Log in to your AWS account.
  2. In the AWS Billing console, create a data export that is delivered to your S3 bucket. Specify the following values and accept the defaults for any other values (a scripted sketch of this step appears after this procedure):

    • Export type: Legacy CUR export
    • Report name: <rh_cost_report> (note this name as you will use it later)
    • Additional report details: Include resource IDs
    • S3 bucket: Select an S3 bucket you configured previously or create a bucket and accept the default settings.
    • Time granularity: Hourly
    • Enable report data integration for: Amazon Athena (required for Lambda queries)
    • Compression type: Parquet
    • Report path prefix: cost

      Note

      For more details on configuration, see the AWS Billing and Cost Management documentation.
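
If you prefer to create the data export from code rather than the console, the following is a minimal sketch, assuming the legacy Cost and Usage Report API available through boto3. The bucket name and region are placeholders; the other values match the settings listed above.

    import boto3

    # Minimal sketch (assumption): create the legacy CUR export with boto3
    # instead of the console. Placeholder values are in angle brackets.
    cur = boto3.client("cur", region_name="us-east-1")  # the CUR API is served from us-east-1

    cur.put_report_definition(
        ReportDefinition={
            "ReportName": "rh_cost_report",             # the report name noted earlier
            "TimeUnit": "HOURLY",
            "Format": "Parquet",
            "Compression": "Parquet",
            "AdditionalSchemaElements": ["RESOURCES"],  # include resource IDs
            "S3Bucket": "<your_athena_billing_bucket>",
            "S3Prefix": "cost",
            "S3Region": "<your_bucket_region>",
            "AdditionalArtifacts": ["ATHENA"],          # required for Lambda queries with Athena
            "RefreshClosedReports": True,
            "ReportVersioning": "OVERWRITE_REPORT",     # required when Athena integration is enabled
        }
    )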

2.3. Creating a bucket to store filtered data reporting

To share your filtered data with Red Hat, you must create a second bucket to store the data.

In your AWS account:

  1. Log in to your AWS account.
  2. From Configure S3 Bucket, click Configure. Create a bucket and apply the default policy (a scripted sketch of this step appears after this list).
  3. Click Save.
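
If you prefer to script the bucket creation, the following is a minimal sketch, assuming boto3; the bucket name and region are placeholders, and the default (private) bucket settings are kept.

    import boto3

    # Minimal sketch (assumption): create the bucket that stores the filtered
    # reports. Placeholder values are in angle brackets.
    region = "<your_region>"
    s3 = boto3.client("s3", region_name=region)

    s3.create_bucket(
        Bucket="<your_filtered_data_bucket>",
        # Omit CreateBucketConfiguration if the region is us-east-1.
        CreateBucketConfiguration={"LocationConstraint": region},
    )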

In cost management:

  1. On the Create storage step, paste the name of your S3 bucket, select the region where you created it, and click Next.
  2. On the Create cost and usage report step in the Add a cloud integration wizard, select I wish to manually customize the CUR sent to Cost Management.
  3. Click Next.

2.4. Activating AWS tags

To use tags to organize your AWS resources in the cost management application, activate your tags in AWS to allow them to be imported automatically.

Procedure

  1. In the AWS Billing console:

    1. Open the Cost Allocation Tags section.
    2. Select the tags you want to use in the cost management application, and click Activate.
  2. If your organization is converting systems from CentOS 7 to RHEL and using hourly billing, activate the com_redhat_rhel tag for your systems in the Cost Allocation Tags section of the AWS console.

    1. Tag the RHEL instances that you want to meter in AWS.
  3. In the Red Hat Hybrid Cloud Console Integrations wizard, select Include RHEL usage.

Additional resources

For more information about tagging, see Adding tags to an AWS resource.

2.5. Configuring an IAM policy to enable account access for Cost and Usage Reports

Cost management needs Cost and Usage Reports produced by AWS to display data. To provide the correct access, create an IAM policy and role in AWS, which provides access only to the stored information.

Cost management can also display additional data. For example:

  • Include the Action iam:ListAccountAliases to display an AWS account alias rather than an account number.
  • If you are using consolidated billing rather than the account ID, include the Actions organizations:List* and organizations:Describe* to find the display names of AWS member accounts.

In cost management:

  1. In the Add a cloud integration wizard, select the additional data points you want to be included.
  2. Click Next.
  3. Copy the JSON output that is generated based on your selections.

In the AWS Identity and Access Management console:

  1. From the AWS Identity and Access Management (IAM) console, create a new IAM policy for the S3 bucket that you configured before.

    1. Select the JSON tab and paste the JSON policy which you copied from the Red Hat Hybrid Cloud Console Add a cloud integration wizard:

      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
              "s3:Get*",
              "s3:List*"
            ],
            "Resource": [
              "arn:aws:s3:::<your_bucket_name>",
              "arn:aws:s3:::<your_bucket_name>/*"
            ]
          },
          {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
              "s3:ListBucket",
              "cur:DescribeReportDefinitions"
            ],
            "Resource": "*"
          }
        ]
      }
    2. Enter a name for the policy and create the policy. Do not close the AWS IAM console. You will use it in the following steps.

In cost management:

  1. In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, click Next.

In the AWS Identity and Access Management console:

  1. In the AWS IAM console, create a new IAM role:

    1. Select Another AWS account as the type of trusted entity.
    2. Enter 589173575009 as the Account ID to give Red Hat Hybrid Cloud Console read access to the AWS account cost data.

In cost management:

  1. Copy your external ID from the Create IAM role step in the wizard.

In the AWS Identity and Access Management console:

  1. Enter your external ID in the External ID field.
  2. Attach the IAM policy you just configured.
  3. Enter a role name and description (a sketch of the equivalent trust policy appears after these steps).
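
The console steps above produce a role that trusts the Red Hat account only when your external ID is supplied. The following is a minimal boto3 sketch of an equivalent role; the role name, attached policy ARN, and external ID are placeholders, not values from the wizard.

    import json
    import boto3

    # Placeholders (assumptions) -- use the values from the wizard and the
    # policy you created in the previous steps.
    external_id = "<external_id_from_the_wizard>"
    policy_arn = "<arn_of_the_policy_you_created>"

    # Trust policy equivalent to selecting "Another AWS account" with account
    # ID 589173575009 and requiring the external ID.
    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"AWS": "arn:aws:iam::589173575009:root"},
                "Action": "sts:AssumeRole",
                "Condition": {"StringEquals": {"sts:ExternalId": external_id}},
            }
        ],
    }

    iam = boto3.client("iam")
    iam.create_role(
        RoleName="<your_role_name>",
        AssumeRolePolicyDocument=json.dumps(trust_policy),
        Description="Read access to Cost and Usage Reports for cost management",
    )
    iam.attach_role_policy(RoleName="<your_role_name>", PolicyArn=policy_arn)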

In cost management:

  1. In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, click Next.

In the AWS Identity and Access Management console:

  1. In the AWS IAM console, in the Roles section, open the summary screen for the role you just created.

    1. Copy the Role ARN, which is a string beginning with arn:aws:.

In cost management:

  1. In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, paste your Role ARN and click Next.
  2. Review the details of your cloud integration and click Add.

Next steps

Return to AWS to customize your AWS data export by configuring Athena and Lambda to filter your reports.

2.6. Enabling account access for Athena

Create an IAM policy and role for cost management to use. This configuration provides access to the stored information and nothing else.

Procedure

  1. From the AWS Identity and Access Management (IAM) console, create an IAM policy for the Athena Lambda functions you will configure.

    1. Select the JSON tab and paste the following content in the JSON policy text box:

      {
        "Version": "2012-10-17",
        "Statement": [
          {
            "Effect": "Allow",
            "Action": [
              "athena:*"
            ],
            "Resource": [
              "*"
            ]
          },
          {
            "Effect": "Allow",
            "Action": [
              "glue:CreateDatabase",
              "glue:DeleteDatabase",
              "glue:GetDatabase",
              "glue:GetDatabases",
              "glue:UpdateDatabase",
              "glue:CreateTable",
              "glue:DeleteTable",
              "glue:BatchDeleteTable",
              "glue:UpdateTable",
              "glue:GetTable",
              "glue:GetTables",
              "glue:BatchCreatePartition",
              "glue:CreatePartition",
              "glue:DeletePartition",
              "glue:BatchDeletePartition",
              "glue:UpdatePartition",
              "glue:GetPartition",
              "glue:GetPartitions",
              "glue:BatchGetPartition"
            ],
            "Resource": [
              "*"
            ]
          },
          {
            "Effect": "Allow",
            "Action": [
              "s3:GetBucketLocation",
              "s3:GetObject",
              "s3:ListBucket",
              "s3:ListBucketMultipartUploads",
              "s3:ListMultipartUploadParts",
              "s3:AbortMultipartUpload",
              "s3:CreateBucket",
              "s3:PutObject",
              "s3:PutBucketPublicAccessBlock"
            ],
            "Resource": [
              "arn:aws:s3:::CHANGE-ME*"
            ]
          },
          {
            "Effect": "Allow",
            "Action": [
              "s3:GetObject",
              "s3:ListBucket"
            ],
            "Resource": [
              "arn:aws:s3:::CHANGE-ME*"
            ]
          },
          {
            "Effect": "Allow",
            "Action": [
              "s3:ListBucket",
              "s3:GetBucketLocation",
              "s3:ListAllMyBuckets"
            ],
            "Resource": [
              "*"
            ]
          },
          {
            "Effect": "Allow",
            "Action": [
              "sns:ListTopics",
              "sns:GetTopicAttributes"
            ],
            "Resource": [
              "*"
            ]
          },
          {
            "Effect": "Allow",
            "Action": [
              "cloudwatch:PutMetricAlarm",
              "cloudwatch:DescribeAlarms",
              "cloudwatch:DeleteAlarms",
              "cloudwatch:GetMetricData"
            ],
            "Resource": [
              "*"
            ]
          },
          {
            "Effect": "Allow",
            "Action": [
              "lakeformation:GetDataAccess"
            ],
            "Resource": [
              "*"
            ]
          },
          {
            "Effect": "Allow",
            "Action": [
              "logs:*"
            ],
            "Resource": "*"
          }
        ]
      }

      Replace CHANGE-ME* in both locations with the ARN for the S3 bucket you configured in step 2.2.
    2. Name the policy and complete the creation of the policy. Keep the AWS IAM console open because you will need it for the next step.
  2. In the AWS IAM console, create a new IAM role:

    1. For the type of trusted entity, select AWS service.
    2. Select Lambda.
    3. Attach the IAM policy you just configured.
    4. Enter a role name and description and finish creating the role.
  3. Store your login information in AWS Secrets Manager and add it to the role you created (a scripted sketch of creating the secret appears after this procedure).

    1. Select Secret type: Other type of secret.
    2. Create a key for your Red Hat Hybrid Cloud Console client_id.
    3. Create a key for your Red Hat Hybrid Cloud Console client_secret.
    4. Add the values of your client ID and client secret to the corresponding keys.
    5. Click Continue, then name and store your secret.
    6. Update the role you created for your Lambda functions. Include the following code to reference the secret stored in AWS Secrets Manager:

      {
          "Sid": "VisualEditor3",
          "Effect": "Allow",
          "Action": [
              "secretsmanager:GetSecretValue",
              "secretsmanager:DescribeSecret"
          ],
          "Resource": "*"
      }
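
If you prefer to create the secret from code, the following is a minimal boto3 sketch; the secret name and credential values are placeholders.

    import json
    import boto3

    # Minimal sketch (assumption): store the Red Hat Hybrid Cloud Console
    # credentials as one secret with client_id and client_secret keys.
    secretsmanager = boto3.client("secretsmanager")

    secretsmanager.create_secret(
        Name="<your_secret_name>",
        SecretString=json.dumps(
            {
                "client_id": "<your_client_id>",
                "client_secret": "<your_client_secret>",
            }
        ),
    )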

2.6.1. Configuring Athena for report generation

Configure Athena to provide a filtered data export for cost management.

The following configuration provides access only to the stored information. It does not provide access to anything else.

Procedure

  1. In the AWS S3 console, go to the S3 bucket that you configured in step 2.2. Then, go to the crawler-cfn.yml file, which is in the path created by the data export that you configured. For example: {bucket-name}/{S3_path_prefix}/{export_name}/crawler-cfn.yml. Copy the Object URL of the crawler-cfn.yml file.
  2. From CloudFormation in the AWS console, create a stack with new resources:

    1. Choose an existing template.
    2. Select Specify Template.
    3. Select Template Source: Amazon S3 URL.
    4. Paste the object URL you copied before.
  3. Enter a name and click Next.
  4. Click I acknowledge that AWS CloudFormation might create IAM resources and then click Submit (a scripted sketch of this stack creation appears after this procedure).
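
If you prefer to create the stack from code, the following is a minimal boto3 sketch; the stack name and template URL are placeholders.

    import boto3

    # Minimal sketch (assumption): create the crawler stack from the
    # crawler-cfn.yml object URL copied in step 1 of this procedure.
    cloudformation = boto3.client("cloudformation")

    cloudformation.create_stack(
        StackName="<your_stack_name>",
        TemplateURL="<object_url_of_crawler_cfn_yml>",
        # Equivalent to acknowledging that CloudFormation might create IAM resources.
        Capabilities=["CAPABILITY_IAM"],
    )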

2.6.2. Building an Athena query

Create an Athena query that queries the data export for your Red Hat expenses and creates a report of your filtered expenses.

If you are filtering only for Red Hat spending, the query included with the example script might be all that you need. If you need something more advanced, create a custom query. If you are using RHEL metering, you must adjust the query to return data that is specific to your RHEL subscriptions. The following steps guide you through creating a RHEL subscription query.

Example Athena query for Red Hat spend

SELECT *
    FROM <your_export_name>
    WHERE (
            (
                bill_billing_entity = 'AWS Marketplace'
                AND line_item_legal_entity like '%Red Hat%'
            )
            OR (
                line_item_legal_entity like '%Amazon Web Services%'
                AND line_item_line_item_description like '%Red Hat%'
            )
            OR (
                line_item_legal_entity like '%Amazon Web Services%'
                AND line_item_line_item_description like '%RHEL%'
            )
            OR (
                line_item_legal_entity like '%AWS%'
                AND line_item_line_item_description like '%Red Hat%'
            )
            OR (
                line_item_legal_entity like '%AWS%'
                AND line_item_line_item_description like '%RHEL%'
            )
            OR (
                line_item_legal_entity like '%AWS%'
                AND product_product_name like '%Red Hat%'
            )
            OR (
                line_item_legal_entity like '%Amazon Web Services%'
                AND product_product_name like '%Red Hat%'
            )
        )
        AND year = '2024'
        AND month = '07'

In your AWS account:

  1. Go to Amazon Athena and open the Editor tab.
  2. From the Data source menu, select AwsDataCatalog.
  3. From the Database menu, select the database for your data export. The database name is athenacurcfn_ followed by your data export name, for example, athenacurcfn_<your_export_name>.
  4. Paste the following example query into the Query field. Replace <your_export_name> with your data export name.

    SELECT column_name
    FROM information_schema.columns
    WHERE table_name = '<your_export_name>'
    AND column_name LIKE 'resource_tags_%';
  5. Click Run. The query returns all the tag-related columns in your data set.
  6. Copy the tag column that matches the column used for your RHEL tags.
  7. Paste in the following example query. Replace <your_export_name>, the tag column that you copied in the previous step, and the year and month that you want to query. The query returns the EC2 instances that are tagged for RHEL subscriptions. Copy and save this query to use later in your Lambda function.

    SELECT *
            FROM <your_export_name>
            WHERE (
                line_item_product_code = 'AmazonEC2'
                AND strpos(lower(<rhel_tag_column_name>), 'com_redhat_rhel') > 0
            )
            AND year = '<year>'
            AND month = '<month>'

2.6.3. Creating a Lambda function for Athena

You must create a Lambda function that queries the data export for your Red Hat related expenses and creates a report of your filtered expenses.

Procedure

  1. In the AWS console, go to Lambda and click Create function.
  2. Click Author from scratch.
  3. Enter a name for your function.
  4. From the Runtime menu, select the latest version of Python available.
  5. From the Architecture menu, select x86_64.
  6. Under Permissions select the Athena role you created.
  7. To add the query you built as part of the Lambda function, click Create function to save your progress.
  8. From the function Code tab, paste this script and update the following lines (a minimal sketch of such a function appears after this procedure):

    your_integration_external_id
    Enter the integration UUID (external ID) that you copied in Section 2.5, “Configuring an IAM policy to enable account access for Cost and Usage Reports”.
    bucket
    Enter the name of the S3 bucket that you created to store filtered reports in Section 2.3, “Creating a bucket to store filtered data reporting”.
    database
    Enter the database name that you used in Section 2.6.2, “Building an Athena query”.
    export_name
    Enter the name of your data export from Section 2.2, “Creating an AWS S3 bucket to store your Athena billing data”.
  9. Update the default query with your custom one by replacing the where clause, for example:

    # Athena query
    query = f"SELECT * FROM {database}.{export_name} WHERE (line_item_product_code = 'AmazonEC2' AND strpos(lower(<rhel_tag_column_name>), 'com_redhat_rhel') > 0) AND year = '{year}' AND month = '{month}'"
  10. Click Deploy to test the function.
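
Red Hat provides the actual script for this step; as a reference only, the following is a minimal sketch of such a function, assuming boto3's Athena client. The bucket, database, export name, and query are placeholders to replace with your own values and the query that you built earlier.

    import json
    import time
    from datetime import datetime

    import boto3

    # Placeholders (assumptions) -- replace with the values described above.
    bucket = "<your_filtered_data_bucket>"
    database = "athenacurcfn_<your_export_name>"
    export_name = "<your_export_name>"
    output_location = f"s3://{bucket}/athena-results/"

    def lambda_handler(event, context):
        now = datetime.now()
        year = now.strftime("%Y")
        month = now.strftime("%m")

        # Replace the WHERE clause with the query you built in "Building an Athena query".
        query = (
            f"SELECT * FROM {database}.{export_name} "
            f"WHERE line_item_legal_entity LIKE '%Red Hat%' "
            f"AND year = '{year}' AND month = '{month}'"
        )

        athena = boto3.client("athena")
        execution_id = athena.start_query_execution(
            QueryString=query,
            QueryExecutionContext={"Database": database},
            ResultConfiguration={"OutputLocation": output_location},
        )["QueryExecutionId"]

        # Wait for the query to finish so the result file lands in the bucket.
        while True:
            state = athena.get_query_execution(QueryExecutionId=execution_id)[
                "QueryExecution"]["Status"]["State"]
            if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
                break
            time.sleep(5)

        return {"statusCode": 200, "body": json.dumps({"query_state": state})}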

2.6.4. Creating a Lambda function to post the report files

You must create a second Lambda function to post your filtered reports in a bucket that Red Hat can access.

Procedure

  1. Go to Lambda in the AWS console and click Create function.
  2. Click Author from scratch.
  3. Enter a name for your function.
  4. From the Runtime menu, select the latest version of Python available.
  5. Select x86_64 as the Architecture.
  6. Under Permissions select the Athena role you created.
  7. Click Create function.
  8. Paste this script into the function and replace the following lines (a minimal sketch of such a function appears after this procedure):

    secret_name = "CHANGEME"
    Enter your secret name.
    bucket = "<your_S3_Bucket_Name>"
    Enter the name of the S3 bucket that you created to store filtered reports in Section 2.3, “Creating a bucket to store filtered data reporting”.
  9. Click Deploy to test the function.
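
Red Hat provides the actual script for this step; as a reference only, the following is a minimal sketch of such a function. It reads the client_id and client_secret keys from AWS Secrets Manager and posts each report object from the bucket. The upload URL and the way the credentials are presented are assumptions; use the endpoint and authentication from the provided script.

    import json

    import boto3
    import requests  # bundle with the deployment package or provide through a Lambda layer

    # Placeholders (assumptions) -- replace with your own values.
    secret_name = "<your_secret_name>"
    bucket = "<your_S3_Bucket_Name>"
    upload_url = "<red_hat_upload_url>"  # hypothetical placeholder for the Red Hat upload endpoint

    def lambda_handler(event, context):
        # Read the credentials stored in AWS Secrets Manager.
        secrets = boto3.client("secretsmanager")
        secret = json.loads(
            secrets.get_secret_value(SecretId=secret_name)["SecretString"])

        # Collect the filtered report objects produced by the Athena query function.
        s3 = boto3.client("s3")
        objects = s3.list_objects_v2(Bucket=bucket).get("Contents", [])

        for obj in objects:
            body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
            # Post each report file with the stored credentials.
            response = requests.post(
                upload_url,
                files={"file": (obj["Key"], body)},
                auth=(secret["client_id"], secret["client_secret"]),
            )
            response.raise_for_status()

        return {"statusCode": 200, "body": f"posted {len(objects)} report files"}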

2.7. Creating EventBridge schedules

You must trigger the Lambda functions that you created by scheduling them with Amazon EventBridge.

Procedure

  1. Create two Amazon EventBridge schedules, one for each of the functions that you created. You must trigger the functions at different times so that the Athena query completes before the reports are sent:

    1. Add a name and description.
    2. In the Group field, select Default.
    3. In the Occurrence field, select Recurring schedule.
    4. In the Type field, select Cron-based.
    5. Set the cron-based schedules 12 hours apart. For example, 0 9 * * ? * and 0 21 * * ? * trigger the functions at 9 AM and 9 PM (a scripted sketch of both schedules appears after this procedure).
    6. Set a flexible time window.
    7. Click Next.
  2. Set the Target detail to AWS Lambda invoke to associate this schedule with the Lambda function:

    1. Select the Lambda function you created before.
    2. Click Next.
  3. Enable the schedule:

    1. Configure the retry logic.
    2. Ignore the encryption.
    3. Set the permissions to Create new role on the fly.
    4. Click Next.
  4. Review your selections and click Create.
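
As an alternative to the console, the following is a minimal boto3 sketch that creates both schedules with the Amazon EventBridge Scheduler API. The schedule names, Lambda function ARNs, and the role ARN that EventBridge uses to invoke the functions are placeholders.

    import boto3

    # Minimal sketch (assumption): create the two schedules 12 hours apart, one
    # for the Athena query function and one for the function that posts reports.
    scheduler = boto3.client("scheduler")

    schedules = [
        ("<athena-query-schedule>", "cron(0 9 * * ? *)", "<athena_function_arn>"),
        ("<post-report-schedule>", "cron(0 21 * * ? *)", "<post_function_arn>"),
    ]

    for name, expression, function_arn in schedules:
        scheduler.create_schedule(
            Name=name,
            GroupName="default",
            ScheduleExpression=expression,
            FlexibleTimeWindow={"Mode": "FLEXIBLE", "MaximumWindowInMinutes": 15},
            Target={
                "Arn": function_arn,
                "RoleArn": "<scheduler_invoke_role_arn>",  # role that allows EventBridge to invoke Lambda
            },
        )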

2.8. Creating additional cloud functions to collect finalized data

AWS sends final reports for the previous month at the start of the following month. Send these finalized reports to cost management, which analyzes the extra information.

Procedure

  1. Create an Athena query Lambda function:

    1. Create a function for querying Athena.
    2. Select Author from scratch.
    3. Select the Python runtime.
    4. Select the x86_64 architecture.
    5. Select the role created before for permissions.
    6. Click Create.
  2. Click the Code tab to add a script to collect the finalized data.

    1. Copy the Athena query function and add it to the Code tab. Update <integration_uuid> with the integration_uuid from the integration that you created on console.redhat.com. You can find it by going to the Integrations page and clicking your integration. Update the BUCKET and DATABASE variables with the bucket name and database that you created. Then, update export_name with the name of the data export that you used in your Athena query.
    2. Remove the comment from the following code:

      # last_month = now.replace(day=1) - timedelta(days=1)
      # year = last_month.strftime("%Y")
      # month = last_month.strftime("%m")
      # day = last_month.strftime("%d")
      # file_name = 'finalized-data.json'
    3. Click Deploy. Then click Test to see the execution results.
  3. Create a Lambda function to post the report files to cost management:

    1. Select Author from scratch.
    2. Name your function.
    3. Select the Python runtime.
    4. Select the x86_64 architecture.
    5. Select the role created before for permissions.
    6. Click Create.
  4. Click the Code tab to add a script to post the finalized data.

    1. Copy the post function and add it to the Code tab. Update secret_name with the name of your secret in AWS Secrets Manager. Update bucket with the name of the bucket that you created.
    2. Remove the comment from the following code:

      # file_name = 'finalized_data.json'
    3. Click Deploy. Then click Test to see the execution results.
  5. Create an EventBridge schedule to trigger the two functions. For more information, see Section 2.7, “Creating EventBridge schedules”.

    1. Set the EventBridge schedule to run one time a month on or after the 15th of the month because your AWS bill for the earlier period is final by that date. For example, (0 9 15 * ? *) and (0 21 15 * ? *).

After completing these steps, cost management will begin collecting Cost and Usage data from your AWS account and any linked AWS accounts.

Note

The data can take a few days to populate before it shows on the cost management dashboard.
