Integrating Amazon Web Services (AWS) data into cost management
Learn how to add your AWS integrations and RHEL metering
Chapter 1. Creating an Amazon Web Services integration
To add an Amazon Web Services (AWS) account to cost management, you must configure your AWS account to provide metrics, then add your AWS account as a cloud integration from the cost management user interface.
To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
When you add an AWS integration, you create a read-only connection to AWS so that cost management can collect your data hourly. This process does not make any changes to the AWS account.
To add your AWS account to cost management as an integration, you must configure the following services on your AWS account to allow cost management to have access to your metrics:
- An S3 bucket to store cost and usage data reporting for cost management
- An Identity Access Management (IAM) policy and role for cost management to process the cost and usage data
Since you will complete some of the following steps in the AWS console, and some steps in the cost management user interface, keep both applications open in a web browser.
To ensure that you have the most up-to-date information about AWS, refer to the AWS documentation.
If you are using RHEL metering, you must use the following steps to set up your account. You cannot use the steps to filter your data in Filtering your Amazon Web Services data before sending it to cost management. After you integrate your data with cost management, go to Adding RHEL metering to an AWS integration to finish configuring your integration for RHEL metering.
1.1. Adding an AWS account as an integration
Add an AWS integration so that the cost management application can process the Cost and Usage Reports from your AWS account. You can add an AWS integration automatically by providing your AWS account credentials, or you can configure cost management to filter the data that you send to Red Hat.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- On the Select integration type step, in the Add a cloud integration wizard, select . Click .
- Enter a name for the integration and click Next.
On the Select configuration step, select how you want to connect to your AWS integration.
- Select to provide your AWS account credentials and let Red Hat configure and manage your integration for you. Click .
- Select Manual configuration to customize your integration. You can filter your information before it is sent to cost management. For instructions on how to filter your data, see Chapter 2, Filtering your Amazon Web Services data before sending it to cost management. If you are using cost management to meter your RHEL subscription, you must select Manual configuration.
- In the Select application step, select Cost management. Click .
- If you selected the account authorization method, on the Review details step, review the details and click . If you selected the manual configuration method, continue to the next step in the wizard and configure your S3 bucket.
1.2. Creating an S3 bucket and a data export
Create an Amazon S3 bucket with permissions configured to store your data exports.
Procedure
To create a data export, log in to your AWS account and complete the following steps:
- In the AWS S3 console, create a new S3 bucket or use an existing bucket. If you are configuring a new S3 bucket, accept the default settings.
- On the Create storage step, in the Add a cloud integration wizard, paste the name of your S3 bucket and select the region that it was created in. Click .
- In the AWS Billing console, create a data export that will be delivered to your S3 bucket.
Enter the following values and accept the defaults for any other values:
- Export type: Legacy CUR export
- Report name: koku
- Include: resource IDs
- Time unit: Hourly
- Enable report data integration for: Amazon Redshift, Amazon QuickSight, and disable report data integration for Amazon Athena
- Compression type: GZIP
- S3 bucket: <the S3 bucket that you configured before>
- Report path prefix: cost
Note: For more details on configuration, see the AWS Billing and Cost Management documentation.
- In the Add a cloud integration wizard, on the Create cost and usage report step, click .
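If you prefer to script this step, the same legacy CUR export can be registered through the AWS Cost and Usage Report API. The following boto3 sketch mirrors the values listed above; the bucket name and region are placeholders for your own, and actually running the call requires AWS credentials with billing permissions:

```python
# Hedged sketch: mirrors the console values from this section.
# "my-cost-bucket" and "us-east-1" are placeholders for your own bucket/region.
report_definition = {
    "ReportName": "koku",
    "TimeUnit": "HOURLY",
    "Format": "textORcsv",                        # legacy CUR CSV format
    "Compression": "GZIP",
    "AdditionalSchemaElements": ["RESOURCES"],    # include resource IDs
    "S3Bucket": "my-cost-bucket",
    "S3Prefix": "cost",
    "S3Region": "us-east-1",
    "AdditionalArtifacts": ["REDSHIFT", "QUICKSIGHT"],  # Athena disabled
    "RefreshClosedReports": True,
    "ReportVersioning": "CREATE_NEW_REPORT",
}

def create_cur_export(definition):
    """Register the export; the CUR API endpoint lives in us-east-1."""
    import boto3  # imported here so the sketch can be read without boto3 installed
    cur = boto3.client("cur", region_name="us-east-1")
    cur.put_report_definition(ReportDefinition=definition)

# create_cur_export(report_definition)  # uncomment to run with valid AWS credentials
```

This is a sketch under the stated assumptions, not a replacement for the console procedure; verify the definition in the AWS Billing console afterward.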
1.3. Activating AWS tags
To use tags to organize your AWS resources in the cost management application, activate your tags in AWS to allow them to be imported automatically.
Procedure
In the AWS Billing console:
- Open the Cost Allocation Tags section.
- Select the tags you want to use in the cost management application, and click Activate.
If your organization is converting systems from CentOS 7 to RHEL and using hourly billing, activate the com_redhat_rhel tag for your systems in the Cost Allocation Tags section of the AWS console.
- After tagging the instances of RHEL you want to meter in AWS, select .
- In the Red Hat Hybrid Cloud Console Integrations wizard, click .
Additional resources
For more information about tagging, see Adding tags to an AWS resource.
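Tag activation can also be scripted through the AWS Cost Explorer API. The sketch below is hedged: the "project" tag key is a hypothetical example of a user-defined tag, and the call must run with credentials for the payer account:

```python
# Example payload; com_redhat_rhel is the tag named in this section,
# "project" is a hypothetical user-defined tag.
tags_to_activate = [
    {"TagKey": "com_redhat_rhel", "Status": "Active"},
    {"TagKey": "project", "Status": "Active"},
]

def activate_cost_allocation_tags(tags):
    """Activate cost allocation tags so they appear in Cost and Usage Reports."""
    import boto3  # imported here; requires AWS credentials for the payer account
    ce = boto3.client("ce")
    ce.update_cost_allocation_tags_status(CostAllocationTagsStatus=tags)

# activate_cost_allocation_tags(tags_to_activate)  # uncomment to run
```

Note that tags only become available for activation after they have appeared in billing data at least once.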
1.4. Configure an IAM policy to enable minimal account access for Cost and Usage Reports
To provide data in the web interface and API, cost management must consume the Cost and Usage Reports produced by AWS. To only provide access to the stored information and nothing else, create an IAM policy and role for cost management to use.
Procedure
From the AWS Identity and Access Management (IAM) console, create a new IAM policy for the S3 bucket that you configured previously.
Select the JSON tab and paste the following content in the JSON policy text box:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Resource": [
                "arn:aws:s3:::<your_bucket_name>", 1
                "arn:aws:s3:::<your_bucket_name>/*"
            ]
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "cur:DescribeReportDefinitions"
            ],
            "Resource": "*"
        }
    ]
}
1. Replace <your_bucket_name> in both locations with the name of the Amazon S3 bucket you configured previously.
- Enter a name for the policy and create the policy. Do not close the AWS IAM console. You will use it in the following steps.
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, click .
In the AWS IAM console, create a new IAM role:
- Select Another AWS account as the type of trusted entity.
- Enter 589173575009 as the Account ID to provide the cost management application with read access to the AWS account cost data.
- Attach the IAM policy you just configured.
- Enter a role name and description.
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, click .
In the AWS IAM console, in the Roles section, open the summary screen for the role you just created.
- Copy the Role ARN, which is a string beginning with arn:aws:.
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, paste your Role ARN and click Next.
- Review the details and click Finish to add the AWS account to cost management.
Cost management will begin collecting Cost and Usage data from your AWS account and any linked AWS accounts.
The data can take a few days to populate before it shows on the cost management dashboard.
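For teams that script their AWS setup, the policy and role steps in this section can be sketched with boto3. This is a hedged sketch: the policy document matches the minimal-access JSON above, the policy and role names are hypothetical placeholders, 589173575009 is the Red Hat account ID from the procedure, and running it requires IAM permissions in your account:

```python
import json

BUCKET = "my-cost-bucket"            # placeholder: your S3 bucket name
RED_HAT_ACCOUNT_ID = "589173575009"  # grants cost management read access

# Same minimal-access policy as shown in this section.
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": ["s3:Get*", "s3:List*"],
            "Resource": [f"arn:aws:s3:::{BUCKET}", f"arn:aws:s3:::{BUCKET}/*"],
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "cur:DescribeReportDefinitions"],
            "Resource": "*",
        },
    ],
}

# Trust policy that lets the Red Hat account assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{RED_HAT_ACCOUNT_ID}:root"},
            "Action": "sts:AssumeRole",
        }
    ],
}

def create_policy_and_role():
    """Create the IAM policy and cross-account role, then attach the policy."""
    import boto3  # imported here; requires AWS credentials with IAM permissions
    iam = boto3.client("iam")
    policy = iam.create_policy(
        PolicyName="CostManagementPolicy",  # hypothetical name
        PolicyDocument=json.dumps(policy_document),
    )
    iam.create_role(
        RoleName="CostManagementRole",      # hypothetical name
        AssumeRolePolicyDocument=json.dumps(trust_policy),
        Description="Read-only access for Red Hat cost management",
    )
    iam.attach_role_policy(
        RoleName="CostManagementRole",
        PolicyArn=policy["Policy"]["Arn"],
    )

# create_policy_and_role()  # uncomment to run
```

After the role exists, copy its ARN into the Add a cloud integration wizard as described above.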
1.4.1. Enabling additional account access for cost and usage consumption
Cost management can display additional data that might be useful. For example:
- Include the Action iam:ListAccountAliases to display an AWS account alias rather than an account number in cost management.
- Include the Actions organizations:List* and organizations:Describe* to obtain the display names of AWS member accounts, rather than account IDs, if you are using consolidated billing.
The following configuration provides access to additional stored information and nothing else.
Procedure
- From the AWS Identity and Access Management (IAM) console, create a new IAM policy for the S3 bucket you configured before.
Select the JSON tab and paste the following content in the JSON policy text box:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Resource": [
                "arn:aws:s3:::<your_bucket_name>", 1
                "arn:aws:s3:::<your_bucket_name>/*"
            ]
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "iam:ListAccountAliases",
                "s3:ListBucket",
                "cur:DescribeReportDefinitions",
                "organizations:List*",
                "organizations:Describe*"
            ],
            "Resource": "*"
        }
    ]
}
1. Replace <your_bucket_name> in both locations with the name of the Amazon S3 bucket you configured before.
The remaining configuration steps are the same as in Section 1.4, “Configure an IAM policy to enable minimal account access for Cost and Usage Reports”.
1.5. Configuring AWS billing plans
By default, cost management calculates AWS cost according to your usage cost for that date. If you have a special billing arrangement with AWS such as amortized billing or blended rates, you can configure these calculations from the cost management settings page. This allows your cost reports to more accurately reflect your AWS billing.
For more information about AWS billing, see Understanding Consolidated Bills in the AWS documentation.
Cost management supports three cost calculation options to accommodate AWS billing plans:
- Unblended (Default)
- Your costs are calculated according to your usage cost for that date.
- Amortized
- Your recurring and upfront costs are distributed evenly throughout the billing period.
- Blended
- Your costs are calculated according to AWS blended rates.
This procedure describes how to set your cost calculation to Amortized or Blended from the default Unblended.
Prerequisites
- AWS integration added to cost management.
- Access to Red Hat Hybrid Cloud Console as an Organization Administrator.
Procedure
- From Red Hat Hybrid Cloud Console, navigate to the cost management settings page.
- Under Show cost as, select Amortized or Blended.
- Click Save.
Chapter 2. Filtering your Amazon Web Services data before sending it to cost management
To share only a subset of your billing information with Red Hat, configure AWS to copy your cost exports to a separate object storage bucket and filter the data with a function script. This option is recommended only if your organization has third-party data limitations.
To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
To configure your AWS account to be a cost management data integration:
- Create an AWS S3 bucket to store your cost data.
- Create an AWS S3 bucket to report your filtered cost management data.
- Configure IAM roles for your cost data bucket.
- Add your AWS integrations to Red Hat Hybrid Cloud Console.
- Configure IAM roles for AWS Athena.
- Enable Athena.
- Create Lambda tasks for Athena to export filtered data to your S3 bucket.
Because you must complete some of the following steps in the AWS Console, and some steps in the cost management user interface, keep both applications open in a web browser.
If you are using RHEL metering, do not complete the following steps. Instead, use the unfiltered steps in Chapter 1, Creating an Amazon Web Services integration.
2.1. Adding an AWS account as an integration
Add an AWS integration so that the cost management application can process the Cost and Usage Reports from your AWS account. You can add an AWS integration automatically by providing your AWS account credentials, or you can configure cost management to filter the data that you send to Red Hat.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- On the Select integration type step, in the Add a cloud integration wizard, select . Click .
- Enter a name for the integration and click Next.
- Select to customize your integration. For example, you can filter your information before it is sent to cost management. Click .
- In the Select application step, select Cost management. Click .
2.2. Creating an AWS S3 bucket for storing your cost data
Create an Amazon S3 bucket with permissions configured to store billing reports.
Procedure
- Log in to your AWS account to begin configuring cost and usage reporting.
- In the AWS S3 console, create a new S3 bucket or use an existing bucket. If you are configuring a new S3 bucket, accept the default settings.
In the AWS Billing console, create a data export that will be delivered to your S3 bucket. Specify the following values and accept the defaults for any other values:
- Report name: <rh_cost_report> (note this name as you will use it later)
- Additional report details: Include resource IDs
- S3 bucket: <the S3 bucket you configured previously>
- Time granularity: Hourly
- Enable report data integration for: Amazon Redshift, Amazon QuickSight (do not enable report data integration for Amazon Athena)
- Compression type: GZIP
- Report path prefix: cost
Note: See the AWS Billing and Cost Management documentation for more details on configuration.
- On the Create storage step, in the Add a cloud integration wizard, paste the name of your S3 bucket and select the region that it was created in. Click .
- In the Add a cloud integration wizard, on the Create cost and usage report step, click .
2.3. Creating a data export for filtered data reporting
To create a data export in AWS, set up Athena and Lambda functions to filter the data. This process delivers your data export to your S3 bucket. Complete the following steps:
Procedure
- Log in to your AWS account.
- In the AWS Billing console, create a data export that will be delivered to your S3 bucket.
- Enter a report name. Save this name. You will use it later.
- Select Include resource IDs.
- Click Next.
- From Configure S3 Bucket, click Configure. Create a bucket and apply the default policy.
- Click Save.
- On the Create storage step, in the Add a cloud integration wizard, paste the name of your S3 bucket and select the region that it was created in and click .
- On the Create cost and usage report step in the Add a cloud integration wizard, select I wish to manually customize the CUR sent to Cost Management and click .
2.4. Activating AWS tags
To use tags to organize your AWS resources in the cost management application, activate your tags in AWS to allow them to be imported automatically.
Procedure
In the AWS Billing console:
- Open the Cost Allocation Tags section.
- Select the tags you want to use in the cost management application, and click Activate.
If your organization is converting systems from CentOS 7 to RHEL and using hourly billing, activate the com_redhat_rhel tag for your systems in the Cost Allocation Tags section of the AWS console.
- After tagging the instances of RHEL you want to meter in AWS, select .
- In the Red Hat Hybrid Cloud Console Integrations wizard, click .
Additional resources
For more information about tagging, see Adding tags to an AWS resource.
2.5. Enabling minimal account access for cost and usage consumption
For cost management to provide data, it must consume the Cost and Usage Reports produced by AWS. For cost management to obtain this data with a minimal amount of access, create an IAM policy and role for cost management to use. This configuration only provides access to the stored information.
Procedure
From the AWS Identity and Access Management (IAM) console, create a new IAM policy for the S3 bucket that you configured.
Select the JSON tab and paste the following content in the JSON policy text box:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketLocation",
                "s3:ListAllMyBuckets"
            ],
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::<your_bucket_name>",
                "arn:aws:s3:::<your_bucket_name>/*" 1
            ]
        }
    ]
}
1. Replace <your_bucket_name> in both locations with the name of the Amazon S3 bucket you configured for storing your filtered data.
- Provide a name for the policy and complete the creation of the policy. Keep the AWS IAM console open. You will need it for the next step.
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, click .
In the AWS IAM console, create a new IAM role:
- For the type of trusted entity, select AWS account.
- Enter 589173575009 as the Account ID to provide the cost management application with read access to the AWS account cost data.
- Attach the IAM policy you just configured.
- Enter a role name and description and finish creating the role.
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, click .
- In the AWS IAM console, from Roles, open the summary screen for the role you just created and copy the Role ARN. It is a string beginning with arn:aws:.
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, enter the ARN on the Enter ARN page and click .
- Review the details of your cloud integration and click .
Next steps
Return to AWS to customize your AWS data export by configuring Athena and Lambda to filter your reports.
2.6. Enabling account access for Athena
To provide data within the web interface and API, create an IAM policy and role for cost management to use. This configuration provides access to the stored information and nothing else.
Procedure
From the AWS Identity and Access Management (IAM) console, create an IAM policy for the Athena Lambda functions you will configure.
Select the JSON tab and paste the following content in the JSON policy text box:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "athena:*"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "glue:CreateDatabase",
                "glue:DeleteDatabase",
                "glue:GetDatabase",
                "glue:GetDatabases",
                "glue:UpdateDatabase",
                "glue:CreateTable",
                "glue:DeleteTable",
                "glue:BatchDeleteTable",
                "glue:UpdateTable",
                "glue:GetTable",
                "glue:GetTables",
                "glue:BatchCreatePartition",
                "glue:CreatePartition",
                "glue:DeletePartition",
                "glue:BatchDeletePartition",
                "glue:UpdatePartition",
                "glue:GetPartition",
                "glue:GetPartitions",
                "glue:BatchGetPartition"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketLocation",
                "s3:GetObject",
                "s3:ListBucket",
                "s3:ListBucketMultipartUploads",
                "s3:ListMultipartUploadParts",
                "s3:AbortMultipartUpload",
                "s3:CreateBucket",
                "s3:PutObject",
                "s3:PutBucketPublicAccessBlock"
            ],
            "Resource": [
                "arn:aws:s3:::CHANGE-ME*" 1
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::CHANGE-ME*" 2
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation",
                "s3:ListAllMyBuckets"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "sns:ListTopics",
                "sns:GetTopicAttributes"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "cloudwatch:PutMetricAlarm",
                "cloudwatch:DescribeAlarms",
                "cloudwatch:DeleteAlarms",
                "cloudwatch:GetMetricData"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "lakeformation:GetDataAccess"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "logs:*"
            ],
            "Resource": "*"
        }
    ]
}
1 2. Replace CHANGE-ME in both locations with the names of the S3 buckets that you configured.
- Provide a name for the policy and complete the creation of the policy. Keep the AWS IAM console open because you will need it for the next step.
In the AWS IAM console, create a new IAM role:
- For the type of trusted entity, select AWS service.
- Select Lambda.
- Attach the IAM policy you just configured.
- Enter a role name and description and finish creating the role.
2.6.1. Configuring Athena for report generation
Configure Athena to provide a filtered data export for cost management.
The following configuration only provides access to additional stored information. It does not provide access to anything else.
Procedure
- In the AWS S3 console, navigate to the filtered bucket that you created and download the crawler-cfn.yml file.
- From CloudFormation in the AWS console, create a new stack.
- Select Template as Ready.
- Upload the crawler-cfn.yml file that you previously downloaded. This should load immediately.
- Click Next.
- Enter a name and click Next.
- Click I acknowledge that AWS CloudFormation might create IAM resources and then click Submit.
2.6.2. Creating a Lambda function for Athena
You must create a Lambda function that queries the data export for your Red Hat related expenses and creates a report of your filtered expenses.
Procedure
- Navigate to Lambda in the AWS console and click Create function.
- Click Author from scratch.
- Enter a name for your function.
- From the Runtime dropdown, select Python 3.7.
- Select x86_64 as the Architecture.
- Under Permissions select the Athena role you created.
- Click Create function.
Paste the following code to the function:
import boto3
import uuid
import json
from datetime import datetime

now = datetime.now()
year = now.strftime("%Y")
month = now.strftime("%m")
day = now.strftime("%d")

# Vars to Change!
integration_uuid = <your_integration_uuid>  # integration_uuid
bucket = <your_S3_Bucket_Name>  # Bucket created for query results
database = 'athenacurcfn_athena_cost_and_usage'  # Database to execute athena queries
output = f's3://{bucket}/{year}/{month}/{day}/{uuid.uuid4()}'  # Output location for query results

# Athena query
query = f"SELECT * FROM {database}.koku_athena WHERE ((bill_billing_entity = 'AWS Marketplace' AND line_item_legal_entity like '%Red Hat%') OR (line_item_legal_entity like '%Amazon Web Services%' AND line_item_line_item_description like '%Red Hat%') OR (line_item_legal_entity like '%Amazon Web Services%' AND line_item_line_item_description like '%RHEL%') OR (line_item_legal_entity like '%AWS%' AND line_item_line_item_description like '%Red Hat%') OR (line_item_legal_entity like '%AWS%' AND line_item_line_item_description like '%RHEL%') OR (line_item_legal_entity like '%AWS%' AND product_product_name like '%Red Hat%') OR (line_item_legal_entity like '%Amazon Web Services%' AND product_product_name like '%Red Hat%')) AND year = '{year}' AND month = '{month}'"

def lambda_handler(event, context):
    # Initiate Boto3 athena Client
    athena_client = boto3.client('athena')

    # Trigger athena query
    response = athena_client.start_query_execution(
        QueryString=query,
        QueryExecutionContext={
            'Database': database
        },
        ResultConfiguration={
            'OutputLocation': output
        }
    )

    # Save query execution to s3 object
    s3 = boto3.client('s3')
    json_object = {
        "integration_uuid": integration_uuid,
        "bill_year": year,
        "bill_month": month,
        "query_execution_id": response.get("QueryExecutionId"),
        "result_prefix": output
    }
    s3.put_object(
        Body=json.dumps(json_object),
        Bucket=bucket,
        Key='query-data.json'
    )

    return json_object
Replace <your_integration_uuid> with the UUID from the integration you created on console.redhat.com. Replace <your_S3_Bucket_Name> with the name of the S3 bucket you created to store reports.
- Click Deploy to test the function.
2.6.3. Creating a Lambda function to post the report files
You must create a Lambda function to post your report files to the S3 bucket that you created.
Procedure
- Navigate to Lambda in the AWS console and click Create function.
- Click Author from scratch.
- Enter a name for your function.
- From the Runtime dropdown, select Python 3.7.
- Select x86_64 as the Architecture.
- Under Permissions select the Athena role you created.
- Click Create function.
Paste the following code to the function:
import boto3
import json
import requests
from botocore.exceptions import ClientError

def get_credentials(secret_name, region_name):
    session = boto3.session.Session()
    client = session.client(
        service_name='secretsmanager',
        region_name=region_name
    )
    try:
        get_secret_value_response = client.get_secret_value(
            SecretId=secret_name
        )
    except ClientError as e:
        raise e
    secret = get_secret_value_response['SecretString']
    return secret

secret_name = "CHANGEME"
region_name = "us-east-1"
secret = get_credentials(secret_name, region_name)
json_creds = json.loads(secret)
USER = json_creds.get("<your_username>")  # console.redhat.com Username
PASS = json_creds.get("<your_password>")  # console.redhat.com Password
bucket = "<your_S3_Bucket_Name>"  # Bucket for athena query results

def lambda_handler(event, context):
    # Initiate Boto3 s3 and fetch query file
    s3_resource = boto3.resource('s3')
    json_content = json.loads(s3_resource.Object(bucket, 'query-data.json').get()['Body'].read().decode('utf-8'))

    # Initiate Boto3 athena Client and attempt to fetch athena results
    athena_client = boto3.client('athena')
    try:
        athena_results = athena_client.get_query_execution(QueryExecutionId=json_content["query_execution_id"])
    except Exception as e:
        return f"Error fetching athena query results: {e} \n Consider increasing the time between running and fetching results"

    reports_list = []
    prefix = json_content["result_prefix"].split(f'{bucket}/')[-1]

    # Initiate Boto3 s3 client
    s3_client = boto3.client('s3')
    result_data = s3_client.list_objects(Bucket=bucket, Prefix=prefix)
    for item in result_data.get("Contents"):
        if item.get("Key").endswith(".csv"):
            reports_list.append(item.get("Key"))

    # Post results to console.redhat.com API
    url = "https://console.redhat.com/api/cost-management/v1/ingress/reports/"
    json_data = {
        "source": json_content["integration_uuid"],
        "reports_list": reports_list,
        "bill_year": json_content["bill_year"],
        "bill_month": json_content["bill_month"]
    }
    resp = requests.post(url, json=json_data, auth=(USER, PASS))
    return resp
Replace <your_username> with your username for console.redhat.com. Replace <your_password> with your password for console.redhat.com. Replace <your_S3_Bucket_Name> with the name of the S3 bucket that you created to store reports.
- Click Deploy to test the function.
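The function above reads its console.redhat.com credentials from AWS Secrets Manager under the name stored in secret_name. If you have not created that secret yet, a minimal sketch looks like the following; the secret name and values are placeholders, and the JSON keys must match whatever keys the Lambda function looks up (the <your_username> and <your_password> placeholders in the code above):

```python
import json

# Placeholder name and values; store your real console.redhat.com credentials.
secret_name = "cost-management-credentials"
secret_value = {
    "<your_username>": "example-user",
    "<your_password>": "example-password",
}

def create_secret(name, value):
    """Store the credentials so the Lambda function can fetch them at runtime."""
    import boto3  # imported here; requires AWS credentials with Secrets Manager permissions
    client = boto3.client("secretsmanager", region_name="us-east-1")
    client.create_secret(Name=name, SecretString=json.dumps(value))

# create_secret(secret_name, secret_value)  # uncomment to run
```

Remember to set secret_name in the Lambda function to the name you chose here, and to grant the function's execution role permission to read the secret.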
Chapter 3. Next steps for managing your costs
After adding your OpenShift Container Platform and Amazon Web Services data, cost management shows cost data by integration, along with the cost and usage related to running your OpenShift Container Platform clusters on AWS. If you are using an AWS savings plan for the EC2 instances running OpenShift nodes, cost management defaults to using the savings plan cost.
On the cost management Overview page, your cost data is sorted into OpenShift and Infrastructure tabs. Select Perspective to toggle through different views of your cost data.
You can also use the global navigation menu to view additional details about your costs by cloud provider.
Additional resources
3.1. Limiting access to cost management resources
After you add and configure integrations in cost management, you can limit access to cost data and resources.
You might not want users to have access to all of your cost data. Instead, you can grant users access only to data that is specific to their projects or organizations. With role-based access control, you can limit the visibility of resources in cost management reports. For example, you can restrict a user’s view to only AWS integrations, rather than the entire environment.
To learn how to limit access, see the more in-depth guide Limiting access to cost management resources.
3.2. Configuring tagging for your integrations
The cost management application tracks cloud and infrastructure costs with tags. Tags are also known as labels in OpenShift.
You can refine tags in cost management to filter and attribute resources, organize your resources by cost, and allocate costs to different parts of your cloud infrastructure.
You can only configure tags and labels directly on an integration. You can choose the tags that you activate in cost management, however, you cannot edit tags and labels in the cost management application.
To learn more about the following topics, see Managing cost data using tagging:
- Planning your tagging strategy to organize your view of cost data
- Understanding how cost management associates tags
- Configuring tags and labels on your integrations
3.3. Configuring cost models to accurately report costs
Now that you configured your integrations to collect cost and usage data in cost management, you can configure cost models to associate prices to metrics and usage.
A cost model is a framework that uses raw costs and metrics to define calculations for the costs in cost management. You can record, categorize, and distribute the costs that the cost model generates to specific customers, business units, or projects.
In Cost Models, you can complete the following tasks:
- Classifying your costs as infrastructure or supplementary costs
- Capturing monthly costs for OpenShift nodes and clusters
- Applying a markup to account for additional support costs
To learn how to configure a cost model, see Using cost models.
3.4. Visualizing your costs with Cost Explorer
Use cost management Cost Explorer to create custom graphs of time-scaled cost and usage information and ultimately better visualize and interpret your costs.
To learn more about the following topics, see Visualizing your costs using Cost Explorer:
- Using Cost Explorer to identify abnormal events
- Understanding how your cost data changes over time
- Creating custom bar charts of your cost and usage data
- Exporting custom cost data tables
Chapter 4. Updating an integration
If you have added an integration to cost management and want to make changes to it, you can add or remove the applications associated with your integrations in Red Hat Hybrid Cloud Console.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings .
- Click .
- Click the more options menu for your integration. Click Edit.
4.1. Adding RHEL metering to an AWS integration
If you converted from a compatible third-party Linux distribution to Red Hat Enterprise Linux (RHEL) and purchased the RHEL for third party migration listing in Amazon Web Services (AWS), you can add RHEL metering to an AWS integration.
With RHEL metering, Red Hat processes your bill to meter your hourly RHEL usage associated with a Red Hat offering in AWS.
Procedure
- In AWS, tag your instances of RHEL that you want to meter. For more information about tagging your instances of RHEL in AWS, see Adding tags to an AWS resource.
- From Red Hat Hybrid Cloud Console, click Settings .
- Click .
- Click the more options menu for your integration. Click Edit.
- In Metered Product, select Red Hat Enterprise Linux from the drop-down to activate metering.
Providing feedback on Red Hat documentation
We appreciate and prioritize your feedback regarding our documentation. Provide as much detail as possible, so that your request can be quickly addressed.
Prerequisites
- You are logged in to the Red Hat Customer Portal.
Procedure
To provide feedback, perform the following steps:
- Click the following link: Create Issue.
- Describe the issue or enhancement in the Summary text box.
- Provide details about the issue or requested enhancement in the Description text box.
- Type your name in the Reporter text box.
- Click the Create button.
This action creates a documentation ticket and routes it to the appropriate documentation team. Thank you for taking the time to provide feedback.