Integrating Google Cloud data into cost management
Learn how to add and configure your Google Cloud integration
Part I. Creating a filtered or unfiltered Google Cloud integration
Before you create a Google Cloud (GCP) integration, first decide if you want to create a filtered or unfiltered integration.
Unfiltered flow
An unfiltered integration enables cost management to directly read your billing reports from GCP. You can select the scope of the reports later on.
To create an unfiltered integration, go to Creating an unfiltered GCP integration.
Filtered flow
A filtered integration enables you to customize and filter your data before cost management reads it. Some customers use the filtered integration to share billing data with only certain Red Hat products. It is more complex to set up and configure a filtered integration than an unfiltered one.
If you create a filtered integration and manually customize your data, you must manually send CSV files to cost management. If you fail to send the CSV files, cost management will not be able to provide you with any insights or cost data. If you want cost management to automatically pull and process reports, do not select I wish to manually customize the data set sent to cost management in the wizard.
To create a filtered integration, go to Creating a filtered GCP integration.
You must create either a filtered or unfiltered integration. Do not follow both sets of instructions.
Chapter 1. Creating an unfiltered Google Cloud integration
On the Integrations page, you can create a GCP cloud integration and configure your GCP account to give cost management access.
- If you want to create a filtered GCP integration, do not complete the following steps. Instead, go to Creating a filtered Google Cloud integration.
- Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.
Prerequisites
- You must have a Red Hat account with Cloud Administrator permissions before you can add integrations to cost management.
1.1. Adding your Google Cloud account as an integration
You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the cost management application processes the cost and usage data from your Google Cloud account and makes it viewable.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
- Enter a name for your integration. Click Next.
- In the Select application step, select Cost management and click Next.
1.2. Creating a Google Cloud project
Create a Google Cloud project to collect and store your cost reports for Red Hat to consume.
Prerequisites
- You must have the resourcemanager.projects.create permission in Google Cloud.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating and managing projects. For reference, the following steps summarize the key points:
- Click IAM & Admin → Create a Project.
- Enter a Project name and select your billing account.
- Select Organization.
- Enter the parent organization in Location.
- Click Create.
In cost management:
- On the Project page, enter your Project ID.
- Select I am OK with sending the default data set to cost management.
- Click Next.
1.3. Creating a Google Cloud Identity and Access Management role
A custom Identity and Access Management (IAM) role for cost management gives access to only the cost-related resources that are required for a Google Cloud Platform integration. It does not give access to any non-essential information.
Prerequisites
You must have the following permissions in Google Cloud Console:
- resourcemanager.projects.get
- resourcemanager.projects.getIamPolicy
- resourcemanager.projects.setIamPolicy
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:
- In Google Cloud Console, click IAM & Admin → Roles.
- Select the project that you created.
- Click Create role.
- Enter a Title, Description and ID for the role.
- Click Add permissions.
In Enter property name or value, search for and select the following permissions for your custom role:
- bigquery.jobs.create
- bigquery.tables.getData
- bigquery.tables.get
- bigquery.tables.list
- Click Add, and then click Create.
In cost management:
- In the Add a cloud integration wizard, on the Create IAM role page, click Next.
1.4. Adding a billing service account member
Create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console in your project.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:
- Click IAM & Admin → IAM.
- Select the project that you created.
- Click Grant access.
Paste the following principal into the New principals field:
billing-export@red-hat-cost-management.iam.gserviceaccount.com
- In the Assign roles section, enter the IAM role that you created.
- Click Save.
Verification steps
- Navigate to IAM & Admin → IAM.
- Verify that the new member is present with the correct role.
In cost management:
- On the Assign access page, click Next.
1.5. Creating a BigQuery dataset
Create a BigQuery dataset to store your billing data.
Prerequisites
- You must have the bigquery.datasets.create permission.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Create a BigQuery dataset. For reference, the following steps summarize the key points:
- In Google Cloud Console, go to BigQuery.
- In the Explorer panel, click the more options menu next to your project name and click Create dataset.
- Name your dataset.
- Click Create.
In cost management:
- In the Add a cloud integration wizard, on the Create dataset page, enter the name of the dataset that you created.
- Click Next.
1.6. Exporting billing data to BigQuery
Configure GCP to send cost and usage billing data automatically to the BigQuery dataset that you created.
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Export Cloud Billing data to BigQuery. For reference, the following steps summarize the key points:
- Click Billing → Billing export.
- In the Detailed usage cost section, click Edit settings.
- Select the cost management Project and Billing export dataset that you created.
- Click Save.
Verification steps
In the Detailed usage cost section of Google Cloud Console, verify that there is an Enabled checkmark next to the correct Project name and Dataset name.
In cost management:
- In the Add a cloud integration wizard, on the Billing export page, click Next.
- On the Review details page, review the information about your integration and click Add.
1.6.1. Viewing billing tables in BigQuery
You may want to review the metrics collected and sent to cost management. This can also assist with troubleshooting incorrect or missing data in cost management.
Google may take several hours to export billing data to your BigQuery dataset.
Prerequisites
- Access to Google Cloud Console with the bigquery.dataViewer role
Procedure
- Navigate to BigQuery in Google Cloud Console.
- Select the cost management project in the Explorer panel.
- Click the gcp_billing_export_v1_xxxxxx_xxxxxx_xxxxxx table under the cost management dataset.
- Click the Preview tab to view the metrics.
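If you prefer to check the export from a script instead of the console, the following is a minimal sketch that uses the google-cloud-bigquery Python client. The project, dataset, and table names are placeholders, and it requires the same bigquery.dataViewer role:
# Minimal sketch: preview the detailed billing export table, similar to the
# Preview tab in the console. Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
table = client.get_table("my-project.my-dataset.gcp_billing_export_v1_xxxxxx_xxxxxx_xxxxxx")
for row in client.list_rows(table, max_results=5):  # same data as the Preview tab
    print(dict(row))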
Chapter 2. Creating a filtered Google Cloud integration
- If you created an unfiltered GCP integration, do not complete the following steps. Your GCP integration is already complete.
- GCP is a third-party product and its processes can change. The instructions for configuring third-party integrations are correct at the time of publishing. For the most up-to-date information, see Google’s documentation.
To share a subset of your billing data with Red Hat, you can configure a function script in Google Cloud (GCP) to filter your billing data, store it in object storage, and send the CSV file names to cost management for downloading.
If you create a filtered integration and manually customize your data, you must manually send CSV files to cost management. If you fail to send the CSV files, cost management will not be able to provide you with any insights or cost data. If you want cost management to automatically pull and process reports, do not select I wish to manually customize the data set sent to cost management in the wizard.
Prerequisites
- You must have a Red Hat account with Cloud Administrator permissions before you can add integrations to cost management.
2.1. Selecting Google Cloud as your integration provider
After you add a Google Cloud integration, send your filtered CSV data to cost management for processing.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
- Enter a name for your integration. Click Next.
- In the Select application step, select Cost management and click Next.
2.2. Creating a Google Cloud project
Create a Google Cloud project to collect and store your cost reports for Red Hat to consume.
Prerequisites
- You must have the resourcemanager.projects.create permission in Google Cloud.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating and managing projects. For reference, the following steps summarize the key points:
- Click IAM & Admin → Create a Project.
- Enter a Project name and select your billing account.
- Select Organization.
- Enter the parent organization in Location.
- Click Create.
In cost management:
- On the Project page, enter your Project ID.
- To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to cost management.
- Click Next.
2.3. Creating a Google Cloud bucket
Create a bucket for the filtered reports that you will create later. Buckets are containers that store data.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating buckets. For reference, the following steps summarize the key points:
- Go to Cloud Storage → Buckets.
- Click Create.
- Name your bucket and enter any other information.
- Click Continue, and then click Create.
In cost management:
- On the Create cloud storage bucket page, enter your Cloud storage bucket name.
2.4. Creating a Google Cloud Identity and Access Management role
A custom Identity and Access Management (IAM) role for cost management gives access to only the cost-related resources that are required for a Google Cloud Platform integration. It does not give access to any non-essential information.
Prerequisites
You must have the following permissions in Google Cloud Console:
- resourcemanager.projects.get
- resourcemanager.projects.getIamPolicy
- resourcemanager.projects.setIamPolicy
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:
- In Google Cloud Console, click IAM & Admin → Roles.
- Select the project that you created.
- Click Create role.
- Enter a Title, Description and ID for the role.
- Click Add permissions.
In Enter property name or value, search for and select the following permissions for your custom role:
- storage.objects.get
- storage.objects.list
- storage.buckets.get
- Click Add, and then click Create.
In cost management:
- In the Add a cloud integration wizard, on the Create IAM role page, click Next.
2.5. Adding a billing service account member
Create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console in your project.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:
- Click IAM & Admin → IAM.
- Select the project that you created.
- Click Grant access.
Paste the following principal into the New principals field:
billing-export@red-hat-cost-management.iam.gserviceaccount.com
- In the Assign roles section, enter the IAM role that you created.
- Click Save.
Verification steps
- Navigate to IAM & Admin → IAM.
- Verify that the new member is present with the correct role.
In cost management:
- On the Assign access page, click Next.
2.6. Creating a BigQuery dataset
Create a BigQuery dataset to store your billing data.
Prerequisites
- You must have the bigquery.datasets.create permission.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Create a BigQuery dataset. For reference, the following steps summarize the key points:
- In Google Cloud Console, go to BigQuery.
- In the Explorer panel, click the more options menu next to your project name and click Create dataset.
- Name your dataset.
- Click Create.
2.7. Exporting billing data to BigQuery
Configure GCP to send cost and usage billing data automatically to the BigQuery dataset that you created.
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Export Cloud Billing data to BigQuery. For reference, the following steps summarize the key points:
- Click Billing → Billing export.
- In the Detailed usage cost section, click Edit settings.
- Select the cost management Project and Billing export dataset that you created.
- Click Save.
Verification steps
In the Detailed usage cost section of Google Cloud Console, verify that there is an Enabled checkmark next to the correct Project name and Dataset name.
In cost management:
- In the Add a cloud integration wizard, on the Billing export page, click Next.
- On the Review details page, review the information about your integration and click Add.
- Copy your source_uuid so that you can use it when you send API requests to cost management.
2.8. Building a query with the required columns
Build a custom query to collect your cost data in a CSV file that you can send to cost management. Name your table in the format of `project.dataset.table_name` and include the backticks. To ensure that cost management can process your CSV file, you must include the following columns:
Example 2.1. Billing and Service columns:
- billing_account_id
- service.id
- service.description
- sku.id
- sku.description
Example 2.2. Project columns:
- project.id
- project.name
- project.ancestry_numbers
Example 2.3. Usage columns:
- usage_start_time
- usage_end_time
- usage.amount
- usage.unit
- usage.amount_in_pricing_units
- usage.pricing_unit
Example 2.4. Location columns:
- location.location
- location.country
- location.region
- location.zone
Example 2.5. Cost columns:
- cost
- currency
- currency_conversion_rate
- credits
- cost_type
Example 2.6. Resource columns:
- resource.name
- resource.global_name
Example 2.7. Additional datetime columns:
- partition_date
- export_time
You can also include the following optional columns for tag-based cost:
- project.labels
- labels
- system_labels
2.8.1. Example query and customization
When you build a query, you should customize it to best fit the needs of your organization. The following example can help guide you, but you should adapt it as necessary for your environment.
The following example query selects all required and optional columns. It also includes a WHERE clause to limit the query to data from the date 2025-04-01. Some columns, such as the label columns, are wrapped in TO_JSON_STRING so that they are formatted as JSON strings. Name your table in the format `project.dataset.table_name` with backticks so that you can escape any invalid characters:
SELECT
billing_account_id,service.id,service.description,sku.id,sku.description,usage_start_time,usage_end_time,project.id,project.name,TO_JSON_STRING(project.labels),project.ancestry_numbers,TO_JSON_STRING(labels),TO_JSON_STRING(system_labels),location.location,location.country,location.region,location.zone,export_time,cost,currency,currency_conversion_rate,usage.amount,usage.unit,usage.amount_in_pricing_units,usage.pricing_unit,TO_JSON_STRING(credits),invoice.month,cost_type,resource.name,resource.global_name,DATE(_PARTITIONTIME) as partition_date
FROM `my-project.my-dataset.my-table`
WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')
If the example query is not sufficient, you can customize your filtering further with some of the following strategies (a combined sketch follows this list):
- Use WHERE clauses to filter out specific data. For example, WHERE service.description LIKE '%Red Hat%' filters out all data that does not have a description containing "Red Hat".
- Use the conjunction and disjunction operators AND and OR to further specify your parameters.
- Use the column service.description to filter services like BigQuery or Cloud Logging.
- Use the columns project.id, project.number, and project.name to filter based on your specific project data.
- Use location.region to filter data to a specific region.
- Test and Preview your data in BigQuery to ensure you are capturing the correct information before sending it to cost management.
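For example, the following minimal sketch combines several of these strategies and previews the result with the google-cloud-bigquery Python client. The project, dataset, table, date, and region values are placeholders; adapt the filters to your own environment before exporting:
# Minimal sketch: preview a customized filtering query before exporting it.
# Project, dataset, table, date, and region values are placeholders.
from google.cloud import bigquery

QUERY = """
SELECT billing_account_id, service.description, project.id, location.region, cost, currency
FROM `my-project.my-dataset.my-table`
WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')
  AND service.description LIKE '%Red Hat%'  -- keep only Red Hat services
  AND location.region = 'us-east1'          -- keep only one region
"""

client = bigquery.Client(project="my-project")
for row in client.query(QUERY).result(max_results=10):  # preview the first rows
    print(dict(row))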
For more information about creating and running queries, see Google’s documentation Create and use tables.
2.9. Exporting CSV data from BigQuery
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Introduction to data export and Export table data to Cloud Storage. You can also switch between coding languages in the examples in Google’s documentation. For reference, the following steps summarize the key points:
- Navigate to BigQuery.
- Select your dataset and the table that you created.
- Click Query to test your custom query.
- Save your query.
- Export the query result as a CSV file.
- Store the exported CSV in the bucket that you created.
2.10. Sending a CSV file to cost management
To send your CSV file to cost management, request a service account token for API authentication.
- In the following curl command, replace my_client_id and my_client_secret with your actual values:
curl --location 'https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token' -H 'Content-Type: application/x-www-form-urlencoded' --data-urlencode 'client_id=my_client_id' --data-urlencode 'client_secret=my_client_secret' --data-urlencode 'grant_type=client_credentials'
- Example response:
{"access_token":"ENCRYPTED_TOKEN","expires_in":900,"refresh_expires_in":0,"token_type":"Bearer","not-before-policy":0,"scope":""}
{"access_token":"ENCRYPTED_TOKEN","expires_in":900,"refresh_expires_in":0,"token_type":"Bearer","not-before-policy":0,"scope":""}
- Send the API request to cost management to indicate which CSV reports are ready to be processed. The following is an example request. In your request, update ENCRYPTED_TOKEN with your token, INTEGRATION_ID with your console.redhat.com integration ID, FILES_LIST with your list of CSV files, and YEAR and MONTH with the date of the files.
curl -X POST --location 'https://console.redhat.com/api/cost-management/v1/ingress/reports/' -H 'Authorization: Bearer ENCRYPTED_TOKEN' -H 'Content-Type: application/json' -d '{"source": "my-integration-id", "bill_year": "2025", "bill_month": "03", "reports_list": ["my-file-1.csv", "my-file-2.csv"]}'
- Example response:
{'meta': {'count': 8, 'limit': 10, 'offset': 0}, 'links': {'first': '/api/cost-management/v1/ingress/reports/?limit=10&offset=0', 'next': None, 'previous': None, 'last': '/api/cost-management/v1/ingress/reports/?limit=10&offset=0'}, 'data': {'source': 'source_uuid', 'source_id': source_id, 'reports_list': ['my-csv-file.csv'], 'bill_year': '2025', 'bill_month': '03', 'schema_name': 'my-schema', 'ingress_report_uuid': 'report-uuid', 'status': 'pending'}}
After you send your CSV file, you have successfully integrated with GCP. If you want to automate the process, see Section 3.1, “Using example code snippets to automatically create and send reports” for examples.
Chapter 3. Resources for filtered integrations
3.1. Using example code snippets to automatically create and send reports
After you create a filtered Google Cloud integration, you can automate creating and sending reports. The main tasks involve querying your data, formatting and exporting a CSV file, and sending the data to cost management. The following examples provide code snippets to guide you, but you should adapt the process to reflect your unique environments. If you follow the documentation exactly as written without customizations, the automation might not work for your specific setup.
The following code writes CSV files to your GCP bucket from the query data:
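As a minimal sketch, assuming the google-cloud-bigquery and google-cloud-storage Python clients and placeholder project, query, bucket, and object names, it might look like this:
# Minimal sketch: run the filtering query in BigQuery and write the result to a
# CSV object in the bucket you created. Project, query, bucket, and object names
# are placeholders; replace them with your own values.
import csv
import io

from google.cloud import bigquery, storage

PROJECT = "my-project"
BUCKET = "my-cost-management-bucket"
OBJECT_NAME = "reports/2025-04-01.csv"
QUERY = """
SELECT billing_account_id, service.id, service.description, cost, currency
FROM `my-project.my-dataset.my-table`
WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')
"""

bq_client = bigquery.Client(project=PROJECT)
rows = bq_client.query(QUERY).result()  # waits for the query to finish

# Build the CSV in memory: one header row, then one line per result row.
buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow([field.name for field in rows.schema])
for row in rows:
    writer.writerow(list(row))

# Upload the CSV to the storage bucket that holds your filtered reports.
storage_client = storage.Client(project=PROJECT)
blob = storage_client.bucket(BUCKET).blob(OBJECT_NAME)
blob.upload_from_string(buffer.getvalue(), content_type="text/csv")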
The following code adds batching to restrict the CSV file size:
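A minimal sketch under the same assumptions, with a placeholder batch size, might look like this:
# Minimal sketch: split the query result into multiple CSV objects so that no
# single file becomes too large. Batch size, query, and names are placeholders.
import csv
import io

from google.cloud import bigquery, storage

PROJECT = "my-project"
BUCKET = "my-cost-management-bucket"
BATCH_SIZE = 200_000  # rows per CSV file; tune this for your data volume
QUERY = """
SELECT billing_account_id, service.id, service.description, cost, currency
FROM `my-project.my-dataset.my-table`
WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')
"""

bq_client = bigquery.Client(project=PROJECT)
storage_client = storage.Client(project=PROJECT)
rows = bq_client.query(QUERY).result()
columns = [field.name for field in rows.schema]

def upload_batch(batch, index):
    """Write one batch of rows to its own CSV object in the bucket."""
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(columns)
    writer.writerows(batch)
    blob = storage_client.bucket(BUCKET).blob(f"reports/2025-04-01_part{index}.csv")
    blob.upload_from_string(buffer.getvalue(), content_type="text/csv")

batch, index = [], 0
for row in rows:
    batch.append(list(row))
    if len(batch) >= BATCH_SIZE:
        upload_batch(batch, index)
        batch, index = [], index + 1
if batch:  # upload any remaining rows
    upload_batch(batch, index)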
The following code authenticates and sends reports to cost management for processing by fetching and using your service account access token:
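A minimal sketch that uses the Python requests library and the same token and ingress endpoints as the curl examples in Section 2.10 might look like the following. The client ID, client secret, integration ID, and file names are placeholders:
# Minimal sketch: fetch a service account access token from Red Hat SSO and tell
# cost management which CSV files are ready to process. Credentials, integration
# ID, and file names are placeholders.
import requests

TOKEN_URL = "https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token"
INGRESS_URL = "https://console.redhat.com/api/cost-management/v1/ingress/reports/"

CLIENT_ID = "my_client_id"
CLIENT_SECRET = "my_client_secret"
INTEGRATION_ID = "my-integration-id"
REPORTS = ["my-file-1.csv", "my-file-2.csv"]

# Exchange the service account credentials for a short-lived access token.
token_response = requests.post(
    TOKEN_URL,
    data={
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "grant_type": "client_credentials",
    },
    timeout=30,
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]

# Indicate which reports in the bucket are ready for cost management to download.
report_response = requests.post(
    INGRESS_URL,
    headers={"Authorization": f"Bearer {access_token}"},
    json={
        "source": INTEGRATION_ID,
        "bill_year": "2025",
        "bill_month": "03",
        "reports_list": REPORTS,
    },
    timeout=30,
)
report_response.raise_for_status()
print(report_response.json())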
3.1.1. Additional resources
For reference only, this Python script provides additional logic such as restricting CSV file sizes and using variables for client secrets.
For additional help with automation, see Google’s documentation.
3.2. Troubleshooting a GCP integration
3.2.1. Incorrect data is displayed
If your data is displaying incorrectly in cost management, first download your CSV files and validate the data that you sent. If that does not solve the issue, review the following common scenarios:
- If you upload data for the same day multiple times, the most recent upload is what is displayed. Send the data again in the correct order.
- GCP produces what is called crossover data: for each day, GCP can continue to add billing data for up to 72 hours after the costs accrued. To ensure that you capture the correct billing data, consider using a rolling window for these additions. For example, you can maintain a five-day rolling window, or never query current-day data and instead always query data from five days earlier (n-5). A sketch of a rolling-window filter follows this list.
- At the start of a new month, GCP finishes billing for the previous month. To ensure that there are no gaps in your billing data, send the data for the previous month up to three days into the following month.
- If your custom table does not have partitions by day (_PARTITIONTIME), use usage_start_time as the partition date.
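For example, the following minimal sketch builds a five-day rolling window for the partition filter. The table name is a placeholder, and you can substitute usage_start_time for _PARTITIONTIME if your table is not partitioned by day:
# Minimal sketch: query a five-day rolling window so that late-arriving
# ("crossover") charges are captured on later runs. The table name is a placeholder.
from datetime import date, timedelta

window_days = 5
start = date.today() - timedelta(days=window_days)
end = date.today() - timedelta(days=1)  # skip the current, still-changing day

query = f"""
SELECT *
FROM `my-project.my-dataset.my-table`
WHERE DATE(_PARTITIONTIME) BETWEEN '{start:%Y-%m-%d}' AND '{end:%Y-%m-%d}'
"""
print(query)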
Part II. Updating an integration
You can pause, resume, or remove your integrations in Red Hat Hybrid Cloud Console.
Chapter 4. Editing an integration
- From Red Hat Hybrid Cloud Console, click Settings Menu.
- Click Integrations.
- Find the integration that you want to edit and click the more options menu.
- Select whether you want to Pause, Resume, or Remove your integration.
Part III. Configuring and viewing your data
After you add your OpenShift Container Platform and Google Cloud integrations, the cost management Overview page sorts your cost data into OpenShift and Infrastructure tabs. Select Perspective to toggle through different views of your cost data.
You can also use the global navigation menu to view additional details about your costs by cloud provider.
To add other types of integrations, see the cost management documentation for those providers.
Chapter 5. Limiting access to cost management resources
After you add and configure integrations in cost management, you can limit access to cost data and resources.
You might not want users to have access to all of your cost data. Instead, you can grant users access only to data that is specific to their projects or organizations. With role-based access control, you can limit the visibility of resources in cost management reports. For example, you can restrict a user’s view to only AWS integrations, rather than the entire environment.
To learn how to limit access, see the more in-depth guide Limiting access to cost management resources.
Chapter 6. Configuring tagging for your integrations
The cost management application tracks cloud and infrastructure costs with tags. Tags are also known as labels in OpenShift.
You can refine tags in cost management to filter and attribute resources, organize your resources by cost, and allocate costs to different parts of your cloud infrastructure.
You can only configure tags and labels directly on an integration. You can choose the tags that you activate in cost management; however, you cannot edit tags and labels in the cost management application.
To learn more about the following topics, see Managing cost data using tagging:
- Planning your tagging strategy to organize your view of cost data
- Understanding how cost management associates tags
- Configuring tags and labels on your integrations
Chapter 7. Configuring cost models to accurately report costs
Now that you have configured your integrations to collect cost and usage data in cost management, you can configure cost models to associate prices with metrics and usage.
A cost model is a framework that uses raw costs and metrics to define calculations for the costs in cost management. You can record, categorize, and distribute the costs that the cost model generates to specific customers, business units, or projects.
In Cost Models, you can complete the following tasks:
- Classifying your costs as infrastructure or supplementary costs
- Capturing monthly costs for OpenShift nodes and clusters
- Applying a markup to account for additional support costs
To learn how to configure a cost model, see Using cost models.
Chapter 8. Visualizing your costs with Cost Explorer
Use cost management Cost Explorer to create custom graphs of time-scaled cost and usage information and ultimately better visualize and interpret your costs.
To learn more about the following topics, see Visualizing your costs using Cost Explorer:
- Using Cost Explorer to identify abnormal events
- Understanding how your cost data changes over time
- Creating custom bar charts of your cost and usage data
- Exporting custom cost data tables
Providing feedback on Red Hat documentation
We appreciate and prioritize your feedback regarding our documentation. Provide as much detail as possible, so that your request can be quickly addressed.
Prerequisites
- You are logged in to the Red Hat Customer Portal.
Procedure
To provide feedback, perform the following steps:
- Click the following link: Create Issue.
- Describe the issue or enhancement in the Summary text box.
- Provide details about the issue or requested enhancement in the Description text box.
- Type your name in the Reporter text box.
- Click the Create button.
This action creates a documentation ticket and routes it to the appropriate documentation team. Thank you for taking the time to provide feedback.