Integrating Google Cloud data into cost management


Cost Management Service 1-latest

Learn how to add and configure your Google Cloud integration

Red Hat Customer Content Services

Abstract

Learn how to add a Google Cloud integration to cost management. Cost management is part of the Red Hat Insights portfolio of services. The Red Hat Insights suite of advanced analytical tools helps you to identify and prioritize impacts on your operations, security, and business.

Before you create a Google Cloud (GCP) integration, decide whether you want to create a filtered or an unfiltered integration.

Unfiltered flow

An unfiltered integration enables cost management to directly read your billing reports from GCP. You can select the scope of the reports later on.

To create an unfiltered integration, go to Creating an unfiltered GCP integration.

Filtered flow

A filtered integration enables you to customize and filter your data before cost management reads it. Some customers use the filtered integration to share billing data with only certain Red Hat products. It is more complex to set up and configure a filtered integration than an unfiltered one.

Warning

If you create a filtered integration and manually customize your data, you must manually send CSV files to cost management. If you fail to send the CSV files, cost management will not be able to provide you with any insights or cost data. If you want cost management to automatically pull and process reports, do not select I wish to manually customize the data set sent to cost management in the wizard.

To create a filtered integration, go to Creating a filtered GCP integration.

Note

You must create either a filtered or unfiltered integration. Do not follow both sets of instructions.

Chapter 1. Creating an unfiltered Google Cloud integration

On the Integrations page, you can create a Google Cloud integration and configure your GCP account to give cost management access.

Note
  • If you want to create a filtered GCP integration, do not complete the following steps. Instead, go to Creating a filtered Google Cloud integration.
  • Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.

Prerequisites

You must have a Red Hat account with Cloud Administrator permissions before you can add integrations to cost management.

You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the cost management application processes the cost and usage data from your Google Cloud account and makes it viewable.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From Red Hat Hybrid Cloud Console, click the Settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Cost management and click Next.

1.2. Creating a Google Cloud project

Create a Google Cloud project to collect and store your cost reports for Red Hat to consume.

Prerequisites

  • You must have the resourcemanager.projects.create permission in Google Cloud.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating and managing projects. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click IAM & Admin > Create a Project.
  2. Enter a Project name and select your billing account.
  3. Select Organization.
  4. Enter the parent organization in Location.
  5. Click Create.

In cost management:

  1. On the Project page, enter your Project ID.
  2. Select I am OK with sending the default data set to cost management.
  3. Click Next.

A custom Identity and Access Management (IAM) role for cost management gives access to only the cost-related resources that are required for a Google Cloud Platform integration. It does not give access to any non-essential information.

Prerequisites

You must have the following permissions in Google Cloud Console:

  • resourcemanager.projects.get
  • resourcemanager.projects.getIamPolicy
  • resourcemanager.projects.setIamPolicy

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:

  1. In Google Cloud Console, click IAM & Admin > Roles.
  2. Select the project that you created.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role.
  5. Click + ADD PERMISSIONS.
  6. In Enter property name or value, search for and select the following permissions for your custom role:

    • bigquery.jobs.create
    • bigquery.tables.getData
    • bigquery.tables.get
    • bigquery.tables.list
  7. Click ADD and then click CREATE.

In cost management:

  1. In the Add a cloud integration wizard, on the Create IAM role page, click Next.

1.4. Adding a billing service account member

In your project, create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click IAM & Admin > IAM.
  2. Select the project that you created.
  3. Click Grant Access.
  4. Paste the following principal into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, enter the IAM role that you created.
  6. Click SAVE.

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify that the new member is present with the correct role.

In cost management:

On the Assign access page, click Next.

1.5. Creating a BigQuery dataset

Create a BigQuery dataset to store your billing data.

Prerequisites

You must have the bigquery.datasets.create permission.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Create a BigQuery dataset. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click BigQuery.
  2. In the Explorer panel, click the more options menu next to your project name and click Create dataset.
  3. Name your dataset.
  4. Click Create.

In cost management:

  1. In the Add a cloud integration wizard, on the Create dataset page, enter the name of the dataset that you created.
  2. Click Next.

1.6. Exporting billing data to BigQuery

Configure GCP to send cost and usage billing data automatically to the BigQuery dataset that you created.

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Export Cloud Billing data to BigQuery. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click Billing > Billing export.
  2. Click EDIT SETTINGS in the Detailed usage cost section.
  3. Select the cost management Project and Billing export dataset that you created.
  4. Click SAVE.

Verification steps

In the Detailed usage cost section of Google Cloud Console, verify that there is an Enabled checkmark next to the correct Project name and Dataset name.

In cost management:

  1. In the Add a cloud integration wizard, on the Billing export page, click Next.
  2. On the Review details page, review the information about your integration and click Add.

1.6.1. Viewing billing tables in BigQuery

You might want to review the metrics that are collected and sent to cost management. Reviewing them can also help you troubleshoot incorrect or missing data in cost management. A scripted preview example follows the procedure.

Note

Google may take several hours to export billing data to your BigQuery dataset.

Prerequisites

  • Access to Google Cloud console with bigquery.dataViewer role

Procedure

  1. Navigate to Big Data > BigQuery in Google Cloud Console.
  2. Select the cost management project in the Explorer panel.
  3. Click gcp_billing_export_v1_xxxxxx_xxxxxx_xxxxxx table under the cost management dataset.
  4. Click the Preview tab to view the metrics.
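
If you prefer to check the export from a script instead of the console Preview tab, the following minimal sketch uses the google-cloud-bigquery Python client to sample rows from the billing export table. The project, dataset, and table names are placeholders; replace them with your own values.

from google.cloud import bigquery

# Placeholder names; replace them with your project, dataset, and billing export table
PROJECT = "my-project"
DATASET = "my-dataset"
TABLE = "gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX"

client = bigquery.Client(project=PROJECT)

# Fetch a small sample of rows, similar to the Preview tab in the console
rows = client.list_rows(f"{PROJECT}.{DATASET}.{TABLE}", max_results=10)
for row in rows:
    print(dict(row))
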
Chapter 2. Creating a filtered Google Cloud integration

Note
  • If you created an unfiltered GCP integration, do not complete the following steps. Your GCP integration is already complete.
  • GCP is a third-party product and its processes can change. The instructions for configuring third-party integrations are correct at the time of publishing. For the most up-to-date information, see Google’s documentation.

To share a subset of your billing data with Red Hat, you can configure a function script in Google Cloud (GCP) to filter your billing data, store it in object storage, and send the CSV file names to cost management for downloading.

Warning

If you create a filtered integration and manually customize your data, you must manually send CSV files to cost management. If you fail to send the CSV files, cost management will not be able to provide you with any insights or cost data. If you want cost management to automatically pull and process reports, do not select I wish to manually customize the data set sent to cost management in the wizard.

Prerequisites

You must have a Red Hat account with Cloud Administrator permissions before you can add integrations to cost management.

Note
  • If you created an unfiltered Google Cloud integration, do not complete the following steps. Your integration is already complete.
  • Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.

After you add a Google Cloud integration, send your filtered CSV data to cost management for processing.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From Red Hat Hybrid Cloud Console, click the Settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Cost management and click Next.

2.2. Creating a Google Cloud project

Create a Google Cloud project to collect and store your cost reports for Red Hat to consume.

Prerequisites

  • You must have the resourcemanager.projects.create permission in Google Cloud.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating and managing projects. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click IAM & Admin > Create a Project.
  2. Enter a Project name and select your billing account.
  3. Select Organization.
  4. Enter the parent organization in Location.
  5. Click Create.

In cost management:

  1. On the Project page, enter your Project ID.
  2. To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to cost management.
  3. Click Next.

2.3. Creating a Google Cloud bucket

Create a bucket for the filtered reports that you will create later. Buckets are containers that store data.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating buckets. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Go to Cloud Storage > Buckets.
  2. Click Create.
  3. Name your bucket and enter any other information.
  4. Click Create, then click Confirm.

In cost management:

  1. On the Create cloud storage bucket page, enter your Cloud storage bucket name.

A custom Identity and Access Management (IAM) role for cost management gives access to only the cost-related resources that are required for a Google Cloud Platform integration. It does not give access to any non-essential information.

Prerequisites

You must have the following permissions in Google Cloud Console:

  • resourcemanager.projects.get
  • resourcemanager.projects.getIamPolicy
  • resourcemanager.projects.setIamPolicy

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:

  1. In Google Cloud Console, click IAM & Admin > Roles.
  2. Select the project that you created.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role.
  5. Click + ADD PERMISSIONS.
  6. In Enter property name or value, search for and select the following permissions for your custom role:

    • storage.objects.get
    • storage.objects.list
    • storage.buckets.get
  7. Click ADD and then click CREATE.

In cost management:

  1. In the Add a cloud integration wizard, on the Create IAM role page, click Next.

2.5. Adding a billing service account member

In your project, create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click IAM & Admin > IAM.
  2. Select the project that you created.
  3. Click Grant Access.
  4. Paste the following principal into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, enter the IAM role that you created.
  6. Click SAVE.

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify that the new member is present with the correct role.

In cost management:

On the Assign access page, click Next.

2.6. Creating a BigQuery dataset

Create a BigQuery dataset to store your billing data.

Prerequisites

You must have the bigquery.datasets.create permission.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Create a BigQuery dataset. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click BigQuery.
  2. In the Explorer panel, click the more options menu next to your project name and click Create dataset.
  3. Name your dataset.
  4. Click Create.

2.7. Exporting billing data to BigQuery

Configure GCP to send cost and usage billing data automatically to the BigQuery dataset that you created.

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Export Cloud Billing data to BigQuery. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click Billing > Billing export.
  2. Click EDIT SETTINGS in the Detailed usage cost section.
  3. Select the cost management Project and Billing export dataset that you created.
  4. Click SAVE.

Verification steps

In the Detailed usage cost section of Google Cloud Console, verify that there is an Enabled checkmark next to the correct Project name and Dataset name.

In cost management:

  1. In the Add a cloud integration wizard, on the Billing export page, click Next.
  2. On the Review details page, review the information about your integration and click Add.
  3. Copy your source_uuid so that you can use it when you send API requests to cost management.

2.8. Building a query with the required columns

Build a custom query to collect your cost data in a CSV file that you can send to cost management. Name your table in the format `project.dataset.table_name`, including the backticks. To ensure that cost management can process your CSV file, you must include the following columns:

Example 2.1. Billing and Service columns:

  • billing_account_id
  • service.id
  • service.description
  • sku.id
  • sku.description

Example 2.2. Project columns:

  • project.id
  • project.name
  • project.ancestry_numbers

Example 2.3. Usage columns:

  • usage_start_time
  • usage_end_time
  • usage.amount
  • usage.unit
  • usage.amount_in_pricing_units
  • usage.pricing_unit

Example 2.4. Location columns:

  • location.location
  • location.country
  • location.region
  • location.zone

Example 2.5. Cost columns:

  • cost
  • currency
  • currency_conversion_rate
  • credits
  • cost_type

Example 2.6. Resource columns:

  • resource.name
  • resource.global_name

Example 2.7. Additional datetime columns:

  • partition_date
  • export_time

You can also include the following optional columns for tag-based cost:

  • project.labels
  • labels
  • system_labels

2.8.1. Example query and customization

Important

When you build a query, you should customize it to best fit the needs of your organization. The following example can help guide you, but you should adapt it as necessary for your environment.

The following example query selects all required and optional columns. It also includes a WHERE clause to limit the amount of data that is queried to the date 2025-04-01. Some columns, like `labels`, are formatted as JSON strings. Name your table in the format `project.dataset.table_name` with backticks so that you can escape any invalid characters:

SELECT
  billing_account_id,
  service.id,
  service.description,
  sku.id,
  sku.description,
  usage_start_time,
  usage_end_time,
  project.id,
  project.name,
  TO_JSON_STRING(project.labels),
  project.ancestry_numbers,
  TO_JSON_STRING(labels),
  TO_JSON_STRING(system_labels),
  location.location,
  location.country,
  location.region,
  location.zone,
  export_time,
  cost,
  currency,
  currency_conversion_rate,
  usage.amount,
  usage.unit,
  usage.amount_in_pricing_units,
  usage.pricing_unit,
  TO_JSON_STRING(credits),
  invoice.month,
  cost_type,
  resource.name,
  resource.global_name,
  DATE(_PARTITIONTIME) AS partition_date
FROM `my-project.my-dataset.my-table`
WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')

If the example query is not sufficient, you can customize your filtering further with some of the following strategies:

  • Use WHERE clauses to filter out specific data. For example, WHERE service.description LIKE '%Red Hat%' filters out all data that does not have a description containing "Red Hat".
  • Use the conjunction and disjunction operators AND and OR to further specify your parameters.
  • Use the column service.description to filter services like BigQuery or Cloud Logging.
  • Use the columns project.id, project.number, and project.name to filter based on your specific project data.
  • Use location.region to filter data to a specific region.

  • Test and preview your data in BigQuery to ensure that you are capturing the correct information before sending it to cost management.

For more information about creating and running queries, see Google’s documentation Create and use tables.
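
If you prefer to test a query from a script rather than the console, the following minimal sketch runs a reduced version of the example query with the google-cloud-bigquery Python client and prints the results. The table name and date are placeholders; adjust the column list and WHERE clause to match your own query.

from google.cloud import bigquery

client = bigquery.Client()

# Reduced version of the example query; the table name and date are placeholders
sql = """
SELECT billing_account_id, service.description, cost, currency,
       usage_start_time, usage_end_time
FROM `my-project.my-dataset.my-table`
WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')
"""

query_job = client.query(sql)
for row in query_job.result():
    print(dict(row))
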

2.9. Exporting CSV data from BigQuery

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Introduction to data export and Export table data to Cloud Storage. You can also switch coding languages in Google’s table. For reference, the following steps summarize the key points, and a scripted example follows the procedure:

In Google Cloud Console:

  1. Navigate to BigQuery.
  2. Select your dataset and the table that you created.
  3. Click Query to test your custom query.
  4. Save your query.
  5. Export the query result as a CSV file.
  6. Store the exported CSV in the bucket that you created.
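
As an alternative to exporting the query result from the console, the following minimal sketch submits a BigQuery EXPORT DATA statement from Python to write the filtered result as CSV files to your bucket. The bucket path, table name, and date are placeholders, and the SELECT list is shortened; use the full column list from Section 2.8 in your own export.

from google.cloud import bigquery

client = bigquery.Client()

# Placeholders: bucket path, table name, and date; extend the SELECT list as needed
sql = """
EXPORT DATA OPTIONS(
  uri='gs://my-bucket/my_report_location/2025-04-01-*.csv',
  format='CSV',
  header=true,
  overwrite=true
) AS
SELECT billing_account_id, service.id, service.description, cost, currency,
       usage_start_time, usage_end_time
FROM `my-project.my-dataset.my-table`
WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')
"""

client.query(sql).result()  # waits for the export job to finish
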

2.10. Sending a CSV file to cost management

To send your CSV file to cost management, request a service account token for API authentication.

  1. In the following curl command, replace my_client_id and my_client_secret with your actual values:
curl --location 'https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token' -H 'Content-Type: application/x-www-form-urlencoded' --data-urlencode 'client_id=my_client_id' --data-urlencode 'client_secret=my_client_secret' --data-urlencode 'grant_type=client_credentials'
  • Example response:
{"access_token":"ENCRYPTED_TOKEN","expires_in":900,"refresh_expires_in":0,"token_type":"Bearer","not-before-policy":0,"scope":""}
  2. Send the API request to cost management to indicate which CSV reports are ready to be processed. The following is an example request. In your request, replace ENCRYPTED_TOKEN with your token, my-integration-id with your integration ID from console.redhat.com, the reports_list file names with your own CSV files, and bill_year and bill_month with the date of the files.
curl -X POST --location 'https://console.redhat.com/api/cost-management/v1/ingress/reports/' -H 'Authorization: Bearer ENCRYPTED_TOKEN' -H 'Content-Type: application/json' -d '{"source": "my-integration-id", "bill_year": "2025", "bill_month": "03", "reports_list": ["my-file-1.csv", "my-file-2.csv"]}'
  • Example response:
	{'meta': {'count': 8, 'limit': 10, 'offset': 0}, 'links': {'first': '/api/cost-management/v1/ingress/reports/?limit=10&offset=0', 'next': None, 'previous': None, 'last': '/api/cost-management/v1/ingress/reports/?limit=10&offset=0'}, 'data': {'source': 'source_uuid', 'source_id': source_id, 'reports_list': ['my-csv-file.csv'], 'bill_year': '2025', 'bill_month': '03', 'schema_name': 'my-schema', 'ingress_report_uuid': 'report-uuid', 'status': 'pending'}}
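
The links in the example response point to the same ingress reports endpoint. The following minimal sketch, assuming the listing returns the fields shown in the example response, sends a GET request to that endpoint to review the status of submitted reports. ENCRYPTED_TOKEN is the same service account token that you requested earlier.

import requests

headers = {'Authorization': 'Bearer ENCRYPTED_TOKEN', 'Accept': 'application/json'}
response = requests.get(
    'https://console.redhat.com/api/cost-management/v1/ingress/reports/',
    headers=headers,
)
print(response.json())  # the example response above shows a status field, for example "pending"
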

After you send your CSV file, you have successfully integrated with GCP. If you want to automate the process, see Section 3.1, “Using example code snippets to automatically create and send reports” for examples.

Chapter 3. Resources for filtered integrations

Important

After you create a filtered Google Cloud integration, you can automate creating and sending reports. The main tasks involve querying your data, formatting and exporting a CSV file, and sending the data to cost management. The following examples provide code snippets to guide you, but you should adapt the process to reflect your unique environments. If you follow the documentation exactly as written without customizations, the automation might not work for your specific setup.

The following code writes a CSV file to your GCP bucket from the query data:

import csv

from google.cloud import storage

# query_job holds the rows returned by your BigQuery query
storage_client = storage.Client()
bucket = storage_client.bucket("my_bucket")

csv_file = "my_report_location/2025-04-01.csv"
blob = bucket.blob(csv_file)
with blob.open(mode='w') as f:
    writer = csv.writer(f)
    writer.writerow(["my_col_1", "my_col_2"])  # header row
    writer.writerows(query_job)                # one CSV row per query result row

The following code adds batching to restrict the CSV file size:

from itertools import islice

def batch(iterable, n):
    """Yield successive n-sized chunks from iterable."""
    it = iter(iterable)
    while chunk := tuple(islice(it, n)):
        yield chunk

for i, rows in enumerate(batch(query_job, 200000)):
    # write csv file code block (see the combined example after this block) #
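
One way to fill in the write csv file code block placeholder is shown below. This sketch combines the batching helper with the earlier CSV-writing example, assuming the query_job and bucket objects from those snippets, and writes one object per batch so that no single file grows too large.

for i, rows in enumerate(batch(query_job, 200000)):
    csv_file = f"my_report_location/2025-04-01_{i}.csv"
    blob = bucket.blob(csv_file)
    with blob.open(mode='w') as f:
        writer = csv.writer(f)
        writer.writerow(["my_col_1", "my_col_2"])  # header row
        writer.writerows(rows)                     # rows in this batch only
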

The following code authenticates and sends reports to cost management for processing by fetching and using your service account access token:

import os
import requests

# os.getenv(var) used to fetch GCP stored secrets shared with your function
CLIENT_ID = os.getenv('client_id')
CLIENT_SECRET = os.getenv('client_secret')

# Get access token
token_url = 'https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token'
token_headers = {'Content-Type': 'application/x-www-form-urlencoded'}
token_data = {"client_id": CLIENT_ID, "client_secret": CLIENT_SECRET, "grant_type": "client_credentials"}
access_token = requests.post(token_url, headers=token_headers, data=token_data).json().get("access_token")

# Send reports to Red Hat; replace the "source" value with your integration's source_uuid
json_data = {"source": 0, "reports_list": ["my-file1.csv", "my-file-2.csv"], "bill_year": "2025", "bill_month": "07"}
headers = {'Authorization': f'Bearer {access_token}', 'Accept': 'application/json'}
requests.post("https://console.redhat.com/api/cost-management/v1/ingress/reports/", json=json_data, headers=headers)

3.1.1. Additional resources

For reference only, this Python script provides additional logic, such as restricting CSV file sizes and using variables for client secrets.

For additional help with automation, see Google’s documentation.

3.2. Troubleshooting a GCP integration

3.2.1. Incorrect data is displayed

If your data is displaying incorrectly in cost management, first download your CSV files and validate the data that you sent. If that does not solve the issue, review the following common scenarios:

  • If you upload data for the same day multiple times, only the most recent upload is displayed. Send the data again in the correct order.
  • GCP uses a method called crossover data: for each day, GCP can continue to add billing data for up to 72 hours after it accrues. To ensure that you capture these late additions, consider a rolling window. For example, you can maintain a five-day rolling window, or never query current-day data and instead always query day n-5.
  • At the start of a new month, GCP finishes billing for the previous month. To ensure that there are no gaps in your billing data, send the data for the previous month up to three days into the following month.
  • If your custom table does not have partitions by day (_PARTITIONTIME), use usage_start_time as the partition date, as shown in the sketch after this list.
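
The following minimal sketch shows that alternative filter, assuming the same google-cloud-bigquery client and placeholder table name as the earlier examples. It filters on usage_start_time instead of _PARTITIONTIME, which is useful when your custom table has no daily partitions.

from google.cloud import bigquery

client = bigquery.Client()

# Filter on usage_start_time when the table has no _PARTITIONTIME column
sql = """
SELECT billing_account_id, service.description, cost, currency, usage_start_time
FROM `my-project.my-dataset.my-table`
WHERE DATE(usage_start_time) = '2025-04-01'
"""
for row in client.query(sql).result():
    print(dict(row))
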

Part II. Updating an integration

You can pause, resume, or remove your integrations in Red Hat Hybrid Cloud Console.

Chapter 4. Editing an integration

  1. From Red Hat Hybrid Cloud Console, click the Settings icon.
  2. Click Integrations.
  3. Find the integration that you want to edit and click the more options menu.
  4. Select whether you want to Pause, Resume, or Remove your integration.

Part III. Configuring and viewing your data

After you add your OpenShift Container Platform and Google Cloud integrations, the cost management Overview page sorts your cost data into OpenShift and Infrastructure tabs. Select Perspective to toggle through different views of your cost data.

You can also use the global navigation menu to view additional details about your costs by cloud provider.

To add other types of integrations, see the other integration guides in the cost management documentation.

After you add and configure integrations in cost management, you can limit access to cost data and resources.

You might not want users to have access to all of your cost data. Instead, you can grant users access only to data that is specific to their projects or organizations. With role-based access control, you can limit the visibility of resources in cost management reports. For example, you can restrict a user’s view to only AWS integrations, rather than the entire environment.

To learn how to limit access, see the more in-depth guide Limiting access to cost management resources.

The cost management application tracks cloud and infrastructure costs with tags. Tags are also known as labels in OpenShift.

You can refine tags in cost management to filter and attribute resources, organize your resources by cost, and allocate costs to different parts of your cloud infrastructure.

Important

You can only configure tags and labels directly on an integration. You can choose the tags that you activate in cost management; however, you cannot edit tags or labels in the cost management application.

To learn more about the following topics, see Managing cost data using tagging:

  • Planning your tagging strategy to organize your view of cost data
  • Understanding how cost management associates tags
  • Configuring tags and labels on your integrations

Now that you configured your integrations to collect cost and usage data in cost management, you can configure cost models to associate prices to metrics and usage.

A cost model is a framework that uses raw costs and metrics to define calculations for the costs in cost management. You can record, categorize, and distribute the costs that the cost model generates to specific customers, business units, or projects.

In Cost Models, you can complete the following tasks:

  • Classifying your costs as infrastructure or supplementary costs
  • Capturing monthly costs for OpenShift nodes and clusters
  • Applying a markup to account for additional support costs

To learn how to configure a cost model, see Using cost models.

Use cost management Cost Explorer to create custom graphs of time-scaled cost and usage information and ultimately better visualize and interpret your costs.

To learn more about the following topics, see Visualizing your costs using Cost Explorer:

  • Using Cost Explorer to identify abnormal events
  • Understanding how your cost data changes over time
  • Creating custom bar charts of your cost and usage data
  • Exporting custom cost data tables

Providing feedback on Red Hat documentation

We appreciate and prioritize your feedback regarding our documentation. Provide as much detail as possible, so that your request can be quickly addressed.

Prerequisites

  • You are logged in to the Red Hat Customer Portal.

Procedure

To provide feedback, perform the following steps:

  1. Click the following link: Create Issue.
  2. Describe the issue or enhancement in the Summary text box.
  3. Provide details about the issue or requested enhancement in the Description text box.
  4. Type your name in the Reporter text box.
  5. Click the Create button.

This action creates a documentation ticket and routes it to the appropriate documentation team. Thank you for taking the time to provide feedback.

Legal Notice

Copyright © 2025 Red Hat, Inc.
The text of and illustrations in this document are licensed by Red Hat under a Creative Commons Attribution–Share Alike 3.0 Unported license ("CC-BY-SA"). An explanation of CC-BY-SA is available at http://creativecommons.org/licenses/by-sa/3.0/. In accordance with CC-BY-SA, if you distribute this document or an adaptation of it, you must provide the URL for the original version.
Red Hat, as the licensor of this document, waives the right to enforce, and agrees not to assert, Section 4d of CC-BY-SA to the fullest extent permitted by applicable law.
Red Hat, Red Hat Enterprise Linux, the Shadowman logo, the Red Hat logo, JBoss, OpenShift, Fedora, the Infinity logo, and RHCE are trademarks of Red Hat, Inc., registered in the United States and other countries.
Linux® is the registered trademark of Linus Torvalds in the United States and other countries.
Java® is a registered trademark of Oracle and/or its affiliates.
XFS® is a trademark of Silicon Graphics International Corp. or its subsidiaries in the United States and/or other countries.
MySQL® is a registered trademark of MySQL AB in the United States, the European Union and other countries.
Node.js® is an official trademark of Joyent. Red Hat is not formally related to or endorsed by the official Joyent Node.js open source or commercial project.
The OpenStack® Word Mark and OpenStack logo are either registered trademarks/service marks or trademarks/service marks of the OpenStack Foundation, in the United States and other countries and are used with the OpenStack Foundation's permission. We are not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.
All other trademarks are the property of their respective owners.