Chapter 2. Creating a filtered Google Cloud integration


Note
  • If you created an unfiltered Google Cloud integration, do not complete the following steps. Your Google Cloud integration is already complete.
  • Google Cloud is a third-party product and its processes can change. The instructions for configuring third-party integrations are correct at the time of publishing. For the most up-to-date information, see Google’s documentation.

To share a subset of your billing data with Red Hat, you can configure a function script in Google Cloud to filter your billing data, store it in object storage, and send the CSV file names to cost management for downloading.

Warning

If you create a filtered integration and manually customize your data, you must manually send CSV files to cost management. If you fail to send the CSV files, cost management will not be able to provide you with any Red Hat Insights or cost data. If you want cost management to automatically pull and process reports, do not select I wish to manually customize the data set sent to cost management in the wizard.

Prerequisites

You must have a Red Hat account with Cloud Administrator permissions before you can add integrations to cost management.

2.1. Selecting Google Cloud as your integration provider

After you add a Google Cloud integration, send your filtered CSV data to cost management for processing.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From Red Hat Hybrid Cloud Console, click the Settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Cost management and click Next.

2.2. Creating a Google Cloud project

Create a Google Cloud project to collect and store your cost reports for Red Hat to consume.

Prerequisites

  • You must have the resourcemanager.projects.create permission in Google Cloud.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating and managing projects. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click IAM & Admin > Create a Project.
  2. Enter a Project name and select your billing account.
  3. Select Organization.
  4. Enter the parent organization in Location.
  5. Click Create.
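
If you prefer to work from the command line, the following sketch creates an equivalent project with the gcloud CLI. The project ID, organization ID, and billing account ID are placeholders, so replace them with your own values and verify the flags against Google's current CLI reference.

    # Create the project under your organization (placeholder project and organization IDs).
    gcloud projects create my-cost-project \
        --name="Cost management" \
        --organization=123456789012

    # Link the project to your billing account so that billing data is generated (placeholder billing account ID).
    gcloud billing projects link my-cost-project \
        --billing-account=AAAAAA-BBBBBB-CCCCCC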

In cost management:

  1. On the Project page, enter your Project ID.
  2. To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to cost management.
  3. Click Next.

2.3. Creating a Google Cloud bucket

Create a bucket for the filtered reports that you will create later. Buckets are containers that store data.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating buckets. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Go to Cloud Storage > Buckets.
  2. Click Create.
  3. Name your bucket and enter any other information.
  4. Click Create, then click Confirm.
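
If you prefer the command line, the following sketch creates an equivalent bucket with the gcloud CLI. The bucket name and location are placeholders; choose values that fit your environment.

    # Create the Cloud Storage bucket that will hold the filtered CSV reports (placeholder name and location).
    gcloud storage buckets create gs://my-cost-bucket --location=us-east1

    # Optionally confirm that the bucket exists.
    gcloud storage buckets describe gs://my-cost-bucket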

In cost management:

  1. On the Create cloud storage bucket page, enter your Cloud storage bucket name.

2.4. Creating a Google Cloud Identity and Access Management role

A custom Identity and Access Management (IAM) role for cost management gives access to only the cost-related resources that are required for a Google Cloud integration. It does not give access to any non-essential information.

Prerequisites

You must have the following permissions in Google Cloud Console:

  • resourcemanager.projects.get
  • resourcemanager.projects.getIamPolicy
  • resourcemanager.projects.setIamPolicy

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:

  1. In Google Cloud Console, click IAM & Admin > Roles.
  2. Select the project that you created.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role.
  5. Click + ADD PERMISSIONS.
  6. In Enter property name or value, search for and select the following permissions for your custom role:

    • storage.objects.get
    • storage.objects.list
    • storage.buckets.get
  7. Click ADD and then click CREATE.
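
The same custom role can be created with the gcloud CLI, as in the following sketch. The role ID and project ID are placeholders; the permission list matches the permissions in the previous step.

    # Create a custom role that grants only the storage permissions that cost management needs (placeholder role and project IDs).
    gcloud iam roles create costManagementStorageViewer \
        --project=my-cost-project \
        --title="Cost management storage viewer" \
        --description="Read-only access to the cost management export bucket" \
        --permissions=storage.objects.get,storage.objects.list,storage.buckets.get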

In cost management:

  1. In the Add a cloud integration wizard, on the Create IAM role page, click Next.

2.5. Adding a billing service account member

In your project, create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click IAM & Admin > IAM.
  2. Select the project that you created.
  3. Click Grant Access.
  4. Paste the following principal into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, enter the IAM role that you created.
  6. Click SAVE.
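
You can make the same grant from the command line. The following sketch binds the custom role from the previous section to the Red Hat service account; the project ID and role ID are placeholders.

    # Grant the Red Hat billing export service account the custom role on your project (placeholder project and role IDs).
    gcloud projects add-iam-policy-binding my-cost-project \
        --member="serviceAccount:billing-export@red-hat-cost-management.iam.gserviceaccount.com" \
        --role="projects/my-cost-project/roles/costManagementStorageViewer"

    # List the project's IAM policy to confirm the new binding.
    gcloud projects get-iam-policy my-cost-project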

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify that the new member is present with the correct role.

In cost management:

On the Assign access page, click Next.

2.6. Creating a BigQuery dataset

Create a BigQuery dataset to store your billing data.

Prerequisites

You must have the bigquery.datasets.create permission.

Procedure

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Create a BigQuery dataset. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click BigQuery.
  2. In the Explorer panel, click the more options menu next to your project name and click Create dataset.
  3. Name your dataset.
  4. Click Create.
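
If you prefer the bq command-line tool, the following sketch creates an equivalent dataset. The project ID and dataset name are placeholders, and the dataset is created in the default location unless you specify one.

    # Create a BigQuery dataset to receive the detailed billing export (placeholder project and dataset names).
    bq mk --dataset my-cost-project:billing_data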

2.7. Exporting billing data to BigQuery

Configure Google Cloud to send cost and usage billing data automatically to the BigQuery dataset that you created.

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Export Cloud Billing data to BigQuery. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Click Billing > Billing export.
  2. Click EDIT SETTINGS in the Detailed usage cost section.
  3. Select the cost management Project and Billing export dataset that you created.
  4. Click SAVE.

Verification steps

In the Detailed usage cost section of Google Cloud Console, verify that there is an Enabled checkmark next to the correct Project name and Dataset name.
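
You can also confirm from the command line that export data is arriving after Google Cloud begins writing it, which can take some time after you enable the export. The table name below follows the naming pattern that Google typically uses for the detailed usage cost export, with a placeholder billing account suffix; replace it with the table that appears in your dataset.

    # Count the rows in the detailed billing export table (placeholder project, dataset, and table suffix).
    bq query --use_legacy_sql=false \
        'SELECT COUNT(*) AS row_count
         FROM `my-cost-project.billing_data.gcp_billing_export_resource_v1_XXXXXX_XXXXXX_XXXXXX`'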

In cost management:

  1. In the Add a cloud integration wizard, on the Billing export page, click Next.
  2. On the Review details page, review the information about your integration and click Add.
  3. Copy your source_uuid so that you can use it when you send API requests to cost management.

2.8. Building a query with the required columns

Build a custom query to collect your cost data in a CSV file that you can send to cost management. Name your table in the format of `project.dataset.table_name` and include the backticks. To ensure that cost management can process your CSV file, you must include the following columns:

Example 2.1. Billing and Service columns:

  • billing_account_id
  • service.id
  • service.description
  • sku.id
  • sku.description

Example 2.2. Project columns:

  • project.id
  • project.name
  • project.ancestry_numbers

Example 2.3. Usage columns:

  • usage_start_time
  • usage_end_time
  • usage.amount
  • usage.unit
  • usage.amount_in_pricing_units
  • usage.pricing_unit

Example 2.4. Location columns:

  • location.location
  • location.country
  • location.region
  • location.zone

Example 2.5. Cost columns:

  • cost
  • currency
  • currency_conversion_rate
  • credits
  • cost_type

Example 2.6. Resource columns:

  • resource.name
  • resource.global_name

Example 2.7. Additional datetime columns:

  • partition_date
  • export_time

You can also include the following optional columns for tag-based cost:

  • project.labels
  • labels
  • system_labels

2.8.1. Example query and customization

Important

When you build a query, you should customize it to best fit the needs of your organization. The following example can help guide you, but you should adapt it as necessary for your environment.

The following example query selects all required and optional columns. It also includes a WHERE clause to limit the amount of data that is queried to the date 2025-04-01. Some columns, such as the label and credits columns, are converted to JSON strings with TO_JSON_STRING. Name your table in the format `project.dataset.table_name` with backticks so that you can escape any invalid characters:

    SELECT
        billing_account_id,
        service.id,
        service.description,
        sku.id,
        sku.description,
        usage_start_time,
        usage_end_time,
        project.id,
        project.name,
        TO_JSON_STRING(project.labels),
        project.ancestry_numbers,
        TO_JSON_STRING(labels),
        TO_JSON_STRING(system_labels),
        location.location,
        location.country,
        location.region,
        location.zone,
        export_time,
        cost,
        currency,
        currency_conversion_rate,
        usage.amount,
        usage.unit,
        usage.amount_in_pricing_units,
        usage.pricing_unit,
        TO_JSON_STRING(credits),
        invoice.month,
        cost_type,
        resource.name,
        resource.global_name,
        DATE(_PARTITIONTIME) AS partition_date
    FROM `my-project.my-dataset.my-table`
    WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')

If the example query is not sufficient, you can customize your filtering further with some of the following strategies:

  • Use WHERE clauses to filter out specific data. For example, WHERE service.description LIKE '%Red Hat%' filters out all data that does not have a description containing "Red Hat".
  • Use the conjunction and disjunction operators AND and OR to further specify your parameters.
  • Use the column service.description to filter services like BigQuery or Cloud Logging.
  • Use the columns project.id, project.number, and project.name to filter based on your specific project data.
  • Use location.region to filter data to a specific region.

  • Test and preview your data in BigQuery to ensure that you are capturing the correct information before you send it to cost management.
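
As an example of testing a filter before you add it to your export, the following sketch runs a reduced version of the query from the bq command-line tool and keeps only Red Hat related rows for a single day. The project, dataset, and table names are placeholders, and only a few columns are selected for readability; your full export query must still include every required column listed above.

    # Preview one day of Red Hat related rows (placeholder table name; selects only a few columns).
    bq query --use_legacy_sql=false '
        SELECT billing_account_id, service.description AS service_description, sku.description AS sku_description, cost, currency
        FROM `my-project.my-dataset.my-table`
        WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP("2025-04-01")
          AND service.description LIKE "%Red Hat%"'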

For more information about creating and running queries, see Google’s documentation Create and use tables.

2.9. Exporting CSV data from BigQuery

Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Introduction to data export and Export table data to Cloud Storage. You can also switch between programming languages in Google's code samples. For reference, the following steps summarize the key points:

In Google Cloud Console:

  1. Navigate to BigQuery.
  2. Select your dataset and the table that you created.
  3. Click Query to test your custom query.
  4. Save your query.
  5. Export the query result as a CSV file.
  6. Store the exported CSV in the bucket that you created.
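
If you want to script this step instead of exporting from the console, BigQuery's EXPORT DATA statement can write query results directly to your bucket as CSV files, as in the following sketch. The bucket path and table name are placeholders, the destination URI must contain a * wildcard, and the shortened SELECT is only for illustration; replace it with the full query from Section 2.8.1 so that every required column is present.

    # Write query results straight to Cloud Storage as CSV (placeholder bucket and table; substitute your full query).
    bq query --use_legacy_sql=false '
        EXPORT DATA OPTIONS(
            uri="gs://my-cost-bucket/reports/my-report-*.csv",
            format="CSV",
            header=true,
            overwrite=true
        ) AS
        SELECT billing_account_id, service.id, service.description, cost, currency
        FROM `my-project.my-dataset.my-table`
        WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP("2025-04-01")'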

2.10. Sending a CSV file to cost management

To send your CSV file to cost management, request a service account token for API authentication.

  1. In the following curl command, replace my_client_id and my_client_secret with your actual values:
curl --location 'https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token' \
    -H 'Content-Type: application/x-www-form-urlencoded' \
    --data-urlencode 'client_id=my_client_id' \
    --data-urlencode 'client_secret=my_client_secret' \
    --data-urlencode 'grant_type=client_credentials'
  • Example response:
{"access_token":"ENCRYPTED_TOKEN","expires_in":900,"refresh_expires_in":0,"token_type":"Bearer","not-before-policy":0,"scope":""}
  2. Send the API request to cost management to indicate which CSV reports are ready to be processed. The following is an example request. In your request, replace ENCRYPTED_TOKEN with your token, the source value with your integration ID (the source_uuid that you copied earlier), the reports_list values with your CSV file names, and the bill_year and bill_month values with the date of the files.
curl -X POST --location 'https://console.redhat.com/api/cost-management/v1/ingress/reports/' \
    -H 'Authorization: Bearer ENCRYPTED_TOKEN' \
    -H 'Content-Type: application/json' \
    -d '{"source": "my-integration-id", "bill_year": "2025", "bill_month": "03", "reports_list": ["my-file-1.csv", "my-file-2.csv"]}'
  • Example response:
	{'meta': {'count': 8, 'limit': 10, 'offset': 0}, 'links': {'first': '/api/cost-management/v1/ingress/reports/?limit=10&offset=0', 'next': None, 'previous': None, 'last': '/api/cost-management/v1/ingress/reports/?limit=10&offset=0'}, 'data': {'source': 'source_uuid', 'source_id': source_id, 'reports_list': ['my-csv-file.csv'], 'bill_year': '2025', 'bill_month': '03', 'schema_name': 'my-schema', 'ingress_report_uuid': 'report-uuid', 'status': 'pending'}}

After you send your CSV file, you have successfully integrated with Google Cloud. If you want to automate the process, see Section 3.1, “Using example code snippets to automatically create and send reports” for examples.
