Chapter 2. Creating a filtered Google Cloud integration
- If you created an unfiltered Google Cloud integration, do not complete the following steps. Your Google Cloud integration is already complete.
- Google Cloud is a third-party product and its processes can change. The instructions for configuring third-party integrations are correct at the time of publishing. For the most up-to-date information, see Google’s documentation.
To share a subset of your billing data with Red Hat, you can configure a function script in Google Cloud to filter your billing data, store it in object storage, and send the CSV file names to cost management for downloading.
If you create a filtered integration and manually customize your data, you must manually send CSV files to cost management. If you fail to send the CSV files, cost management will not be able to provide you with any Red Hat Insights or cost data. If you want cost management to automatically pull and process reports, do not select I wish to manually customize the data set sent to cost management in the wizard.
Prerequisites
You must have a Red Hat account with Cloud Administrator permissions before you can add integrations to cost management.
2.1. Selecting Google Cloud as your integration provider
After you add a Google Cloud integration, send your filtered CSV data to cost management for processing.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click .
- In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click .
- Enter a name for your integration. Click .
- In the Select application step, select Cost management and click .
2.2. Creating a Google Cloud project
Create a Google Cloud project to collect and store your cost reports for Red Hat to consume.
Prerequisites
- You must have the resourcemanager.projects.create permission in Google Cloud.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating and managing projects. For reference, the following steps summarize the key points:
- Click .
- Enter a Project name and select your billing account.
- Select Organization.
- Enter the parent organization in Location.
- Click .
In cost management:
- On the Project page, enter your Project ID.
- To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to cost management.
- Click .
2.3. Creating a Google Cloud bucket
Create a bucket for the filtered reports that you will create later. Buckets are containers that store data.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Creating buckets. For reference, the following steps summarize the key points:
- Go to .
- Click .
- Name your bucket and enter any other information.
- Click , then click .
In cost management:
- On the Create cloud storage bucket page, enter your Cloud storage bucket name.
2.4. Creating a Google Cloud Identity and Access Management role
A custom Identity and Access Management (IAM) role for cost management gives access to only the cost-related resources that are required for a Google Cloud integration. It does not give access to any non-essential information.
Prerequisites
You must have the following permissions in Google Cloud Console:
- resourcemanager.projects.get
- resourcemanager.projects.getIamPolicy
- resourcemanager.projects.setIamPolicy
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:
- In Google Cloud Console, click .
- Select the project that you created.
- Click .
- Enter a Title, Description and ID for the role.
- Click .
In Enter property name or value, search for and select the following permissions for your custom role:
- storage.objects.get
- storage.objects.list
- storage.buckets.get
- Click and then click .
In cost management:
- In the Add a cloud integration wizard, on the Create IAM role page, click .
2.5. Adding a billing service account member
Create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console in your project.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Understanding roles and Creating and managing custom roles. For reference, the following steps summarize the key points:
- Click .
- Select the project that you created.
- Click .
Paste the following principal into the New principals field:
billing-export@red-hat-cost-management.iam.gserviceaccount.com
- In the Assign roles section, enter the IAM role that you created.
- Click .
Verification steps
- Navigate to .
- Verify that the new member is present with the correct role.
In cost management:
- On the Assign access page, click .
2.6. Creating a BigQuery dataset
Create a BigQuery dataset to store your billing data.
Prerequisites
You must have the bigquery.datasets.create permission.
Procedure
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Create a BigQuery dataset. For reference, the following steps summarize the key points:
- Click .
- In the Explorer panel, click the more options menu next to your project name and click Create dataset.
- Name your dataset.
- Click Create.
2.7. Exporting billing data to BigQuery
Configure Google Cloud to send cost and usage billing data automatically to the BigQuery dataset that you created.
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Export Cloud Billing data to BigQuery. For reference, the following steps summarize the key points:
- Click .
- In the Detailed usage cost section, click .
- Select the cost management Project and Billing export dataset that you created.
- Click .
Verification steps
In the Detailed usage cost section of Google Cloud Console, verify that there is an Enabled checkmark next to the correct Project name and Dataset name.
In cost management:
- In the Add a cloud integration wizard, on the Billing export page, click .
- On the Review details page, review the information about your integration and click .
- Copy your source_uuid so that you can use it when you send API requests to cost management.
2.8. Building a query with the required columns
Build a custom query to collect your cost data in a CSV file that you can send to cost management. Name your table in the format of `project.dataset.table_name` and include the backticks. To ensure that cost management can process your CSV file, you must include the following columns:
Example 2.1. Billing and Service columns:
- billing_account_id
- service.id
- service.description
- sku.id
- sku.description
Example 2.2. Project columns:
- project.id
- project.name
- project.ancestry_numbers
Example 2.3. Usage columns:
- usage_start_time
- usage_end_time
- usage.amount
- usage.unit
- usage.amount_in_pricing_units
- usage.pricing_unit
Example 2.4. Location columns:
- location.location
- location.country
- location.region
- location.zone
Example 2.5. Cost columns:
- cost
- currency
- currency_conversion_rate
- credits
- cost_type
Example 2.6. Resource columns:
- resource.name
- resource.global_name
Example 2.7. Additional datetime columns:
- partition_date
- export_time
You can also include the following optional columns for tag-based cost:
- project.labels
- labels
- system_labels
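Before uploading, you can sanity-check a generated CSV against the column list above. The helper below is an illustrative sketch, not part of cost management tooling; pass it the exact header names that your query produces, because the aliases in your SELECT statement determine the final CSV headers.

```python
import csv
import io

def missing_columns(csv_text: str, required: list[str]) -> list[str]:
    """Return the entries of `required` that are absent from the CSV header row."""
    # The first row of a BigQuery CSV export with header=true is the header.
    header = next(csv.reader(io.StringIO(csv_text)))
    present = set(header)
    return [col for col in required if col not in present]

# Hypothetical usage with a tiny sample file:
sample = "billing_account_id,cost,currency\n0123-ABCD,1.50,USD\n"
print(missing_columns(sample, ["billing_account_id", "cost", "usage_start_time"]))
```

An empty return value means every required column is present; anything else names the columns to add to your query before exporting.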
2.8.1. Example query and customization
When you build a query, you should customize it to best fit the needs of your organization. The following example can help guide you, but you should adapt it as necessary for your environment.
The following example query selects all required and optional columns. It also includes a WHERE clause to limit the amount of data that is queried to the date 2025-04-01. Some columns, like `labels`, are formatted as JSON strings. Name your table in the format `project.dataset.table_name` with backticks so that you can escape any invalid characters:
SELECT
billing_account_id,service.id,service.description,sku.id,sku.description,usage_start_time,usage_end_time,project.id,project.name,TO_JSON_STRING(project.labels),project.ancestry_numbers,TO_JSON_STRING(labels),TO_JSON_STRING(system_labels),location.location,location.country,location.region,location.zone,export_time,cost,currency,currency_conversion_rate,usage.amount,usage.unit,usage.amount_in_pricing_units,usage.pricing_unit,TO_JSON_STRING(credits),invoice.month,cost_type,resource.name,resource.global_name,DATE(_PARTITIONTIME) as partition_date
FROM `my-project.my-dataset.my-table`
WHERE TIMESTAMP_TRUNC(_PARTITIONTIME, DAY) = TIMESTAMP('2025-04-01')
If the example query is not sufficient, you can customize your filtering further with some of the following strategies:
- Use WHERE clauses to filter out specific data. For example, WHERE service.description LIKE '%Red Hat%' filters out all data that does not have a description containing "Red Hat".
- Use the conjunction and disjunction operators AND and OR to further specify your parameters.
- Use the column service.description to filter services like BigQuery or Cloud Logging.
- Use the columns project.id, project.number, and project.name to filter based on your specific project data.
- Use location.region to filter data to a specific region.
- Test and Preview your data in BigQuery to ensure that you are capturing the correct information before sending it to cost management.
For more information about creating and running queries, see Create and use tables in Google’s documentation.
2.9. Exporting CSV data from BigQuery
Google Cloud documentation provides the most up-to-date guidance for working in their console. Follow the instructions in Introduction to data export and Export table data to Cloud Storage. You can also switch coding languages in Google’s table. For reference, the following steps summarize the key points:
- Navigate to BigQuery.
- Select your dataset and the table that you created.
- Click Query to test your custom query.
- Save your query.
- Export the query result as a CSV file.
- Store the exported CSV in the bucket that you created.
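If you prefer to script the export, BigQuery also supports writing query results directly to a Cloud Storage bucket as CSV with an EXPORT DATA statement. The helper below only builds the statement text; the bucket URI is a placeholder for your own bucket, and you would still run the generated statement in BigQuery.

```python
def export_data_statement(gcs_uri: str, query: str) -> str:
    """Wrap a query in a BigQuery EXPORT DATA statement that writes CSV.

    `gcs_uri` should include a `*` wildcard, for example
    'gs://my-bucket/reports/report-*.csv', because BigQuery can shard
    large results across multiple output files.
    """
    return (
        "EXPORT DATA OPTIONS("
        f"uri='{gcs_uri}', format='CSV', overwrite=true, header=true"
        f") AS {query}"
    )

# Hypothetical usage; substitute your bucket and full query:
stmt = export_data_statement(
    "gs://my-bucket/reports/report-*.csv",
    "SELECT cost FROM `my-project.my-dataset.my-table`",
)
print(stmt)
```

The `header=true` option keeps the column names in the CSV, which cost management needs in order to process the file.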
2.10. Sending a CSV file to cost management
To send your CSV file to cost management, request a service account token for API authentication.
- In the following curl command, replace my_client_id and my_client_secret with your actual values:
curl --location 'https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token' -H 'Content-Type: application/x-www-form-urlencoded' --data-urlencode 'client_id=my_client_id' --data-urlencode 'client_secret=my_client_secret' --data-urlencode 'grant_type=client_credentials'
- Example response:
{"access_token":"ENCRYPTED_TOKEN","expires_in":900,"refresh_expires_in":0,"token_type":"Bearer","not-before-policy":0,"scope":""}
- Send the API request to cost management and indicate which CSV reports are ready to be processed. The following is an example request. In your request, update ENCRYPTED_TOKEN with your token, INTEGRATION_ID with your console.redhat.com integration ID, FILES_LIST with your list of CSV files, and YEAR and MONTH with the date of the files.
curl -X POST --location 'https://console.redhat.com/api/cost-management/v1/ingress/reports/' -H 'Authorization: Bearer ENCRYPTED_TOKEN' -H 'Content-Type: application/json' -d '{"source": "my-integration-id", "bill_year": "2025", "bill_month": "03", "reports_list": ["my-file-1.csv", "my-file-2.csv"]}'
- Example response:
{"meta": {"count": 8, "limit": 10, "offset": 0}, "links": {"first": "/api/cost-management/v1/ingress/reports/?limit=10&offset=0", "next": null, "previous": null, "last": "/api/cost-management/v1/ingress/reports/?limit=10&offset=0"}, "data": {"source": "source_uuid", "source_id": "source_id", "reports_list": ["my-csv-file.csv"], "bill_year": "2025", "bill_month": "03", "schema_name": "my-schema", "ingress_report_uuid": "report-uuid", "status": "pending"}}
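The token request and report submission can be combined in one short script. The following is a minimal sketch using only the Python standard library; the URLs come from the curl examples in this section, while the credentials, integration ID, and file names are placeholders for your own values.

```python
import json
import urllib.parse
import urllib.request

TOKEN_URL = ("https://sso.redhat.com/auth/realms/redhat-external"
             "/protocol/openid-connect/token")
INGRESS_URL = "https://console.redhat.com/api/cost-management/v1/ingress/reports/"

def build_report_payload(integration_id: str, year: str, month: str,
                         files: list[str]) -> dict:
    """Build the JSON body shown in the example request above."""
    return {
        "source": integration_id,
        "bill_year": year,
        "bill_month": month,
        "reports_list": files,
    }

def send_reports(client_id: str, client_secret: str, payload: dict) -> dict:
    """Fetch a service account token, then post the report list."""
    form = urllib.parse.urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "grant_type": "client_credentials",
    }).encode()
    with urllib.request.urlopen(urllib.request.Request(TOKEN_URL, data=form)) as resp:
        token = json.load(resp)["access_token"]
    req = urllib.request.Request(
        INGRESS_URL,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A call such as `send_reports("my_client_id", "my_client_secret", build_report_payload("my-integration-id", "2025", "03", ["my-file-1.csv"]))` performs both steps; Section 3.1 shows fuller automation examples.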
After you send your CSV file, you have successfully integrated with Google Cloud. If you want to automate the process, see Section 3.1, “Using example code snippets to automatically create and send reports” for examples.