
Integrating Google Cloud data into hybrid committed spend


Hybrid committed spend 1-latest

Learn how to add and configure your Google Cloud integrations

Red Hat Customer Content Services

Abstract

You can add a Google Cloud Platform integration to hybrid committed spend.

Part I. Choosing a basic or advanced Google Cloud integration

To create a Google Cloud integration, first decide whether to take the basic or the advanced integration path.

Basic

For the basic option, go to Creating a Google Cloud integration: Basic.

The basic path enables cost management to read your billing reports directly from Google Cloud Platform (GCP) at a scope that you indicate.

Advanced

For the advanced option, go to Creating a Google Cloud integration: Advanced.

The advanced path enables you to customize or filter your data before cost management reads it. You might also use the advanced path if you want to share billing data with only certain Red Hat products. The advanced path requires more complex setup and configuration.

Note

You must select either the basic or the advanced path; you cannot choose both.

Chapter 1. Creating a Google Cloud integration: Basic

You must create a Google Cloud integration for hybrid committed spend from the Integrations page and configure your Google Cloud account to allow hybrid committed spend access.

Important

If you want to create a GCP integration by using the advanced path, do not complete the following steps. Instead, go to Creating a Google Cloud integration: Advanced.

You must have a Red Hat account user with Cloud Administrator permissions before you can add integrations to hybrid committed spend.

To create a Google Cloud integration, you will complete the following tasks:

  • Create a Google Cloud project for your hybrid committed spend data.
  • Create a bucket for filtered reports.
  • Create a billing service account member with the correct role to export your data to hybrid committed spend.
  • Create a BigQuery dataset to contain the cost data.
  • Create a billing export that sends the hybrid committed spend data to your BigQuery dataset.
Note

Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.

1.1. Adding your Google Cloud account as an integration

You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the hybrid committed spend application processes the cost and usage data from your Google Cloud account and makes it viewable.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From the Red Hat Hybrid Cloud Console, click the settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Hybrid committed spend and click Next.

1.2. Creating a Google Cloud project

Create a Google Cloud project to gather and send your cost reports to Red Hat.

Prerequisites

  • Access to Google Cloud Console with resourcemanager.projects.create permission

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Create a Project.
  2. On the page that appears, enter a Project name and select your billing account.
  3. Select the Organization.
  4. Enter the parent organization in the Location box.
  5. Click Create.

In cost management:

  1. On the Project page, enter your Project ID.
  2. To send the default data to Red Hat automatically, select I am OK with sending the default data set to hybrid committed spend.
  3. Click Next.

1.3. Creating a Google Cloud Identity and Access Management role

A custom Identity and Access Management (IAM) role for hybrid committed spend grants access to the specific cost-related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.

Prerequisites

  • Access to Google Cloud Console with these permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Roles.
  2. Select the project you created from the menu.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role. In this example, use customer-data-role.
  5. Click + ADD PERMISSIONS.
  6. Use the Enter property name or value field to search and select the following permissions for your custom role:

    • bigquery.jobs.create
    • bigquery.tables.getData
    • bigquery.tables.get
    • bigquery.tables.list
  7. Click ADD.
  8. Click CREATE.
  9. In the Add a cloud integration wizard, on the Create IAM role page, click Next.

1.4. Creating a billing service account member

You must create a billing service account member in your project that can export cost reports to Red Hat Hybrid Cloud Console.

Prerequisites

  • You must have access to Google Cloud Console and have the following permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project
  • A hybrid committed spend Identity and Access Management (IAM) role

In the Google Cloud Console:

  1. Click IAM & Admin > IAM.
  2. Select the project you created from the menu.
  3. Click Grant Access.
  4. Paste the following principal into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, assign the IAM role you created in Creating a Google Cloud Identity and Access Management role. In this example, use customer-data-role.
  6. Click SAVE.

In cost management:

  1. On the Assign access page, click Next.

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify the new member is present with the correct role.

1.5. Creating a Google Cloud BigQuery dataset

Create a BigQuery dataset to collect and store the billing data for hybrid committed spend.

Prerequisites

  • Access to Google Cloud Console with bigquery.datasets.create permission
  • Google Cloud project

Procedure

  1. In Google Cloud Console, click BigQuery.
  2. In the Explorer panel, select the project you created.
  3. Click the action icon for your project name.
  4. Click CREATE DATASET.
  5. Enter a name for your dataset in the Dataset ID field. In this example, use CustomerData.
  6. Click CREATE DATASET.
  7. In the Add a cloud integration wizard, on the Create dataset page, enter the name of the dataset you created.
  8. Click Next.

1.6. Exporting Google Cloud billing data to BigQuery

Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the BigQuery dataset you created in the last step.

Prerequisites

Procedure

  1. In the Google Cloud Console, click Billing > Billing export.
  2. Click the Billing export tab.
  3. Click EDIT SETTINGS in the Detailed usage cost section.
  4. Select the hybrid committed spend Project and Billing export dataset you created in the dropdown menus.
  5. Click SAVE.
  6. In the Add a cloud integration wizard, on the Billing export page, click Next.
  7. On the Review details page, review the information about your integration and click Add.

Verification steps

  1. Verify that the Detailed usage cost section shows a checkmark with Enabled and lists the correct Project name and Dataset name.
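After the first export runs, you can spot-check the data yourself from the BigQuery console. The sketch below only builds a query string; the project ID and the billing export table name are placeholders (confirm the real table name in the BigQuery Explorer panel), and only the CustomerData dataset name comes from this example.

```python
# Sketch: compose a spot-check query for the billing export table.
# The project ID and table name are placeholders -- confirm the real
# table name in the BigQuery Explorer panel after the first export.

def build_spot_check_query(project_id: str, dataset: str, table_id: str) -> str:
    """Return a query that sums cost per service in the billing export."""
    return (
        f"SELECT service.description AS service, SUM(cost) AS total_cost "
        f"FROM `{project_id}.{dataset}.{table_id}` "
        f"GROUP BY service ORDER BY total_cost DESC"
    )

query = build_spot_check_query(
    "my-project",                    # hypothetical project ID
    "CustomerData",                  # dataset name from this example
    "gcp_billing_export_v1_XXXXXX",  # placeholder export table name
)
print(query)
```

Running a query like this in the BigQuery console confirms that billing rows are landing in the dataset that cost management reads.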

Chapter 2. Creating a Google Cloud integration: Advanced

Create a Google Cloud function script that can filter your billing data, store it in object storage, and send the filtered reports to hybrid committed spend.

Important

If you created a Google Cloud integration by using the basic path, do not complete the following steps. Your Google Cloud integration is already complete.

You must have a Red Hat account user with Cloud Administrator permissions before you can add integrations to hybrid committed spend.

To create a Google Cloud integration, you will complete the following tasks:

  • Create a Google Cloud project for your hybrid committed spend data.
  • Create a bucket for filtered reports.
  • Create a billing service account member with the correct role to export your data to hybrid committed spend.
  • Create a BigQuery dataset that contains the cost data.
  • Create a billing export that sends the hybrid committed spend data to your BigQuery dataset.
Note

Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.

2.1. Adding your Google Cloud account as an integration

You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the hybrid committed spend application processes the cost and usage data from your Google Cloud account and makes it viewable.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From the Red Hat Hybrid Cloud Console, click the settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Hybrid committed spend and click Next.

2.2. Creating a Google Cloud project

Create a Google Cloud project to gather and send your cost reports to Red Hat.

Prerequisites

  • Access to Google Cloud Console with resourcemanager.projects.create permission

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Create a Project.
  2. On the page that appears, enter a Project name and select your billing account.
  3. Select the Organization.
  4. Enter the parent organization in the Location box.
  5. Click Create.

In cost management:

  1. On the Project page, enter your Project ID.
  2. To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to hybrid committed spend.
  3. Click Next.

2.3. Creating a Google Cloud bucket

Create a bucket for filtered reports that you will create later. Buckets are containers that store data.

In the Google Cloud Console:

  1. Go to Cloud Storage > Buckets.
  2. Click Create.
  3. Enter your bucket information. Name your bucket. In this example, use customer-data.
  4. Click Create, then click Confirm in the confirmation dialog.

In cost management:

  1. On the Create cloud storage bucket page, enter your Cloud storage bucket name.

2.4. Creating a Google Cloud Identity and Access Management role

A custom Identity and Access Management (IAM) role for hybrid committed spend grants access to the specific cost-related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.

Prerequisites

  • Access to Google Cloud Console with these permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Roles.
  2. Select the project you created from the menu.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role. In this example, use customer-data-role.
  5. Click + ADD PERMISSIONS.
  6. Use the Enter property name or value field to search and select the following permissions for your custom role:

    • storage.objects.get
    • storage.objects.list
    • storage.buckets.get
  7. Click ADD.
  8. Click CREATE.
  9. In the Add a cloud integration wizard, on the Create IAM role page, click Next.

2.5. Creating a billing service account member

You must create a billing service account member in your project that can export cost reports to Red Hat Hybrid Cloud Console.

Prerequisites

  • You must have access to Google Cloud Console and have the following permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project
  • A hybrid committed spend Identity and Access Management (IAM) role

In the Google Cloud Console:

  1. Click IAM & Admin > IAM.
  2. Select the project you created from the menu.
  3. Click Grant Access.
  4. Paste the following principal into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, assign the IAM role you created in Creating a Google Cloud Identity and Access Management role. In this example, use customer-data-role.
  6. Click SAVE.

In cost management:

  1. On the Assign access page, click Next.

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify the new member is present with the correct role.

2.6. Creating a Google Cloud BigQuery dataset

Create a BigQuery dataset to collect and store the billing data for hybrid committed spend.

Prerequisites

  • Access to Google Cloud Console with bigquery.datasets.create permission
  • Google Cloud project

Procedure

  1. In Google Cloud Console, click BigQuery.
  2. In the Explorer panel, select the project you created.
  3. Click the action icon for your project name.
  4. Click CREATE DATASET.
  5. Enter a name for your dataset in the Dataset ID field. In this example, use CustomerFilteredData.
  6. Click CREATE DATASET.
  7. In the Add a cloud integration wizard, on the Create dataset page, enter the name of the dataset you created.
  8. Click Next.

2.7. Exporting Google Cloud billing data to BigQuery

Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the BigQuery dataset you created in the last step.

Prerequisites

Procedure

  1. In the Google Cloud Console, click Billing > Billing export.
  2. Click the Billing export tab.
  3. Click EDIT SETTINGS in the Detailed usage cost section.
  4. Select the hybrid committed spend Project and Billing export dataset you created in the dropdown menus.
  5. Click SAVE.
  6. In the Add a cloud integration wizard, on the Billing export page, click Next.
  7. On the Review details page, review the information about your integration and click Add.
  8. Copy your source_uuid so that you can use it in the cloud function.

Verification steps

  1. Verify that the Detailed usage cost section shows a checkmark with Enabled and lists the correct Project name and Dataset name.

2.8. Creating a function to post filtered data to your storage bucket

Create a function that filters your data and adds it to the storage bucket that you created to share with Red Hat. You can use an example Python script to gather the cost data related to your Red Hat expenses from your cost exports and add it to the bucket. The script filters the cost data you created with BigQuery, removes non-Red Hat information, creates .csv files, stores them in the bucket you created, and sends the data to Red Hat.
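The shape of that filtering step can be sketched in pure Python. The service names, match markers, and CSV columns below are illustrative assumptions; the real script filters on the fields present in your BigQuery billing export, which has many more columns.

```python
import csv
import io

# Sketch of the filtering step: keep only rows whose service looks
# Red Hat related, then serialize the survivors to CSV for upload to
# the bucket. Markers and field names are illustrative assumptions.
REDHAT_MARKERS = ("red hat", "rhel", "openshift")

def filter_rows(rows):
    """Drop rows that are not related to Red Hat products."""
    return [r for r in rows
            if any(m in r["service"].lower() for m in REDHAT_MARKERS)]

def to_csv(rows):
    """Serialize filtered rows to a CSV string."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["service", "cost", "usage_start_time"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = [
    {"service": "Red Hat OpenShift", "cost": 12.5, "usage_start_time": "2024-05-01"},
    {"service": "Compute Engine", "cost": 99.0, "usage_start_time": "2024-05-01"},
]
filtered = filter_rows(rows)
print(to_csv(filtered))
```

In this sketch the Compute Engine row is dropped and only the Red Hat row reaches the CSV, which is the behavior the filtered bucket relies on.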

Prerequisites

In the Google Cloud Console:

  1. Click Security > Secret Manager to set up a secret that authenticates your function with Red Hat without storing your credentials in your function. Enable Secret Manager if it is not already enabled.
  2. From Secret Manager, click Create secret.

    1. Name your secret, add your service account Client ID, and click Create Secret.
    2. Repeat this process to save a secret for your service account Client secret.
  3. In the Google Cloud Console search bar, search for functions and select the Cloud Functions result.
  4. On the Cloud Functions page, click Create function.
  5. Name the function. In this example, use customer-data-function.
  6. In the Trigger section, select HTTPS as the trigger type.
  7. In Runtime, build, connections and security settings, click the Security and image repo tab.

    1. Click Add a secret reference.
    2. Select the client_id secret you created before.
    3. Set the reference method to Exposed as environment variable.
    4. Name the exposed environment variable client_id.
    5. Click Done.
  8. Repeat the previous steps for your client_secret.
  9. Click Next.
  10. On the Cloud Functions Code page, set the runtime to the latest Python version available.
  11. Open the requirements.txt file. Paste the following lines at the end of the file.

    requests
    google-cloud-bigquery
    google-cloud-storage
  12. Set the Entry Point to get_filtered_data.
  13. Open the main.py file.

    1. Paste the following Python script. In the section marked # Required vars to update, change the values to match your environment. Update the values for the following variables:

      INTEGRATION_ID: Your cost management integration_id
      BUCKET: Your filtered-data GCP bucket
      PROJECT_ID: Your project ID
      DATASET: Your dataset name
      TABLE_ID: Your table ID
  14. Click Deploy.
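The # Required vars to update section of the script typically looks like the sketch below. Every literal value is a placeholder for your environment; only customer-data and CustomerFilteredData come from this example, and the client_id and client_secret environment variables are the Secret Manager references you exposed earlier.

```python
import os

# Sketch of the "# Required vars to update" section. All literal values
# are placeholders; replace them with the values for your environment.
INTEGRATION_ID = "00000000-0000-0000-0000-000000000000"  # cost management integration_id
BUCKET = "customer-data"                    # filtered-data bucket from this example
PROJECT_ID = "my-project"                   # your Google Cloud project ID
DATASET = "CustomerFilteredData"            # your BigQuery dataset name
TABLE_ID = "gcp_billing_export_v1_XXXXXX"   # your billing export table ID

# Credentials come from the Secret Manager references exposed as
# environment variables in the function's security settings.
client_id = os.environ.get("client_id")
client_secret = os.environ.get("client_secret")
```

Because the credentials are read from the environment at run time, the deployed source never contains the service account secrets themselves.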

2.9. Creating a Cloud Scheduler job

Create a scheduler job that runs the function you created on a schedule, sending your filtered data to Red Hat.

Procedure

  1. Copy the Trigger URL for the function you created to post the cost reports. You will need to add it to the Google Cloud Scheduler.

    1. In the Google Cloud Console, search for functions and select the Cloud Functions result.
    2. On the Cloud Functions page, select your function, and click the Trigger tab.
    3. In the HTTP section, click Copy to clipboard.
  2. Create the scheduler job. In the Google Cloud Console, search for cloud scheduler and select the Cloud Scheduler result.
  3. Click Create job.

    1. Name your scheduler job. In this example, use CustomerFilteredDataSchedule.
    2. In the Frequency field, set the cron expression for when you want the function to run. In this example, use 0 9 * * * to run the function daily at 9 AM.
    3. Set the time zone and click Continue.
  4. Configure the execution on the next page.

    1. In the Target type field, select HTTP.
    2. In the URL field, paste the Trigger URL you copied.
    3. In the body field, paste the following code that passes into the function to trigger it.

      {"name": "Scheduler"}
    4. In the Auth header field, select Add OIDC token.
    5. Click the Service account field and click Create to create a service account and role for the scheduler job.
  5. In the Service account details step, name your service account. In this example, use scheduler-service-account. Accept the default Service account ID and click Create and Continue.

    1. In the Grant this service account access to project field, search for and select Cloud Scheduler Job Runner as the first role.
    2. Click ADD ANOTHER ROLE, then search for and select Cloud Functions Invoker.
    3. Click Continue.
    4. Click Done to finish creating the service account.
  6. Go back to the Cloud scheduler tab.
  7. On the Configure the execution page, click the Service account field.
  8. Refresh the page and select the service account you just created.
  9. Click Continue and then click Create.

After completing these steps, you have successfully set up your Google Cloud function to send reports to Red Hat. For next steps, refer to Chapter 3, Next steps for managing your costs.
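Mechanically, each run of the scheduler job issues an authenticated HTTP POST to the function's trigger URL. The sketch below only assembles that request; the URL and the token value are placeholders, because Google mints the real OIDC token from the scheduler's service account at run time.

```python
import json

# Sketch of the request Cloud Scheduler sends to the function trigger.
# The URL and token are placeholders; Google generates the real OIDC
# token from the service account attached to the scheduler job.
trigger_url = "https://REGION-PROJECT.cloudfunctions.net/customer-data-function"
body = json.dumps({"name": "Scheduler"})
headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer <OIDC token minted for the scheduler service account>",
}
print(body)
```

The {"name": "Scheduler"} body matches the payload you pasted into the job's body field; the function receives it as the request JSON.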

2.10. Creating additional cloud functions to collect finalized data

At the beginning of the month, Google Cloud finalizes the bill for the previous month. Create an additional function, and a scheduled job to trigger it, that sends these finalized reports to Red Hat so that cost management can process them.
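The commented-out date logic in the example script computes the previous month's boundaries; uncommented, it behaves like this sketch, where query_range is an assumed lookback window in days.

```python
from datetime import date, timedelta

# Sketch of the finalized-bill date logic from the example script:
# step back from the first of the current month to land on the last
# day of the previous month.
def previous_month_end(now: date, query_range: int = 5):
    month_end = now.replace(day=1) - timedelta(days=1)   # last day of prior month
    delta = now.replace(day=1) - timedelta(days=query_range)
    year = month_end.strftime("%Y")
    month = month_end.strftime("%m")
    day = month_end.strftime("%d")
    return month_end, delta, (year, month, day)

# A run on June 4 targets the finalized May bill.
month_end, delta, parts = previous_month_end(date(2024, 6, 4))
print(month_end.isoformat())  # 2024-05-31
```

This is why the scheduler for this function runs a few days into the month: the previous month's bill must already be finalized when the query executes.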

Procedure

  1. Set up a function to post reports:

    1. From Cloud Functions, select Create function.
    2. Name your function.
    3. Select HTTP trigger.
  2. In Runtime, build, connections, security settings, click Security.

    1. Click Reference secret.
    2. Select Exposed as environment variable.
    3. Select Secret version or Latest.
    4. Click Done.
    5. Repeat the process for your other secrets.
  3. Click Save.
  4. Copy your Trigger URL. Click Next.
  5. Select the latest Python runtime.
  6. Set Entry point to get_filtered_data.
  7. Add your Google Cloud function. Update the values for INTEGRATION_ID, BUCKET, PROJECT_ID, DATASET, and TABLE_ID.
  8. Remove the comments from the following lines:

     # month_end = now.replace(day=1) - timedelta(days=1)
     # delta = now.replace(day=1) - timedelta(days=query_range)
     # year = month_end.strftime("%Y")
     # month = month_end.strftime("%m")
     # day = month_end.strftime("%d")
  9. Select the requirements.txt file and add the same requirements that you added for your first function.
  10. Click Deploy.
  11. Set up a cloud scheduler to trigger your function:

    1. Go to Cloud Scheduler.
    2. Click Schedule a job.
    3. Name your schedule.
    4. Set the frequency. For example, the cron expression 0 9 4 * * runs the job at 9 AM on the fourth day of every month.
    5. Set a Time zone.
    6. Click Continue.
    7. Paste the function Trigger URL you copied earlier.
    8. In the request body, add {"name": "Scheduler"}.
    9. Set the auth header to OIDC token.
    10. Select or create a service account with the Cloud Scheduler Job Runner and Cloud Functions Invoker roles.
    11. Click Continue.
    12. Click Save.

Providing feedback on Red Hat documentation

We appreciate and prioritize your feedback regarding our documentation. Provide as much detail as possible, so that your request can be quickly addressed.

Prerequisites

  • You are logged in to the Red Hat Customer Portal.

Procedure

To provide feedback, perform the following steps:

  1. Click the following link: Create Issue.
  2. Describe the issue or enhancement in the Summary text box.
  3. Provide details about the issue or requested enhancement in the Description text box.
  4. Type your name in the Reporter text box.
  5. Click the Create button.

This action creates a documentation ticket and routes it to the appropriate documentation team. Thank you for taking the time to provide feedback.

Legal Notice

Copyright © 2024 Red Hat, Inc.
The text of and illustrations in this document are licensed by Red Hat under a Creative Commons Attribution–Share Alike 3.0 Unported license ("CC-BY-SA"). An explanation of CC-BY-SA is available at http://creativecommons.org/licenses/by-sa/3.0/. In accordance with CC-BY-SA, if you distribute this document or an adaptation of it, you must provide the URL for the original version.
Red Hat, as the licensor of this document, waives the right to enforce, and agrees not to assert, Section 4d of CC-BY-SA to the fullest extent permitted by applicable law.
Red Hat, Red Hat Enterprise Linux, the Shadowman logo, the Red Hat logo, JBoss, OpenShift, Fedora, the Infinity logo, and RHCE are trademarks of Red Hat, Inc., registered in the United States and other countries.
Linux® is the registered trademark of Linus Torvalds in the United States and other countries.
Java® is a registered trademark of Oracle and/or its affiliates.
XFS® is a trademark of Silicon Graphics International Corp. or its subsidiaries in the United States and/or other countries.
MySQL® is a registered trademark of MySQL AB in the United States, the European Union and other countries.
Node.js® is an official trademark of Joyent. Red Hat is not formally related to or endorsed by the official Joyent Node.js open source or commercial project.
The OpenStack® Word Mark and OpenStack logo are either registered trademarks/service marks or trademarks/service marks of the OpenStack Foundation, in the United States and other countries and are used with the OpenStack Foundation's permission. We are not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.
All other trademarks are the property of their respective owners.