Integrating Google Cloud data into cost management

Cost Management Service 1-latest

Learn how to add and configure your Google Cloud integration

Red Hat Customer Content Services

Abstract

Learn how to add a Google Cloud integration to cost management. Cost management is part of the Red Hat Insights portfolio of services. The Red Hat Insights suite of advanced analytical tools helps you to identify and prioritize impacts on your operations, security, and business.

Part I. Choosing a basic or advanced Google Cloud integration

To create a Google Cloud integration, first decide if you want to take a basic or advanced integration path.

Basic

For the basic option, go to Creating a Google Cloud integration: Basic.

The basic path enables cost management to read your billing reports directly from Google Cloud at the scope that you indicate.

Advanced

For the advanced option, go to Creating a Google Cloud integration: Advanced.

The advanced path enables you to customize or filter your data before cost management reads it. You might also use the advanced path if you want to share billing data only with certain Red Hat products. The advanced path requires more complex setup and configuration.

Note

You must select either the basic or the advanced path; you cannot choose both.

Chapter 1. Creating a Google Cloud integration: Basic

You must create a Google Cloud integration for cost management from the Integrations page and configure your Google Cloud account to allow cost management access.

Important

If you want to create a GCP integration by using the advanced path, do not complete the following steps. Instead, go to Creating a Google Cloud integration: Advanced.

You must have a Red Hat account user with Cloud Administrator permissions before you can add integrations to cost management.

To create a Google Cloud integration, you will complete the following tasks:

  • Create a Google Cloud project for your cost management data.
  • Add a billing service account member with the correct role to export your data to cost management.
  • Create a BigQuery dataset to contain the cost data.
  • Create a billing export that sends the cost management data to your BigQuery dataset.
Note

Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.

1.1. Adding your Google Cloud account as an integration

You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the cost management application processes the cost and usage data from your Google Cloud account and makes it viewable.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From Red Hat Hybrid Cloud Console, click the Settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Cost management and click Next.

1.2. Creating a Google Cloud project

Create a Google Cloud project to gather and send your cost reports to Red Hat.

Prerequisites

  • Access to Google Cloud Console with resourcemanager.projects.create permission

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Create a Project.
  2. On the page that appears, enter a Project name and select your billing account.
  3. Select the Organization.
  4. Enter the parent organization in the Location box.
  5. Click Create.
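
If you prefer to script this step instead of using the console, the following is a minimal sketch that uses the google-cloud-resource-manager Python client. The project ID, display name, and organization number are placeholders for your own values.

    # Minimal sketch: create the cost management project with the Resource Manager
    # client. Assumes the google-cloud-resource-manager package and credentials
    # with the resourcemanager.projects.create permission. All values below are
    # placeholders for your environment.
    from google.cloud import resourcemanager_v3

    client = resourcemanager_v3.ProjectsClient()

    project = resourcemanager_v3.Project(
        project_id="customer-billing-project",    # placeholder project ID
        display_name="customer-billing-project",  # placeholder project name
        parent="organizations/123456789012",      # placeholder parent organization
    )

    # create_project returns a long-running operation; result() waits for completion.
    operation = client.create_project(project=project)
    created = operation.result()
    print(f"Created project {created.project_id}")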

In cost management:

  1. On the Project page, enter your Project ID.
  2. Select I am OK with sending the default data set to cost management.
  3. Click Next.

1.3. Creating a Google Cloud Identity and Access Management role

A custom Identity and Access Management (IAM) role for cost management gives access to specific cost-related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.

Prerequisites

  • Access to Google Cloud Console with these permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Roles.
  2. Select the project you created from the menu.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role. In this example, use customer-data-role.
  5. Click + ADD PERMISSIONS.
  6. Use the Enter property name or value field to search and select the following permissions for your custom role:

    • bigquery.jobs.create
    • bigquery.tables.getData
    • bigquery.tables.get
    • bigquery.tables.list
  7. Click ADD.
  8. Click CREATE.
  9. In the Add a cloud integration wizard, on the Create IAM role page, click Next.

1.4. Adding a billing service account member to your Google Cloud project

You must create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console in your project.

Prerequisites

  • You must have access to Google Cloud Console and have the following permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project
  • A cost management Identity and Access Management (IAM) role

In the Google Cloud Console:

  1. Click IAM & Admin > IAM.
  2. Select the project you created from the menu.
  3. Click Grant Access.
  4. Paste the following principal into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, assign the IAM role you created in Creating a Google Cloud Identity and Access Management role. In this example, use customer-data-role.
  6. Click SAVE.
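
If you would rather grant the role programmatically, the following is a minimal sketch that uses the google-cloud-resource-manager Python client. The project ID is a placeholder, and the role name assumes the custom role ID from the previous section; adjust both to match your environment.

    # Minimal sketch: bind the Red Hat billing service account to the custom role
    # on your project. Assumes the google-cloud-resource-manager package and
    # credentials with the resourcemanager.projects.getIamPolicy/setIamPolicy
    # permissions. PROJECT_ID is a placeholder; ROLE must match the custom role
    # ID you created in the previous section.
    from google.cloud import resourcemanager_v3

    PROJECT_ID = "customer-billing-project"  # placeholder
    MEMBER = "serviceAccount:billing-export@red-hat-cost-management.iam.gserviceaccount.com"
    ROLE = f"projects/{PROJECT_ID}/roles/customer-data-role"  # adjust to your role ID

    client = resourcemanager_v3.ProjectsClient()
    resource = f"projects/{PROJECT_ID}"

    # Read-modify-write the project IAM policy to add the new binding.
    policy = client.get_iam_policy(request={"resource": resource})
    policy.bindings.add(role=ROLE, members=[MEMBER])
    client.set_iam_policy(request={"resource": resource, "policy": policy})
    print(f"Granted {ROLE} to {MEMBER}")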

In cost management:

  1. On the Assign access page, click Next.

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify the new member is present with the correct role.

1.5. Creating a Google Cloud BigQuery dataset

Create a BigQuery dataset to collect and store the billing data for cost management.

Prerequisites

  • Access to Google Cloud Console with bigquery.datasets.create permission
  • Google Cloud project

Procedure

  1. In Google Cloud Console, click BigQuery.
  2. In the Explorer panel, select the project you created.
  3. Click the action icon for your project name.
  4. Click CREATE DATASET.
  5. Enter a name for your dataset in the Dataset ID field. In this example, use CustomerData.
  6. Click CREATE DATASET.
  7. In the Add a cloud integration wizard, on the Create dataset page, enter the name of the dataset you created.
  8. Click Next.
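
As an alternative to the console steps above, the following is a minimal sketch that creates the dataset with the google-cloud-bigquery Python client. The project ID and dataset location are placeholders.

    # Minimal sketch: create the CustomerData dataset with the BigQuery client.
    # Assumes the google-cloud-bigquery package and credentials with the
    # bigquery.datasets.create permission. The project ID and location are placeholders.
    from google.cloud import bigquery

    PROJECT_ID = "customer-billing-project"  # placeholder
    client = bigquery.Client(project=PROJECT_ID)

    dataset = bigquery.Dataset(f"{PROJECT_ID}.CustomerData")  # dataset ID from this example
    dataset.location = "US"  # choose the location that matches your billing export

    created = client.create_dataset(dataset, exists_ok=True)
    print(f"Dataset {created.full_dataset_id} is ready")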

1.6. Exporting Google Cloud billing data to BigQuery

Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the BigQuery dataset you created in the last step.

Prerequisites

  • Google Cloud project
  • A BigQuery dataset

Procedure

  1. In the Google Cloud Console, click Billing > Billing export.
  2. Click the Billing export tab.
  3. Click EDIT SETTINGS in the Detailed usage cost section.
  4. From the dropdown menus, select the cost management Project and the Billing export dataset that you created.
  5. Click SAVE.
  6. In the Add a cloud integration wizard, on the Billing export page, click Next.
  7. On the Review details page, review the information about your integration and click Add.

Verification steps

  1. Verify that the Detailed usage cost section shows a checkmark and Enabled, with the correct Project name and Dataset name.

1.6.1. Viewing billing tables in BigQuery

You may want to review the metrics collected and sent to cost management. This can also assist with troubleshooting incorrect or missing data in cost management.

Note

Google may take several hours to export billing data to your BigQuery dataset.

Prerequisites

  • Access to Google Cloud console with bigquery.dataViewer role

Procedure

  1. Navigate to Big Data > BigQuery in the Google Cloud Console.
  2. Select the cost management project in the Explorer panel.
  3. Click the gcp_billing_export_v1_xxxxxx_xxxxxx_xxxxxx table under the cost management dataset.
  4. Click the Preview tab to view the metrics.
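
You can also preview the export with a query instead of the console Preview tab. The following is a minimal sketch that uses the google-cloud-bigquery Python client; the project, dataset, and table suffix are placeholders for your own values.

    # Minimal sketch: preview a few rows of the detailed billing export table.
    # Assumes the google-cloud-bigquery package and the bigquery.dataViewer role.
    # The project, dataset, and table suffix are placeholders.
    from google.cloud import bigquery

    PROJECT_ID = "customer-billing-project"  # placeholder
    TABLE = "CustomerData.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX"  # placeholder

    client = bigquery.Client(project=PROJECT_ID)
    query = f"""
        SELECT usage_start_time, service.description AS service, cost, currency
        FROM `{PROJECT_ID}.{TABLE}`
        ORDER BY usage_start_time DESC
        LIMIT 10
    """
    for row in client.query(query).result():
        print(row.usage_start_time, row.service, row.cost, row.currency)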

Chapter 2. Creating a Google Cloud integration: Advanced

Create a Google Cloud function script that can filter your billing data, store it in object storage, and send the filtered reports to cost management.

Important

If you created a Google Cloud integration by using the basic path, do not complete the following steps. Your Google Cloud integration is already complete.

You must have a Red Hat account user with Cloud Administrator permissions before you can add integrations to cost management.

To create a Google Cloud integration, you will complete the following tasks:

  • Create a Google Cloud project for your cost management data.
  • Create a bucket for filtered reports.
  • Create a billing service account member with the correct role to export your data to cost management.
  • Create a BigQuery dataset that contains the cost data.
  • Create a billing export that sends the cost management data to your BigQuery dataset.
Note

Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.

2.1. Adding your Google Cloud account as an integration

You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the cost management application processes the cost and usage data from your Google Cloud account and makes it viewable.

Prerequisites

  • To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.

Procedure

  1. From Red Hat Hybrid Cloud Console, click the Settings icon > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
  4. Enter a name for your integration. Click Next.
  5. In the Select application step, select Cost management and click Next.

2.2. Creating a Google Cloud project

Create a Google Cloud project to gather and send your cost reports to Red Hat.

Prerequisites

  • Access to Google Cloud Console with resourcemanager.projects.create permission

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Create a Project.
  2. On the page that appears, enter a Project name and select your billing account.
  3. Select the Organization.
  4. Enter the parent organization in the Location box.
  5. Click Create.

In cost management:

  1. On the Project page, enter your Project ID.
  2. To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to cost management.
  3. Click Next.

2.3. Creating a Google Cloud bucket

Create a bucket to store the filtered reports that you will generate later. Buckets are containers that store data.

In the Google Cloud Console:

  1. Go to Cloud Storage > Buckets.
  2. Click Create.
  3. Enter your bucket information. Name your bucket. In this example, use customer-data.
  4. Click Create, then click Confirm in the confirmation dialog.

In cost management:

  1. On the Create cloud storage bucket page, enter your Cloud storage bucket name.
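
If you prefer to create the bucket programmatically, the following is a minimal sketch that uses the google-cloud-storage Python client. The project ID and location are placeholders.

    # Minimal sketch: create the customer-data bucket with the Cloud Storage client.
    # Assumes the google-cloud-storage package and credentials that can create
    # buckets in the project. The project ID and location are placeholders; bucket
    # names are globally unique, so adjust the name as needed.
    from google.cloud import storage

    PROJECT_ID = "customer-billing-project"  # placeholder
    client = storage.Client(project=PROJECT_ID)

    bucket = client.create_bucket("customer-data", location="US")
    print(f"Created bucket {bucket.name}")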

Additional resources

  • For additional information about creating buckets, see the Google Cloud documentation on Creating buckets.

2.4. Creating a Google Cloud Identity and Access Management role

A custom Identity and Access Management (IAM) role for cost management gives access to specific cost-related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.

Prerequisites

  • Access to Google Cloud Console with these permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project

Procedure

  1. In the Google Cloud Console, click IAM & Admin > Roles.
  2. Select the project you created from the menu.
  3. Click + Create role.
  4. Enter a Title, Description and ID for the role. In this example, use customer-data-role.
  5. Click + ADD PERMISSIONS.
  6. Use the Enter property name or value field to search and select the following permissions for your custom role:

    • storage.objects.get
    • storage.objects.list
    • storage.buckets.get
  7. Click ADD.
  8. Click CREATE.
  9. In the Add a cloud integration wizard, on the Create IAM role page, click Next.

2.5. Adding a billing service account member to your Google Cloud project

You must create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console in your project.

Prerequisites

  • You must have access to Google Cloud Console and have the following permissions:

    • resourcemanager.projects.get
    • resourcemanager.projects.getIamPolicy
    • resourcemanager.projects.setIamPolicy
  • Google Cloud project
  • A cost management Identity and Access Management (IAM) role

In the Google Cloud Console:

  1. Click IAM & Admin > IAM.
  2. Select the project you created from the menu.
  3. Click Grant Access.
  4. Paste the following principal into the New principals field:

    billing-export@red-hat-cost-management.iam.gserviceaccount.com
  5. In the Assign roles section, assign the IAM role you created in Creating a Google Cloud Identity and Access Management role. In this example, use customer-data-role.
  6. Click SAVE.

In cost management:

  1. On the Assign access page, click Next.

Verification steps

  1. Navigate to IAM & Admin > IAM.
  2. Verify the new member is present with the correct role.

2.6. Creating a Google Cloud BigQuery dataset

Create a BigQuery dataset to collect and store the billing data for cost management.

Prerequisites

  • Access to Google Cloud Console with bigquery.datasets.create permission
  • Google Cloud project

Procedure

  1. In Google Cloud Console, click BigQuery.
  2. In the Explorer panel, select the project you created.
  3. Click the action icon for your project name.
  4. Click CREATE DATASET.
  5. Enter a name for your dataset in the Dataset ID field. In this example, use CustomerFilteredData.
  6. Click CREATE DATASET.
  7. In the Add a cloud integration wizard, on the Create dataset page, enter the name of the dataset you created.
  8. Click Next.

2.7. Exporting Google Cloud billing data to BigQuery

Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the BigQuery dataset you created in the last step.

Prerequisites

  • Google Cloud project
  • A BigQuery dataset

Procedure

  1. In the Google Cloud Console, click Billing > Billing export.
  2. Click the Billing export tab.
  3. Click EDIT SETTINGS in the Detailed usage cost section.
  4. From the dropdown menus, select the cost management Project and the Billing export dataset that you created.
  5. Click SAVE.
  6. In the Add a cloud integration wizard, on the Billing export page, click Next.
  7. On the Review details page, review the information about your integration and click Add.
  8. Copy your source_uuid so that you can use it in the cloud function.

Verification steps

  1. Verify that the Detailed usage cost section shows a checkmark and Enabled, with the correct Project name and Dataset name.

2.8. Creating a function to post filtered data to your storage bucket

Create a function that filters your data and adds it to the storage bucket that you created to share with Red Hat. You can use the example Python script to gather the cost data related to your Red Hat expenses from your cost exports and add it to the bucket. This script filters the cost data you created with BigQuery, removes non-Red Hat information, creates .csv files, stores them in the bucket you created, and sends the data to Red Hat.

Prerequisites

  • A Google Cloud storage bucket for filtered reports
  • A BigQuery dataset with a billing export enabled
  • The source_uuid that you copied from the Review details page

In the Google Cloud Console:

  1. Click Security > Secret Manager to set up a secret to authenticate your function with Red Hat without storing your credentials in your function. Enable the Secret Manager if it is not already enabled.
  2. From Secret Manager, click Create secret.

    1. Name your secret, add your service account Client ID, and click Create Secret.
    2. Repeat this process to save a secret for your service account Client secret.
  3. In the Google Cloud Console search bar, search for functions and select the Cloud Functions result.
  4. On the Cloud Functions page, click Create function.
  5. Name the function. In this example, use customer-data-function.
  6. In the Trigger section, select HTTPS as the trigger type.
  7. In Runtime, build, connections and security settings, click the Security and image repo tab.

    1. Click Add a secret reference.
    2. Select the client_id secret you created before.
    3. Set the reference method to Exposed as environment variable.
    4. Name the exposed environment variable client_id.
    5. Click Done.
  8. Repeat the previous steps for your client_secret.
  9. Click Next.
  10. On the Cloud Functions Code page, set the runtime to the latest Python version available.
  11. Open the requirements.txt file. Paste the following lines at the end of the file.

    requests
    google-cloud-bigquery
    google-cloud-storage
  12. Set the Entry Point to get_filtered_data.
  13. Open the main.py file.

    1. Paste the following Python script. Change the values in the section marked # Required vars to update to the values for your environment. For an illustrative sketch of what such a script can look like, see the example after this procedure. Update the values for the following lines:

      INTEGRATION_ID: Cost management integration_id
      BUCKET: Filtered data GCP bucket
      PROJECT_ID: Your project ID
      DATASET: Your dataset name
      TABLE_ID: Your table ID
  14. Click Deploy.
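
The wizard supplies the exact script to paste into main.py, and that script is not reproduced here. The following is only an illustrative sketch of what such a filtering function can look like: the Red Hat-related filter condition, the Red Hat SSO token exchange, and the upload endpoint placeholder are assumptions, not the wizard-provided values.

    # Illustrative sketch only -- NOT the script that the Add a cloud integration
    # wizard provides. It shows the general shape: query the billing export,
    # keep Red Hat-related rows, write a .csv report to the bucket, and send the
    # filtered data to cost management. The filter condition and upload endpoint
    # are assumptions; use the wizard-provided script for the real values.
    import csv
    import io
    import os
    import uuid
    from datetime import datetime

    import requests
    from google.cloud import bigquery, storage

    # Required vars to update (placeholders)
    INTEGRATION_ID = "your-integration-uuid"      # Cost management integration_id
    BUCKET = "customer-data"                      # Filtered data GCP bucket
    PROJECT_ID = "customer-billing-project"       # Your project ID
    DATASET = "CustomerFilteredData"              # Your dataset name
    TABLE_ID = "gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX"  # Your table ID

    # Secret Manager references exposed as environment variables in earlier steps.
    CLIENT_ID = os.environ.get("client_id", "")
    CLIENT_SECRET = os.environ.get("client_secret", "")

    # Placeholder: the wizard-provided script defines the real upload endpoint and payload.
    UPLOAD_URL = "https://console.redhat.com/api/cost-management/..."


    def get_filtered_data(request):
        """HTTP entry point: filter the current month's billing data and ship it."""
        now = datetime.utcnow()
        year, month = now.strftime("%Y"), now.strftime("%m")

        # Assumed filter: keep only rows whose SKU description mentions Red Hat.
        query = f"""
            SELECT billing_account_id, service, sku, usage_start_time, usage_end_time,
                   project, labels, cost, currency, usage, invoice
            FROM `{PROJECT_ID}.{DATASET}.{TABLE_ID}`
            WHERE invoice.month = '{year}{month}'
              AND LOWER(sku.description) LIKE '%red hat%'
        """
        rows = bigquery.Client(project=PROJECT_ID).query(query).result()

        # Write the filtered rows to an in-memory CSV report.
        buffer = io.StringIO()
        writer = None
        for row in rows:
            record = dict(row)
            if writer is None:
                writer = csv.DictWriter(buffer, fieldnames=list(record))
                writer.writeheader()
            writer.writerow(record)
        if writer is None:
            return "no Red Hat cost data found for this period"

        # Store the report in the bucket shared with Red Hat.
        report_name = f"{year}-{month}-{uuid.uuid4()}.csv"
        storage.Client(project=PROJECT_ID).bucket(BUCKET).blob(report_name).upload_from_string(
            buffer.getvalue(), content_type="text/csv"
        )

        # Authenticate with Red Hat SSO using the service account credentials,
        # then tell cost management that a new report is available (assumed flow).
        token = requests.post(
            "https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token",
            data={"grant_type": "client_credentials",
                  "client_id": CLIENT_ID, "client_secret": CLIENT_SECRET},
            timeout=30,
        ).json()["access_token"]
        requests.post(
            UPLOAD_URL,
            headers={"Authorization": f"Bearer {token}"},
            json={"source": INTEGRATION_ID, "reports_list": [report_name],
                  "bill_year": year, "bill_month": month},
            timeout=30,
        )
        return "complete"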

2.9. Triggering your function to post filtered data to your storage bucket

Create a scheduler job to run the function you created to send filtered data to Red Hat on a schedule.

Procedure

  1. Copy the Trigger URL for the function you created to post the cost reports. You will need to add it to the Google Cloud Scheduler.

    1. In the Google Cloud Console, search for functions and select the Cloud Functions result.
    2. On the Cloud Functions page, select your function, and click the Trigger tab.
    3. In the HTTP section, click Copy to clipboard.
  2. Create the scheduler job. In the Google Cloud Console, search for cloud scheduler and select the Cloud Scheduler result.
  3. Click Create job.

    1. Name your scheduler job. In this example, use CustomerFilteredDataSchedule.
    2. In the Frequency field, set the cron expression for when you want the function to run. In this example, use 0 9 * * * to run the function daily at 9 AM.
    3. Set the time zone and click Continue.
  4. Configure the execution on the next page.

    1. In the Target type field, select HTTP.
    2. In the URL field, paste the Trigger URL you copied.
    3. In the Body field, paste the following request body, which is passed to the function to trigger it.

      {"name": "Scheduler"}
    4. In the Auth header field, select Add OIDC token.
    5. Click the Service account field and click Create to create a service account and role for the scheduler job.
  5. In the Service account details step, name your service account. In this example, use scheduler-service-account. Accept the default Service account ID and click Create and Continue.

    1. In the Grant this service account access to project field, search for and select Cloud Scheduler Job Runner as the first role.
    2. Click ADD ANOTHER ROLE, then search for and select Cloud Functions Invoker.
    3. Click Continue.
    4. Click Done to finish creating the service account.
  6. Go back to the Cloud scheduler tab.
  7. On the Configure the execution page, click the Service account field.
  8. Refresh the page and select the service account that you created.
  9. Click Continue and then click Create.

After completing these steps, you have successfully set up your Google Cloud function to send reports to Red Hat. For next steps, refer to Chapter 3, Next steps for managing your costs.

2.10. Creating additional cloud functions to collect finalized data

At the beginning of the month, Google Cloud finalizes the bill for the previous month. Create an additional function, and a scheduled job that triggers it, to send these finalized reports to Red Hat so that cost management can process them.

Procedure

  1. Set up a function to post reports:

    1. From Cloud Functions, select Create function.
    2. Name your function.
    3. Select HTTP trigger.
  2. In Runtime, build, connections, security settings, click Security.

    1. Click Reference secret.
    2. Select Exposed as environment variable.
    3. Select Secret version or Latest.
    4. Click Done.
    5. Repeat the process for your other secrets.
  3. Click Save.
  4. Copy your Trigger URL. Click Next.
  5. Select the latest Python runtime.
  6. Set Entry point to get_filtered_data.
  7. Add your Google Cloud function code, the same script that you used in your first function. Update the values for INTEGRATION_ID, BUCKET, PROJECT_ID, DATASET, and TABLE_ID.
  8. Remove the comments from the following lines so that the function reports against the previous, finalized month (see the date-range illustration after this procedure):

     # month_end = now.replace(day=1) - timedelta(days=1)
     # delta = now.replace(day=1) - timedelta(days=query_range)
     # year = month_end.strftime("%Y")
     # month = month_end.strftime("%m")
     # day = month_end.strftime("%d")
  9. Open the requirements.txt file and add the same requirements that you added for your first function (requests, google-cloud-bigquery, google-cloud-storage).
  10. Click Deploy.
  11. Set up a cloud scheduler to trigger your function:

    1. Go to Cloud Scheduler.
    2. Click Schedule a job.
    3. Name your scheduler job.
    4. Set the frequency. For example, the cron expression 0 9 4 * * runs the job at 9 AM on the fourth day of every month.
    5. Set a Time zone.
    6. Click Continue.
    7. Paste the function Trigger URL you copied earlier.
    8. In the request body, add {"name": "Scheduler"}.
    9. Set the auth header to OIDC token.
    10. Select or create a service account with the Cloud Scheduler Job Runner and Cloud Functions Invoker roles.
    11. Click Continue.
    12. Click Save.
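
The uncommented lines shift the query window from the current month to the previous, finalized month. The following short illustration shows that date math; query_range is defined elsewhere in the wizard-provided script, so the value used here is only a placeholder.

    # Illustration of the previous-month date math from the uncommented lines.
    # query_range is defined elsewhere in the wizard-provided script; the value
    # below is a placeholder so the example runs on its own.
    from datetime import datetime, timedelta

    now = datetime.utcnow()
    query_range = 5  # placeholder

    month_end = now.replace(day=1) - timedelta(days=1)        # last day of the previous month
    delta = now.replace(day=1) - timedelta(days=query_range)  # start of the query window
    year = month_end.strftime("%Y")
    month = month_end.strftime("%m")
    day = month_end.strftime("%d")

    # For example, on 2024-03-04 this yields year=2024, month=02, day=29,
    # so the function reports against the finalized February bill.
    print(year, month, day, delta.date())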

Chapter 3. Next steps for managing your costs

After adding your OpenShift Container Platform and Google Cloud integration, on the cost management Overview page, your cost data is sorted into OpenShift and Infrastructure tabs. Select Perspective to toggle through different views of your cost data.

You can also use the global navigation menu to view additional details about your costs by cloud provider.

3.1. Limiting access to cost management resources

After you add and configure integrations in cost management, you can limit access to cost data and resources.

You might not want users to have access to all of your cost data. Instead, you can grant users access only to data that is specific to their projects or organizations. With role-based access control, you can limit the visibility of resources in cost management reports. For example, you can restrict a user’s view to only AWS integrations, rather than the entire environment.

To learn how to limit access, see the more in-depth guide Limiting access to cost management resources.

3.2. Configuring tagging for your integrations

The cost management application tracks cloud and infrastructure costs with tags. Tags are also known as labels in OpenShift.

You can refine tags in cost management to filter and attribute resources, organize your resources by cost, and allocate costs to different parts of your cloud infrastructure.

Important

You can only configure tags and labels directly on an integration. You can choose the tags that you activate in cost management; however, you cannot edit tags and labels in the cost management application.

To learn more about the following topics, see Managing cost data using tagging:

  • Planning your tagging strategy to organize your view of cost data
  • Understanding how cost management associates tags
  • Configuring tags and labels on your integrations

3.3. Configuring cost models to accurately report costs

Now that you configured your integrations to collect cost and usage data in cost management, you can configure cost models to associate prices to metrics and usage.

A cost model is a framework that uses raw costs and metrics to define calculations for the costs in cost management. You can record, categorize, and distribute the costs that the cost model generates to specific customers, business units, or projects.

In Cost Models, you can complete the following tasks:

  • Classifying your costs as infrastructure or supplementary costs
  • Capturing monthly costs for OpenShift nodes and clusters
  • Applying a markup to account for additional support costs

To learn how to configure a cost model, see Using cost models.

3.4. Visualizing your costs with Cost Explorer

Use cost management Cost Explorer to create custom graphs of time-scaled cost and usage information and ultimately better visualize and interpret your costs.

To learn more about the following topics, see Visualizing your costs using Cost Explorer:

  • Using Cost Explorer to identify abnormal events
  • Understanding how your cost data changes over time
  • Creating custom bar charts of your cost and usage data
  • Exporting custom cost data tables

Providing feedback on Red Hat documentation

We appreciate and prioritize your feedback regarding our documentation. Provide as much detail as possible, so that your request can be quickly addressed.

Prerequisites

  • You are logged in to the Red Hat Customer Portal.

Procedure

To provide feedback, perform the following steps:

  1. Click the following link: Create Issue.
  2. Describe the issue or enhancement in the Summary text box.
  3. Provide details about the issue or requested enhancement in the Description text box.
  4. Type your name in the Reporter text box.
  5. Click the Create button.

This action creates a documentation ticket and routes it to the appropriate documentation team. Thank you for taking the time to provide feedback.

Legal Notice

Copyright © 2024 Red Hat, Inc.
The text of and illustrations in this document are licensed by Red Hat under a Creative Commons Attribution–Share Alike 3.0 Unported license ("CC-BY-SA"). An explanation of CC-BY-SA is available at http://creativecommons.org/licenses/by-sa/3.0/. In accordance with CC-BY-SA, if you distribute this document or an adaptation of it, you must provide the URL for the original version.
Red Hat, as the licensor of this document, waives the right to enforce, and agrees not to assert, Section 4d of CC-BY-SA to the fullest extent permitted by applicable law.
Red Hat, Red Hat Enterprise Linux, the Shadowman logo, the Red Hat logo, JBoss, OpenShift, Fedora, the Infinity logo, and RHCE are trademarks of Red Hat, Inc., registered in the United States and other countries.
Linux® is the registered trademark of Linus Torvalds in the United States and other countries.
Java® is a registered trademark of Oracle and/or its affiliates.
XFS® is a trademark of Silicon Graphics International Corp. or its subsidiaries in the United States and/or other countries.
MySQL® is a registered trademark of MySQL AB in the United States, the European Union and other countries.
Node.js® is an official trademark of Joyent. Red Hat is not formally related to or endorsed by the official Joyent Node.js open source or commercial project.
The OpenStack® Word Mark and OpenStack logo are either registered trademarks/service marks or trademarks/service marks of the OpenStack Foundation, in the United States and other countries and are used with the OpenStack Foundation's permission. We are not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.
All other trademarks are the property of their respective owners.