Integrating Google Cloud data into cost management
Learn how to add and configure your Google Cloud integration
Abstract
Part I. Choosing a basic or advanced Google Cloud integration
To create a Google Cloud integration, first decide whether you want to take the basic or the advanced integration path.
Basic
For the basic option, go to Creating a Google Cloud integration: Basic.
The basic path enables cost management to directly read your billing reports from GCP at a scope that you indicate.
Advanced
For the advanced option, go to Creating a Google Cloud integration: Advanced.
The advanced path enables you to customize or filter your data before cost management reads it. You might also use the advanced path if you want to share billing data only with certain Red Hat products. The advanced path requires more complex setup and configuration.
You must select either basic or advanced; you cannot choose both.
Chapter 1. Creating a Google Cloud integration: Basic
You must create a Google Cloud integration for cost management from the Integrations page and configure your Google Cloud account to allow cost management access.
If you want to create a GCP integration by using the advanced path, do not complete the following steps. Instead, go to Creating a Google Cloud integration: Advanced.
You must have a Red Hat account user with Cloud Administrator permissions before you can add integrations to cost management.
To create a Google Cloud integration, you will complete the following tasks:
- Create a Google Cloud project for your cost management data.
- Create a bucket for filtered reports.
- Create a billing service account member with the correct role to export your data to cost management.
- Create a BigQuery dataset to contain the cost data.
- Create a billing export that sends the cost management data to your BigQuery dataset.
Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.
1.1. Adding your Google Cloud account as an integration
You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the cost management application processes the cost and usage data from your Google Cloud account and makes it viewable.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
- Enter a name for your integration. Click Next.
- In the Select application step, select Cost management and click Next.
1.2. Creating a Google Cloud project
Create a Google Cloud project to gather and send your cost reports to Red Hat.
Prerequisites
- Access to Google Cloud Console with the resourcemanager.projects.create permission
Procedure
- In the Google Cloud Console, click IAM & Admin → Create a Project.
- Enter a Project name in the new page that appears and select your billing account.
- Select the Organization.
- Enter the parent organization in the Location box.
- Click Create.
In cost management:
- On the Project page, enter your Project ID.
- Select I am OK with sending the default data set to cost management.
- Click Next.
Additional resources
- For additional information about creating projects, see the Google Cloud documentation Creating and managing projects.
1.3. Creating a Google Cloud Identity and Access Management role
A custom Identity and Access Management (IAM) role for cost management gives access to specific cost related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.
Prerequisites
- Access to Google Cloud Console with these permissions:
  - resourcemanager.projects.get
  - resourcemanager.projects.getIamPolicy
  - resourcemanager.projects.setIamPolicy
- Google Cloud project
Procedure
- In the Google Cloud Console, click IAM & Admin → Roles.
- Select the project you created from the dropdown menu.
- Click Create role.
- Enter a Title, Description, and ID for the role. In this example, use customer-data-role.
- Click Add permissions.
- Use the Enter property name or value field to search for and select the following permissions for your custom role:
  - bigquery.jobs.create
  - bigquery.tables.getData
  - bigquery.tables.get
  - bigquery.tables.list
- Click Add.
- Click Create.
- In the Add a cloud integration wizard, on the Create IAM role page, click Next.
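If you manage Google Cloud from the command line, the same custom role can be created with the gcloud CLI. This is a sketch, not part of the wizard: the project ID my-cost-project is a placeholder, and the role ID uses underscores because gcloud role IDs cannot contain hyphens.

```shell
# Create the custom cost management role with the four BigQuery permissions.
gcloud iam roles create customer_data_role \
  --project=my-cost-project \
  --title="customer-data-role" \
  --description="Read access for cost management billing exports" \
  --permissions=bigquery.jobs.create,bigquery.tables.getData,bigquery.tables.get,bigquery.tables.list
```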
Additional resources
- For additional information about roles and their usage, see the Google Cloud documentation Understanding roles and Creating and managing custom roles.
1.4. Adding a billing service account member to your Google Cloud project
In your project, you must create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console.
Procedure
In the Google Cloud Console:
- Click IAM & Admin → IAM.
- Select the project you created from the dropdown menu.
- Click Grant access.
Paste the following principal into the New principals field:
billing-export@red-hat-cost-management.iam.gserviceaccount.com
- In the Assign roles section, assign the IAM role you created in Creating a Google Cloud Identity and Access Management role. In this example, use customer-data-role.
- Click Save.
In cost management:
- On the Assign access page, click Next.
Verification steps
- Navigate to IAM & Admin → IAM.
- Verify the new member is present with the correct role.
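Equivalently, the Red Hat billing service account can be granted the custom role from the gcloud CLI. This is a sketch; my-cost-project and customer_data_role are the placeholder values from the earlier role example.

```shell
# Bind the Red Hat export service account to the custom role.
gcloud projects add-iam-policy-binding my-cost-project \
  --member="serviceAccount:billing-export@red-hat-cost-management.iam.gserviceaccount.com" \
  --role="projects/my-cost-project/roles/customer_data_role"
```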
Additional resources
- For additional information about roles and their usage, see the Google Cloud documentation Understanding roles and Creating and managing custom roles.
1.5. Creating a Google Cloud BigQuery dataset
Create a BigQuery dataset to collect and store the billing data for cost management.
Prerequisites
- Access to Google Cloud Console with the bigquery.datasets.create permission
- Google Cloud project
Procedure
- In the Google Cloud Console, click BigQuery.
- In the Explorer panel, select the project you created.
- Click the action icon (⋮) next to your project name.
- Click Create dataset.
- Enter a name for your dataset in the Dataset ID field. In this example, use CustomerData.
- Click Create dataset.
- In the Add a cloud integration wizard, on the Create dataset page, enter the name of the dataset you created.
- Click Next.
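As a scripted alternative to the console steps above, the bq CLI can create the same dataset; my-cost-project is a placeholder project ID.

```shell
# Create the CustomerData dataset in the cost management project.
bq mk --dataset my-cost-project:CustomerData
```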
1.6. Exporting Google Cloud billing data to BigQuery
Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the BigQuery dataset you created in the last step.
Prerequisites
- Access to Google Cloud Console with the Billing Account Administrator role
- Google Cloud project
- Billing service member with the cost management Identity and Access Management (IAM) role
- BigQuery dataset
Procedure
- In the Google Cloud Console, click Billing.
- Click the Billing export tab.
- In the Detailed usage cost section, click Edit settings.
- Select the cost management Project and Billing export dataset you created in the dropdown menus.
- Click Save.
- In the Add a cloud integration wizard, on the Billing export page, click Next.
- On the Review details page, review the information about your integration and click Add.
Verification steps
- Verify that the Detailed usage cost section shows Enabled with a checkmark, and that the Project name and Dataset name are correct.
1.6.1. Viewing billing tables in BigQuery
You may want to review the metrics collected and sent to cost management. Reviewing these tables can also help you troubleshoot incorrect or missing data in cost management.
Google may take several hours to export billing data to your BigQuery dataset.
Prerequisites
- Access to Google Cloud Console with the bigquery.dataViewer role
Procedure
- In the Google Cloud Console, navigate to BigQuery.
- Select the cost management project in the Explorer panel.
- Click the gcp_billing_export_v1_xxxxxx_xxxxxx_xxxxxx table under the cost management dataset.
- Click the Preview tab to view the metrics.
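To spot-check what cost management will read, you can also query the export table directly with the bq CLI. The following invocation is a sketch; replace the project, dataset, and table suffix with your own values. The columns used (service.description, cost, usage_start_time) are part of the detailed billing export schema.

```shell
# Summarize the last 7 days of exported costs by service.
bq query --use_legacy_sql=false '
SELECT service.description AS service, ROUND(SUM(cost), 2) AS total_cost
FROM `my-cost-project.CustomerData.gcp_billing_export_v1_XXXXXX_XXXXXX_XXXXXX`
WHERE usage_start_time >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
GROUP BY service
ORDER BY total_cost DESC'
```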
Chapter 2. Creating a Google Cloud integration: Advanced
Create a Google Cloud function script that can filter your billing data, store it in object storage, and send the filtered reports to cost management.
If you created a Google Cloud integration by using the basic path, do not complete the following steps. Your Google Cloud integration is already complete.
You must have a Red Hat account user with Cloud Administrator permissions before you can add integrations to cost management.
To create a Google Cloud integration, you will complete the following tasks:
- Create a Google Cloud project for your cost management data.
- Create a bucket for filtered reports.
- Create a billing service account member with the correct role to export your data to cost management.
- Create a BigQuery dataset that contains the cost data.
- Create a billing export that sends the cost management data to your BigQuery dataset.
Google Cloud is a third-party product and its console and documentation can change. The instructions for configuring the third-party integrations are correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.
2.1. Adding your Google Cloud account as an integration
You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the cost management application processes the cost and usage data from your Google Cloud account and makes it viewable.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
- Enter a name for your integration. Click Next.
- In the Select application step, select Cost management and click Next.
2.2. Creating a Google Cloud project
Create a Google Cloud project to gather and send your cost reports to Red Hat.
Prerequisites
- Access to Google Cloud Console with the resourcemanager.projects.create permission
Procedure
- In the Google Cloud Console, click IAM & Admin → Create a Project.
- Enter a Project name in the new page that appears and select your billing account.
- Select the Organization.
- Enter the parent organization in the Location box.
- Click Create.
In cost management:
- On the Project page, enter your Project ID.
- To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to cost management.
- Click Next.
Additional resources
- For additional information about creating projects, see the Google Cloud documentation Creating and managing projects.
2.3. Creating a Google Cloud bucket
Create a bucket for filtered reports that you will create later. Buckets are containers that store data.
Procedure
In the Google Cloud Console:
- Go to Cloud Storage → Buckets.
- Click Create.
- Enter your bucket information. Name your bucket. In this example, use customer-data.
- Click Create, then click Confirm in the confirmation dialog.
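The bucket can also be created from the CLI. This is a sketch: customer-data matches the example name above (bucket names are globally unique, so yours will differ) and the location is an assumption.

```shell
# Create the filtered-reports bucket in a chosen region.
gsutil mb -l us-east1 gs://customer-data/
```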
In cost management:
- On the Create cloud storage bucket page, enter your Cloud storage bucket name.
Additional resources
- For additional information about creating buckets, see the Google Cloud documentation on Creating buckets.
2.4. Creating a Google Cloud Identity and Access Management role
A custom Identity and Access Management (IAM) role for cost management gives access to specific cost related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.
Prerequisites
- Access to Google Cloud Console with these permissions:
  - resourcemanager.projects.get
  - resourcemanager.projects.getIamPolicy
  - resourcemanager.projects.setIamPolicy
- Google Cloud project
Procedure
- In the Google Cloud Console, click IAM & Admin → Roles.
- Select the project you created from the dropdown menu.
- Click Create role.
- Enter a Title, Description, and ID for the role. In this example, use customer-data-role.
- Click Add permissions.
- Use the Enter property name or value field to search for and select the following permissions for your custom role:
  - storage.objects.get
  - storage.objects.list
  - storage.buckets.get
- Click Add.
- Click Create.
- In the Add a cloud integration wizard, on the Create IAM role page, click Next.
Additional resources
- For additional information about roles and their usage, see the Google Cloud documentation Understanding roles and Creating and managing custom roles.
2.5. Adding a billing service account member to your Google Cloud project
In your project, you must create a billing service account member that can export cost reports to Red Hat Hybrid Cloud Console.
Procedure
In the Google Cloud Console:
- Click IAM & Admin → IAM.
- Select the project you created from the dropdown menu.
- Click Grant access.
Paste the following principal into the New principals field:
billing-export@red-hat-cost-management.iam.gserviceaccount.com
- In the Assign roles section, assign the IAM role you created in Creating a Google Cloud Identity and Access Management role. In this example, use customer-data-role.
- Click Save.
In cost management:
- On the Assign access page, click Next.
Verification steps
- Navigate to IAM & Admin → IAM.
- Verify the new member is present with the correct role.
Additional resources
- For additional information about roles and their usage, see the Google Cloud documentation Understanding roles and Creating and managing custom roles.
2.6. Creating a Google Cloud BigQuery dataset
Create a BigQuery dataset to collect and store the billing data for cost management.
Prerequisites
- Access to Google Cloud Console with the bigquery.datasets.create permission
- Google Cloud project
Procedure
- In the Google Cloud Console, click BigQuery.
- In the Explorer panel, select the project you created.
- Click the action icon (⋮) next to your project name.
- Click Create dataset.
- Enter a name for your dataset in the Dataset ID field. In this example, use CustomerFilteredData.
- Click Create dataset.
- In the Add a cloud integration wizard, on the Create dataset page, enter the name of the dataset you created.
- Click Next.
2.7. Exporting Google Cloud billing data to BigQuery
Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the BigQuery dataset you created in the last step.
Prerequisites
- Access to Google Cloud Console with the Billing Account Administrator role
- Google Cloud project
- Billing service member with the cost management Identity and Access Management (IAM) role
- BigQuery dataset
Procedure
- In the Google Cloud Console, click Billing.
- Click the Billing export tab.
- In the Detailed usage cost section, click Edit settings.
- Select the cost management Project and Billing export dataset you created in the dropdown menus.
- Click Save.
- In the Add a cloud integration wizard, on the Billing export page, click Next.
- On the Review details page, review the information about your integration and click Add.
- Copy your source_uuid so that you can use it in the cloud function.
Verification steps
- Verify that the Detailed usage cost section shows Enabled with a checkmark, and that the Project name and Dataset name are correct.
2.8. Creating a function to post filtered data to your storage bucket
Create a function that filters your data and adds it to the storage account that you created to share with Red Hat. You can use the example Python script to gather the cost data from your cost exports related to your Red Hat expenses and add it to the storage account. This script filters the cost data you created with BigQuery, removes non-Red Hat information, creates .csv files, stores them in the bucket you created, and sends the data to Red Hat.
Prerequisites
- You must have a Red Hat Hybrid Cloud Console service account.
- You must have enabled the API service in GCP.
In the Google Cloud Console:
- Click Security → Secret Manager to set up a secret to authenticate your function with Red Hat without storing your credentials in your function. Enable the Secret Manager API if it is not already enabled.
- From Secret Manager, click Create secret.
- Name your secret, add your service account Client ID, and click Create secret.
- Repeat this process to save a secret for your service account Client secret.
- In the Google Cloud Console search bar, search for functions and select the Cloud Functions result.
- On the Cloud Functions page, click Create function.
- Name the function. In this example, use customer-data-function.
- In the Trigger section, select HTTPS as the trigger type.
- In Runtime, build, connections and security settings, click the Security and image repo tab.
- Click Reference a secret.
- Select the client_id secret you created before.
- Set the reference method to Exposed as environment variable.
- Name the exposed environment variable client_id.
- Click Done.
- Repeat the previous steps for your client_secret.
- Click Next.
- On the Cloud Functions Code page, set the runtime to the latest Python version available.
- Open the requirements.txt file. Paste the following lines at the end of the file:

requests
google-cloud-bigquery
google-cloud-storage
- Set the Entry Point to get_filtered_data.
- Open the main.py file and paste the example Python script. Change the values in the section marked # Required vars to update to the values for your environment. Update the values for the following lines:
  - INTEGRATION_ID - Cost management integration_id
  - BUCKET - Filtered data GCP Bucket
  - PROJECT_ID - Your project ID
  - DATASET - Your dataset name
  - TABLE_ID - Your table ID
- Click Deploy.
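The core filtering step of such a script can be sketched in plain Python as follows. This is an illustration, not the exact script from the wizard: the row fields, the Red Hat match rule, and the function names are assumptions, and the BigQuery query, bucket upload, and POST to Red Hat are reduced to comments.

```python
import csv
import io

# Columns kept in the filtered report; an assumed subset of the
# detailed billing export schema.
FIELDS = ["billing_account_id", "service", "sku", "usage_start_time", "cost"]

def filter_redhat_rows(rows):
    """Keep only rows whose SKU or service mentions Red Hat (assumed rule)."""
    return [
        {field: row[field] for field in FIELDS}
        for row in rows
        if "red hat" in row["sku"].lower() or "red hat" in row["service"].lower()
    ]

def rows_to_csv(rows):
    """Serialize the filtered rows to CSV, as the function would do
    before uploading the file to the customer-data bucket."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

def get_filtered_data(rows):
    # In the deployed function, `rows` would come from a BigQuery query
    # against DATASET.TABLE_ID, and the resulting CSV would be written to
    # BUCKET and reported to console.redhat.com using the stored secrets.
    return rows_to_csv(filter_redhat_rows(rows))
```

Rows containing Red Hat SKUs survive the filter; everything else is dropped before any report leaves your project.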
2.9. Trigger your function to post filtered data to your storage bucket
Create a scheduler job to run the function you created to send filtered data to Red Hat on a schedule.
Procedure
Copy the Trigger URL for the function you created to post the cost reports. You will need to add it to the Google Cloud Scheduler.
- In the Google Cloud Console, search for functions and select the Cloud Functions result.
- On the Cloud Functions page, select your function, and click the Trigger tab.
- In the HTTP section, click Copy to clipboard.
- Create the scheduler job. In the Google Cloud Console, search for cloud scheduler and select the Cloud Scheduler result.
- Click Create job.
- Name your scheduler job. In this example, use CustomerFilteredDataSchedule.
- In the Frequency field, set the cron expression for when you want the function to run. In this example, use 0 9 * * * to run the function daily at 9 AM.
- Set the time zone and click Continue.
- Configure the execution on the next page.
- In the Target type field, select HTTP.
- In the URL field, paste the Trigger URL you copied.
- In the Body field, paste the following code, which is passed into the function to trigger it:
{"name": "Scheduler"}
- In the Auth header field, select Add OIDC token.
- Click the Service account field, then click the option to create a new service account and role for the scheduler job.
- In the Service account details step, name your service account. In this example, use scheduler-service-account. Accept the default Service account ID and click Create and continue.
- In the Grant this service account access to project field, search for and select Cloud Scheduler Job Runner as the first role.
- Click Add another role, then search for and select Cloud Functions Invoker.
- Click Continue.
- Click Done to finish creating the service account.
- Go back to the Cloud scheduler tab.
- In the Configure the execution page, select the Service account field.
- Refresh the page and select the scheduler you just created.
- Click Continue and then click Create.
After completing these steps, you have successfully set up your Google Cloud function to send reports to Red Hat. For next steps, refer to Chapter 3, Next steps for managing your costs.
2.10. Creating additional cloud functions to collect finalized data
At the beginning of the month, Google Cloud finalizes the bill for the month before. Create an additional function and scheduled job to trigger it to send these reports to Red Hat so cost management can process them.
Procedure
Set up a function to post reports:
- From Cloud Functions, select Create function.
- Name your function.
- Select HTTP trigger.
- In Runtime, build, connections, security settings, click the Security and image repo tab.
- Click Reference a secret.
- Select Exposed as environment variable.
- Select Secret version or Latest.
- Click Done.
- Repeat the process for your other secrets.
- Click Next.
- Copy your Trigger URL. Click Save.
- Select the latest Python runtime.
- Set Entry point to get_filtered_data.
. -
Add your Google Cloud function. Update the values for
INTEGRATION_ID
,BUCKET
,PROJECT_ID
,DATASET
, andTABLE_ID
. Remove the comments from the following lines:
# month_end = now.replace(day=1) - timedelta(days=1) # delta = now.replace(day=1) - timedelta(days=query_range) # year = month_end.strftime("%Y") # month = month_end.strftime("%m") # day = month_end.strftime("%d")
- Select the requirements.py file and add the requirements from the requirements.txt file.
- Click Deploy.
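Once uncommented, those lines shift the query window to the previous, finalized month. The date arithmetic can be checked standalone; query_range here is an assumed variable for how many days back the query window starts.

```python
from datetime import datetime, timedelta

def previous_month_parts(now, query_range=31):
    """Reproduce the uncommented lines: find the last day of the
    previous month and the start of the lookback window."""
    # Backing up one day from the 1st lands on the previous month's last day.
    month_end = now.replace(day=1) - timedelta(days=1)
    # Start of the query window, counting back from the 1st of this month.
    delta = now.replace(day=1) - timedelta(days=query_range)
    year = month_end.strftime("%Y")
    month = month_end.strftime("%m")
    day = month_end.strftime("%d")
    return year, month, day, delta

# Run on March 4, 2024, the finalized bill covers February, which ended
# on the 29th (leap year).
print(previous_month_parts(datetime(2024, 3, 4))[:3])  # → ('2024', '02', '29')
```

Scheduling the job on the fourth day of the month, as in the cron example below, leaves Google a few days to finalize the previous month's bill before the function queries it.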
Set up a cloud scheduler to trigger your function:
- Go to Cloud Scheduler.
- Click Create job.
- Name your schedule.
- Set the frequency. For example, the following cron expression runs the job on the fourth day of every month: 0 9 4 * *
- Set a Time zone.
- Click Continue.
- Paste the function Trigger URL you copied earlier.
- In the request body, add {"name": "Scheduler"}.
. - Set the auth header to OIDC token.
- Select or create a service account with the Cloud Scheduler Job Runner and Cloud Functions Invoker roles.
- Click Continue.
- Click Create.
Chapter 3. Next steps for managing your costs
After you add your OpenShift Container Platform and Google Cloud integrations, the cost management Overview page sorts your cost data into OpenShift and Infrastructure tabs. Select Perspective to toggle through different views of your cost data.
You can also use the global navigation menu to view additional details about your costs by cloud provider.
Additional resources
3.1. Limiting access to cost management resources
After you add and configure integrations in cost management, you can limit access to cost data and resources.
You might not want users to have access to all of your cost data. Instead, you can grant users access only to data that is specific to their projects or organizations. With role-based access control, you can limit the visibility of resources in cost management reports. For example, you can restrict a user’s view to only AWS integrations, rather than the entire environment.
To learn how to limit access, see the more in-depth guide Limiting access to cost management resources.
3.2. Configuring tagging for your integrations
The cost management application tracks cloud and infrastructure costs with tags. Tags are also known as labels in OpenShift.
You can refine tags in cost management to filter and attribute resources, organize your resources by cost, and allocate costs to different parts of your cloud infrastructure.
You can only configure tags and labels directly on an integration. You can choose the tags that you activate in cost management, however, you cannot edit tags and labels in the cost management application.
To learn more about the following topics, see Managing cost data using tagging:
- Planning your tagging strategy to organize your view of cost data
- Understanding how cost management associates tags
- Configuring tags and labels on your integrations
3.3. Configuring cost models to accurately report costs
Now that you configured your integrations to collect cost and usage data in cost management, you can configure cost models to associate prices to metrics and usage.
A cost model is a framework that uses raw costs and metrics to define calculations for the costs in cost management. You can record, categorize, and distribute the costs that the cost model generates to specific customers, business units, or projects.
In Cost Models, you can complete the following tasks:
- Classifying your costs as infrastructure or supplementary costs
- Capturing monthly costs for OpenShift nodes and clusters
- Applying a markup to account for additional support costs
To learn how to configure a cost model, see Using cost models.
3.4. Visualizing your costs with Cost Explorer
Use cost management Cost Explorer to create custom graphs of time-scaled cost and usage information and ultimately better visualize and interpret your costs.
To learn more about the following topics, see Visualizing your costs using Cost Explorer:
- Using Cost Explorer to identify abnormal events
- Understanding how your cost data changes over time
- Creating custom bar charts of your cost and usage data
- Exporting custom cost data tables
Providing feedback on Red Hat documentation
We appreciate and prioritize your feedback regarding our documentation. Provide as much detail as possible, so that your request can be quickly addressed.
Prerequisites
- You are logged in to the Red Hat Customer Portal.
Procedure
To provide feedback, perform the following steps:
- Click the following link: Create Issue.
- Describe the issue or enhancement in the Summary text box.
- Provide details about the issue or requested enhancement in the Description text box.
- Type your name in the Reporter text box.
- Click the Create button.
This action creates a documentation ticket and routes it to the appropriate documentation team. Thank you for taking the time to provide feedback.