Integrating Google Cloud data into hybrid committed spend
Learn how to add and configure your Google Cloud integrations
Abstract
Preface
To add a Google Cloud account to hybrid committed spend, you must add it as an integration from the Red Hat Hybrid Cloud Console user interface and configure Google Cloud to provide metrics. You can send your data automatically, or configure a function script that copies your cost exports to an object storage bucket that hybrid committed spend can access, filtering your data so that you share only a subset of your billing data with Red Hat.
Chapter 1. Creating a Google Cloud integration
To add a Google Cloud account to hybrid committed spend, you must configure your Google Cloud account to provide metrics, then add it as an integration from the Red Hat Hybrid Cloud Console user interface.
You must have a Red Hat account user with Cloud Administrator permissions before you can add integrations to hybrid committed spend.
To configure your Google Cloud account to be a hybrid committed spend integration, you must complete the following tasks:
- Create a Google Cloud project for your hybrid committed spend data.
- Create an Identity and Access Management (IAM) role that gives hybrid committed spend access to your cost data.
- Add a billing service account member with the correct role to export your data to hybrid committed spend.
- Create a BigQuery dataset to contain the cost data.
- Create a billing export that sends the hybrid committed spend data to your BigQuery dataset.
Because you will complete some of the following steps in the Google Cloud Console, and some steps in the hybrid committed spend user interface, keep both applications open in a web browser.
Because third-party products and documentation can change, instructions for configuring the third-party integrations provided are general and correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.
Add your Google Cloud integrations to hybrid committed spend from the Integrations page.
1.1. Adding your Google Cloud account as an integration
You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the hybrid committed spend application processes the cost and usage data from your Google Cloud account and makes it viewable.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
- Enter a name for your integration. Click Next.
- In the Select application step, select Hybrid committed spend and click Next.
1.2. Creating a Google Cloud project
Create a Google Cloud project to gather and send your cost reports to hybrid committed spend.
Prerequisites
- Access to Google Cloud Console with the resourcemanager.projects.create permission
Procedure
- In the Google Cloud Console, click IAM & Admin → Create a Project.
- Enter a Project name on the new page that appears and select your billing account.
- Select the Organization.
- Enter the parent organization in the Location box.
- Click Create.
- In the hybrid committed spend Add a cloud integration wizard, on the Project page, enter your Project ID.
- To send the default data to Red Hat automatically, select I am OK with sending the default data set to hybrid committed spend and click Next.
Verification steps
- Navigate to the Google Cloud Console Dashboard.
- Verify the project appears in the menu bar.
Additional resources
- For additional information about creating projects, see the Google Cloud documentation Creating and managing projects.
1.3. Creating a Google Cloud Identity and Access Management role
A custom Identity and Access Management (IAM) role for hybrid committed spend gives access to the specific cost-related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.
Prerequisites
- Access to Google Cloud Console with these permissions:
  - resourcemanager.projects.get
  - resourcemanager.projects.getIamPolicy
  - resourcemanager.projects.setIamPolicy
- Google Cloud project
Procedure
- In the Google Cloud Console, click IAM & Admin → Roles.
- Select the hybrid committed spend project from the dropdown in the menu bar.
- Click Create role.
- Enter a Title, Description, and ID for the role. In this example, use customer-data-role.
- Click ADD PERMISSIONS.
- Use the Enter property name or value field to search for and select these four permissions for your custom role:
  - bigquery.jobs.create
  - bigquery.tables.getData
  - bigquery.tables.get
  - bigquery.tables.list
- Click Add.
- Click Create.
- In the hybrid committed spend Add a cloud integration wizard, on the Create IAM role page, click Next.
Additional resources
- For additional information about roles and their usage, see the Google Cloud documentation Understanding roles and Creating and managing custom roles.
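If you prefer to script this step, the same custom role can be created through the IAM REST API. The sketch below only builds and prints the request body for the documented projects.roles.create method; the project ID, role ID, and access token are placeholders you must supply, and the actual POST is left as a comment.

```python
# Sketch: build the request body for creating the hybrid committed spend custom
# role through the IAM REST API (projects.roles.create).
import json

ROLE_PERMISSIONS = [
    "bigquery.jobs.create",
    "bigquery.tables.getData",
    "bigquery.tables.get",
    "bigquery.tables.list",
]

def build_role_request(role_id: str, title: str, description: str) -> dict:
    """Return the JSON body expected by projects.roles.create."""
    return {
        "roleId": role_id,
        "role": {
            "title": title,
            "description": description,
            "includedPermissions": ROLE_PERMISSIONS,
            "stage": "GA",
        },
    }

body = build_role_request(
    "customer_data_role",  # role IDs allow letters, digits, underscores, and periods
    "customer-data-role",
    "Custom role for hybrid committed spend exports",
)
print(json.dumps(body, indent=2))

# To apply it, POST this body to
#   https://iam.googleapis.com/v1/projects/<project_id>/roles
# with an OAuth 2.0 bearer token (for example, from `gcloud auth print-access-token`).
```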
1.4. Adding a billing service account member to your Google Cloud project
You must create a billing service account member in your project that can export cost reports to Red Hat Hybrid Cloud Console.
Prerequisites
- Google Cloud project
- A hybrid committed spend Identity and Access Management (IAM) role
Procedure
- In the Google Cloud Console, click IAM & Admin → IAM.
- Select the hybrid committed spend project from the dropdown in the menu bar.
- Click Grant access.
- Paste the following Red Hat service account into the New principals field:
  billing-export@red-hat-cost-management.iam.gserviceaccount.com
- In the Assign roles section, assign the IAM role you created. In this example, use customer-data-role.
- Click Save.
- In the hybrid committed spend Add a cloud integration wizard, on the Assign access page, click Next.
Verification steps
- Navigate to IAM & Admin → IAM.
- Verify the new member is present with the correct role.
Additional resources
- For additional information about roles and their usage, see the Google Cloud documentation Understanding roles and Creating and managing custom roles.
1.5. Creating a Google Cloud BigQuery dataset
Create a BigQuery dataset to collect and store the billing data for hybrid committed spend.
Prerequisites
- Access to Google Cloud Console with the bigquery.datasets.create permission
- Google Cloud project
Procedure
- In the Google Cloud Console, click Big Data → BigQuery.
- Select the hybrid committed spend project in the Explorer panel.
- Click Create dataset.
- Enter a name for your dataset in the Dataset ID field. In this example, use CustomerData.
- Click CREATE DATASET.
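The dataset can also be created programmatically. The sketch below validates a dataset ID against BigQuery's documented naming rule (letters, digits, and underscores only); the client call at the end is illustrative, assumes a placeholder project ID, and is left commented out.

```python
# Sketch: validate a BigQuery dataset ID before creating the dataset programmatically.
import re

def valid_dataset_id(dataset_id: str) -> bool:
    """BigQuery dataset IDs may contain only letters, digits, and underscores."""
    return bool(re.fullmatch(r"[A-Za-z0-9_]{1,1024}", dataset_id))

print(valid_dataset_id("CustomerData"))   # the example dataset in this guide
print(valid_dataset_id("customer-data"))  # hyphens are not allowed in dataset IDs

# Uncomment to create the dataset (requires google-cloud-bigquery and credentials):
# from google.cloud import bigquery
# client = bigquery.Client(project="<project_id>")
# client.create_dataset("CustomerData", exists_ok=True)
```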
1.6. Exporting Google Cloud billing data to BigQuery
Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the hybrid committed spend BigQuery dataset.
Prerequisites
- Access to Google Cloud Console with the Billing Account Administrator role
- Google Cloud project
- Billing service member with the cost management Identity and Access Management (IAM) role
- BigQuery dataset
Procedure
- In the Google Cloud Console, click Billing → Billing export.
- Click the Billing export tab.
- Click Edit settings in the Detailed usage cost section.
- Select the hybrid committed spend Project and Billing export dataset you created from the dropdown menus.
- Click Save.
- In the hybrid committed spend Add a cloud integration wizard, on the Billing export page, click Next.
- In the hybrid committed spend Add a cloud integration wizard, on the Review details page, click Add.
Verification steps
- Verify that the Detailed usage cost section shows a checkmark with Enabled, and that the correct Project name and Dataset name are displayed.
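Once the export has produced data (which can take several hours), you can also sanity-check it in the BigQuery console. Google names the detailed usage cost table gcp_billing_export_resource_v1_<billing account ID>, with dashes in the billing account ID replaced by underscores. The helper below only builds the table name and a row-count query to paste into the console; the project ID and billing account ID shown are placeholders.

```python
# Sketch: build a sanity-check query against the detailed billing export table.
def export_table_name(project_id: str, dataset: str, billing_account_id: str) -> str:
    """Detailed exports land in gcp_billing_export_resource_v1_<ID>, dashes -> underscores."""
    suffix = billing_account_id.replace("-", "_")
    return f"{project_id}.{dataset}.gcp_billing_export_resource_v1_{suffix}"

def sanity_query(table: str) -> str:
    # Rows exported per day for the most recent partitions.
    return (
        f"SELECT DATE(_PARTITIONTIME) AS day, COUNT(*) AS rows_exported "
        f"FROM `{table}` GROUP BY day ORDER BY day DESC LIMIT 7"
    )

table = export_table_name("<project_id>", "CustomerData", "ABCDEF-123456-ABCDEF")
print(sanity_query(table))
```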
Chapter 2. Integrating filtered Google Cloud data into hybrid committed spend
You can configure a function script in Google Cloud that copies your cost exports to an object storage bucket that hybrid committed spend can access, filtering your data so that you share only a subset of your billing data with Red Hat.
You must have a Red Hat account user with Cloud Administrator permissions before you can add integrations to hybrid committed spend.
To configure your Google Cloud account to be a hybrid committed spend integration, you must complete the following tasks:
- Create a Google Cloud project for your hybrid committed spend data.
- Create a bucket for filtered reports.
- Create an Identity and Access Management (IAM) role that gives hybrid committed spend access to your filtered data.
- Have a billing service account member with the correct role to export your data to hybrid committed spend.
- Create a BigQuery dataset to contain the cost data.
- Create a billing export that sends the hybrid committed spend data to your BigQuery dataset.
Because you will complete some of the following steps in the Google Cloud Console, and some steps in the hybrid committed spend user interface, keep both applications open in a web browser.
Because third-party products and documentation can change, instructions for configuring the third-party integrations provided are general and correct at the time of publishing. For the most up-to-date information, see the Google Cloud Platform documentation.
Add your Google Cloud integration to hybrid committed spend from the Integrations page.
2.1. Adding your Google Cloud account as an integration
You can add your Google Cloud account as an integration. After adding a Google Cloud integration, the hybrid committed spend application processes the cost and usage data from your Google Cloud account and makes it viewable.
Prerequisites
- To add data integrations to cost management, you must have a Red Hat account with Cloud Administrator permissions.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- In the Add a cloud integration wizard, select Google Cloud as the cloud provider type and click Next.
- Enter a name for your integration. Click Next.
- In the Select application step, select Hybrid committed spend and click Next.
2.2. Creating a Google Cloud project
Create a Google Cloud project to gather and send your cost reports to hybrid committed spend.
Prerequisites
- Access to Google Cloud Console with the resourcemanager.projects.create permission
Procedure
- In the Google Cloud Console, click IAM & Admin → Create a Project.
- Enter a Project name on the new page that appears and select your billing account.
- Select the Organization.
- Enter the parent organization in the Location box.
- Click Create.
- In the hybrid committed spend Add a cloud integration wizard, on the Project page, enter your Project ID.
- To configure Google Cloud to filter your data before it sends the data to Red Hat, select I wish to manually customize the data set sent to hybrid committed spend and click Next.
Verification steps
- Navigate to the Google Cloud Console Dashboard.
- Verify the project appears in the menu bar.
Additional resources
- For additional information about creating projects, see the Google Cloud documentation Creating and managing projects.
2.3. Creating a Google Cloud bucket
Create a bucket for filtered reports that you will create later. Buckets are containers that store data.
Procedure
- In the Google Cloud Console, click Cloud Storage → Buckets.
- Click Create bucket.
- Enter your bucket information. Name your bucket. In this example, use customer-data.
- Click Create, then click Confirm in the confirmation dialog.
- In the hybrid committed spend Add a cloud integration wizard, on the Create cloud storage bucket page, enter your Cloud storage bucket name.
Additional resources
- For additional information about creating buckets, see the Google Cloud documentation on Creating buckets.
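Bucket creation fails if the name breaks Cloud Storage's naming requirements, so it can be worth checking the name first. The validator below covers a common subset of the documented rules (3-63 characters; lowercase letters, digits, dashes, underscores; must start and end with a letter or digit); the create_bucket call is illustrative only and is left commented out.

```python
# Sketch: validate a Cloud Storage bucket name before creating the bucket.
import re

def valid_bucket_name(name: str) -> bool:
    """Checks a common subset of Google's documented bucket-naming rules."""
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9_-]{1,61}[a-z0-9]", name))

print(valid_bucket_name("customer-data"))  # the example bucket in this guide
print(valid_bucket_name("Customer-Data"))  # uppercase letters are rejected

# Uncomment to create the bucket (requires google-cloud-storage and credentials):
# from google.cloud import storage
# storage.Client().create_bucket("customer-data")
```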
2.4. Creating a Google Cloud Identity and Access Management role
A custom Identity and Access Management (IAM) role for hybrid committed spend gives access to the specific cost-related resources required to enable a Google Cloud Platform integration and prohibits access to other resources.
Prerequisites
- Access to Google Cloud Console with these permissions:
  - resourcemanager.projects.get
  - resourcemanager.projects.getIamPolicy
  - resourcemanager.projects.setIamPolicy
- Google Cloud project
Procedure
- In the Google Cloud Console, click IAM & Admin → Roles.
- Select the hybrid committed spend project from the dropdown in the menu bar.
- Click Create role.
- Enter a Title, Description, and ID for the role. In this example, use customer-data-role.
- Click ADD PERMISSIONS.
- Use the Enter property name or value field to search for and select these three permissions for your custom role:
  - storage.objects.get
  - storage.objects.list
  - storage.buckets.get
- Click Add.
- Click Create.
- In the hybrid committed spend Add a cloud integration wizard, on the Create IAM role page, click Next.
Additional resources
- For additional information about roles and their usage, see the Google Cloud documentation Understanding roles and Creating and managing custom roles.
2.5. Adding a billing service account member to your Google Cloud project
You must create a billing service account member in your project that can export cost reports to Red Hat Hybrid Cloud Console.
Prerequisites
- Google Cloud project
- A hybrid committed spend Identity and Access Management (IAM) role
Procedure
- In the Google Cloud Console, click IAM & Admin → IAM.
- Select the hybrid committed spend project from the dropdown in the menu bar.
- Click Grant access.
- Paste the following Red Hat service account into the New principals field:
  billing-export@red-hat-cost-management.iam.gserviceaccount.com
- In the Assign roles section, assign the IAM role you created. In this example, use customer-data-role.
- Click Save.
- In the hybrid committed spend Add a cloud integration wizard, on the Assign access page, click Next.
Verification steps
- Navigate to IAM & Admin → IAM.
- Verify the new member is present with the correct role.
Additional resources
- For additional information about roles and their usage, see the Google Cloud documentation Understanding roles and Creating and managing custom roles.
2.6. Creating a Google Cloud BigQuery dataset
Create a BigQuery dataset to collect and store the billing data for hybrid committed spend.
Prerequisites
- Access to Google Cloud Console with the bigquery.datasets.create permission
- Google Cloud project
Procedure
- In the Google Cloud Console, click Big Data → BigQuery.
- Select the hybrid committed spend project in the Explorer panel.
- Click Create dataset.
- Enter a name for your dataset in the Dataset ID field. In this example, use CustomerFilteredData.
- Click CREATE DATASET.
2.7. Exporting Google Cloud billing data to BigQuery
Enabling a billing export to BigQuery sends your Google Cloud billing data (such as usage, cost estimates, and pricing data) automatically to the hybrid committed spend BigQuery dataset.
Prerequisites
- Access to Google Cloud Console with the Billing Account Administrator role
- Google Cloud project
- Billing service member with the cost management Identity and Access Management (IAM) role
- BigQuery dataset
Procedure
- In the Google Cloud Console, click Billing → Billing export.
- Click the Billing export tab.
- Click Edit settings in the Detailed usage cost section.
- Select the hybrid committed spend Project and Billing export dataset you created from the dropdown menus.
- Click Save.
- In the hybrid committed spend Add a cloud integration wizard, on the Billing export page, click Next.
- In the hybrid committed spend Add a cloud integration wizard, on the Review details page, click Add.
Verification steps
- Verify that the Detailed usage cost section shows a checkmark with Enabled, and that the correct Project name and Dataset name are displayed.
2.8. Creating a function to post filtered data to your storage bucket
Create a function that filters your data and adds it to the storage account that you created to share with Red Hat. You can use the example Python script to gather the cost data from your cost exports related to your Red Hat expenses and add it to the storage account. This script filters the cost data you created with BigQuery, removes non-Red Hat information, then creates .csv files, stores them in the bucket you created, and sends the data to Red Hat.
Procedure
- In the Google Cloud Console, search for secret and select the Secret Manager result to set up a secret that authenticates your function with Red Hat without storing your credentials in your function.
- On the Secret Manager page, click Create Secret.
- Name your secret, add your Red Hat username, and click Create Secret.
- Repeat this process to save a secret for your Red Hat password.
- In the Google Cloud Console search bar, search for functions and select the Cloud Functions result.
- On the Cloud Functions page, click Create function.
- Name the function. In this example, use customer-data-function.
- In the Trigger section, click Save to accept the HTTP Trigger type.
- In the Runtime, build, connections and security settings section, click the Security tab, reference the secrets you created, click Done, and click Next.
- On the Cloud Functions Code page, set the runtime to Python 3.9.
- Open the requirements.txt file. Paste the following lines at the end of the file:
  requests
  google-cloud-bigquery
  google-cloud-storage
- Open the main.py file.
- Set the Entry Point to get_filtered_data.
- Paste the following Python script. Change the values in the section marked # Required vars to update to the values for your environment.

  import csv
  import datetime
  import uuid
  import os
  import requests
  from google.cloud import bigquery
  from google.cloud import storage
  from itertools import islice
  from dateutil.relativedelta import relativedelta

  query_range = 5
  now = datetime.datetime.now()
  delta = now - relativedelta(days=query_range)
  year = now.strftime("%Y")
  month = now.strftime("%m")
  day = now.strftime("%d")
  report_prefix = f"{year}/{month}/{day}/{uuid.uuid4()}"

  # Required vars to update
  USER = os.getenv('username')         # Cost management username
  PASS = os.getenv('password')         # Cost management password
  INTEGRATION_ID = "<integration_id>"  # Cost management integration_id
  BUCKET = "<bucket>"                  # Filtered data GCP Bucket
  PROJECT_ID = "<project_id>"          # Your project ID
  DATASET = "<dataset>"                # Your dataset name
  TABLE_ID = "<table_id>"              # Your table ID

  gcp_big_query_columns = [
      "billing_account_id",
      "service.id",
      "service.description",
      "sku.id",
      "sku.description",
      "usage_start_time",
      "usage_end_time",
      "project.id",
      "project.name",
      "project.labels",
      "project.ancestry_numbers",
      "labels",
      "system_labels",
      "location.location",
      "location.country",
      "location.region",
      "location.zone",
      "export_time",
      "cost",
      "currency",
      "currency_conversion_rate",
      "usage.amount",
      "usage.unit",
      "usage.amount_in_pricing_units",
      "usage.pricing_unit",
      "credits",
      "invoice.month",
      "cost_type",
      "resource.name",
      "resource.global_name",
  ]
  table_name = ".".join([PROJECT_ID, DATASET, TABLE_ID])
  BATCH_SIZE = 200000

  def batch(iterable, n):
      """Yields successive n-sized chunks from iterable"""
      it = iter(iterable)
      while chunk := tuple(islice(it, n)):
          yield chunk

  def build_query_select_statement():
      """Helper to build query select statement."""
      columns_list = gcp_big_query_columns.copy()
      columns_list = [
          f"TO_JSON_STRING({col})" if col in ("labels", "system_labels", "project.labels", "credits") else col
          for col in columns_list
      ]
      columns_list.append("DATE(_PARTITIONTIME) as partition_date")
      return ",".join(columns_list)

  def create_reports(query_date):
      query = f"SELECT {build_query_select_statement()} FROM {table_name} WHERE DATE(_PARTITIONTIME) = '{query_date}' AND (sku.description LIKE '%RedHat%' OR sku.description LIKE '%Red Hat%' OR service.description LIKE '%Red Hat%') ORDER BY usage_start_time"
      client = bigquery.Client()
      query_job = client.query(query).result()
      column_list = gcp_big_query_columns.copy()
      column_list.append("partition_date")
      daily_files = []
      storage_client = storage.Client()
      bucket = storage_client.bucket(BUCKET)
      for i, rows in enumerate(batch(query_job, BATCH_SIZE)):
          csv_file = f"{report_prefix}/{query_date}_part_{str(i)}.csv"
          daily_files.append(csv_file)
          blob = bucket.blob(csv_file)
          with blob.open(mode='w') as f:
              writer = csv.writer(f)
              writer.writerow(column_list)
              writer.writerows(rows)
      return daily_files

  def post_data(files_list):
      # Post CSVs to console.redhat.com API
      url = "https://console.redhat.com/api/cost-management/v1/ingress/reports/"
      json_data = {"source": INTEGRATION_ID, "reports_list": files_list, "bill_year": year, "bill_month": month}
      resp = requests.post(url, json=json_data, auth=(USER, PASS))
      return resp

  def get_filtered_data(request):
      files_list = []
      query_dates = [delta + datetime.timedelta(days=x) for x in range(query_range)]
      for query_date in query_dates:
          files_list += create_reports(query_date.date())
      resp = post_data(files_list)
      return f'Files posted! {resp}'
- Click Deploy.
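Before deploying, you can exercise the script's pure query-building logic locally. The snippet below mirrors the build_query_select_statement helper with a shortened column list for illustration: struct and repeated columns such as labels and credits must be serialized with TO_JSON_STRING so they fit in a flat CSV cell.

```python
# Sketch: check the select-statement construction from the function script locally.
columns = ["billing_account_id", "labels", "cost", "credits"]

def build_select(cols):
    # Struct/repeated columns are serialized so each value fits one CSV cell.
    json_cols = ("labels", "system_labels", "project.labels", "credits")
    parts = [f"TO_JSON_STRING({c})" if c in json_cols else c for c in cols]
    parts.append("DATE(_PARTITIONTIME) as partition_date")
    return ",".join(parts)

print(build_select(columns))
# billing_account_id,TO_JSON_STRING(labels),cost,TO_JSON_STRING(credits),DATE(_PARTITIONTIME) as partition_date
```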
2.9. Triggering your function to post filtered data to your storage bucket
Create a scheduler job to run the function you created to send filtered data to Red Hat on a schedule.
Procedure
- Copy the Trigger URL for the function you created to post the cost reports. You will need to add it to the Google Cloud Scheduler.
  - In the Google Cloud Console, search for functions and select the Cloud Functions result.
  - On the Cloud Functions page, select your function, and click the Trigger tab.
  - In the HTTP section, click Copy to clipboard.
- Create the scheduler job. In the Google Cloud Console, search for cloud scheduler and select the Cloud Scheduler result.
- Click Create job.
- Name your scheduler job. In this example, use CustomerFilteredDataSchedule.
- In the Frequency field, set the cron expression for when you want the function to run. In this example, use 0 9 * * * to run the function daily at 9 AM.
- Set the timezone and click Continue.
Configure the execution on the next page.
- In the Target type field, select HTTP.
- In the URL field, paste the Trigger URL you copied.
In the Body field, paste the following code, which passes into the function to trigger it:
{"name": "Scheduler"}
- In the Auth header field, select Add OIDC token.
- Click the Service account field and click Create to create a service account and role for the scheduler job.
- In the Service account details step, name your service account. In this example, use scheduler-service-account. Accept the default Service account ID and click Create and Continue.
- In the Grant this service account access to project section, select two roles for your account.
- Click ADD ANOTHER ROLE, then search for and select Cloud Scheduler Job Runner and Cloud Functions Invoker.
and Cloud Functions Invoker. - Click Continue.
- Click Done to finish creating the service account.
- On the Service accounts for your project page, select the scheduler job that you were working on. In this example, the name is scheduler-service-account.
- On the Configure the execution page, select the Service account field and select the scheduler-service-account you just created.
- Click Continue and then click Create.
Providing feedback on Red Hat documentation
We appreciate your feedback on our documentation. Provide as much detail as possible so that your request can be addressed quickly.
Prerequisites
- You are logged in to the Red Hat Customer Portal.
Procedure
To provide feedback, perform the following steps:
- Click the following link: Create Issue.
- Describe the issue or enhancement in the Summary text box.
- Provide details about the issue or requested enhancement in the Description text box.
- Type your name in the Reporter text box.
- Click the Create button.
This action creates a documentation ticket and routes it to the appropriate documentation team. Thank you for taking the time to provide feedback.