Chapter 2. Filtering your Microsoft Azure data before integrating it into cost management
To share a subset of your billing data with Red Hat, you can configure a function script in Microsoft Azure. This script copies your cost exports to an object storage bucket that cost management can then access and filter.
To integrate your Microsoft Azure account:
- Create a storage account and resource group.
- Configure Storage Account Contributor and Reader roles for access.
- Create a function to filter the data you want to send to Red Hat.
- Schedule daily cost exports to a storage account accessible to Red Hat.
Because third-party products and documentation can change, the instructions provided for configuring these third-party integrations are general and were correct at the time of publishing. For the most up-to-date information, see the Microsoft Azure documentation.
Add your Microsoft Azure integration to cost management from the Integrations page.
2.1. Adding a Microsoft Azure account and naming your integration
Add your Microsoft Azure account as an integration so cost management can process the cost and usage data.
Procedure
- From Red Hat Hybrid Cloud Console, click Settings Menu > Integrations.
- On the Settings page, in the Cloud tab, click Add integration.
- In the Add a cloud integration wizard, select Microsoft Azure as the cloud provider type and click Next.
- Enter a name for your integration and click Next.
- In the Select application step, select Cost management and click Next.
- In the Specify cost export scope step, select I wish to manually customize the data set sent to Cost Management and click Next.
2.2. Creating a Microsoft Azure resource group and storage account
Create a storage account in Microsoft Azure to house your billing exports and a second storage account to house your filtered data so that cost management can collect the information. In the Add a cloud integration wizard in cost management, enter the resource group name and storage account name in the fields in the Resource group and storage account page.
Prerequisites
- You must have a Red Hat user account with Cloud Administrator entitlements.
Procedure
- In your Microsoft Azure account, search for storage and click Storage accounts.
- On the Storage accounts page, click Create.
- In the Resource Group field, click Create new. Enter a name, and click OK. In this example, use filtered-data-group.
- In the Instance details section, enter a name in the Storage account name field. For example, use filtereddata.
- Copy the names of the resource group and storage account so you can add them to the Add a cloud integration wizard in Red Hat Hybrid Cloud Console, and then click Review.
- Review the storage account and click Create.
- In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, on the Resource group and storage account page, enter values in the Resource group name and Storage account name fields.
- Click Next.
2.3. Finding your Microsoft Azure subscription ID
Find your subscription_id in the Microsoft Azure Cloud Shell and add it to the Add a cloud integration wizard in cost management.
Procedure
- In your Microsoft Azure account, click Cloud Shell.
- Enter the following command to obtain your subscription ID:
az account show --query "{subscription_id: id }"
- Copy the value for the subscription_id from the returned data.
Example response
{ "subscription_id": "00000000-0000-0000-0000-000000000000" }
- Paste that value in the Subscription ID field on the Subscription ID page in the Add a cloud integration wizard.
- Click Next.
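If you capture this step in a script instead of copying the value by hand, the JSON that the az command prints can be parsed directly. A minimal sketch; the all-zeros GUID is a placeholder, not a real subscription ID:

```python
import json

# Placeholder output in the shape returned by:
#   az account show --query "{subscription_id: id }"
raw = '{"subscription_id": "00000000-0000-0000-0000-000000000000"}'

# Parse the JSON and pull out the subscription ID
subscription_id = json.loads(raw)["subscription_id"]
print(subscription_id)
```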
2.4. Creating Microsoft Azure roles for your storage account
Use the Microsoft Azure Cloud Shell to find your Tenant (Directory) ID, Client (Application) ID, and Client secret.
Procedure
- In your Microsoft Azure account, click Cloud Shell.
- Enter the following command to get your client ID, secret, and tenant name. Replace {subscriptionId} with your subscription ID from the previous step and {resourceGroup1} with the name of the resource group that you created earlier. In this example, use filtered-data-group.
az ad sp create-for-rbac -n "CostManagement" --role "Storage Account Contributor" --scope /subscriptions/{subscriptionId}/resourceGroups/{resourceGroup1} --query '{"tenant": tenant, "client_id": appId, "secret": password}'
- Copy the values for the client_id, secret, and tenant from the returned data.
Example response
{ "client_id": "00000000-0000-0000-0000-000000000000", "secret": "00000000-0000-0000-0000-000000000000", "tenant": "00000000-0000-0000-0000-000000000000" }
- Paste the values of client_id, secret, and tenant in the Roles step in the Add a cloud integration wizard in Red Hat Hybrid Cloud Console.
- Run the following command in the Cloud Shell to assign the Cost Management Reader role, replacing {Client_ID} with the client_id value from the previous step:
az role assignment create --assignee {Client_ID} --role "Cost Management Reader"
- Click Next.
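The --scope argument in the role command above is a resource path built from your subscription ID and resource group name. A small sketch with placeholder values shows its shape:

```python
# Placeholder values; substitute your own subscription ID and resource group.
subscription_id = "00000000-0000-0000-0000-000000000000"
resource_group = "filtered-data-group"

# Shape of the --scope argument passed to `az ad sp create-for-rbac`
scope = f"/subscriptions/{subscription_id}/resourceGroups/{resource_group}"
print(scope)
```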
2.5. Creating a daily export in Microsoft Azure
Create a function in Microsoft Azure to filter your data and export it on a regular schedule. Exports create a recurring task that sends your Microsoft Azure cost data regularly to a storage account, which exists within a resource group. Cost management must be able to access the resource group to read the Microsoft Azure cost data. This example uses a Python function to filter the data and post it to the storage account you created earlier.
Procedure
- To create the export, go to the Portal menu in Microsoft Azure and click Cost Management + Billing.
- On the Cost Management + Billing page, click Cost Management.
- In the Settings menu, on the Cost management overview page, click Exports.
- To add an export, click Add.
- In the Export details section, name the export.
- In the Storage section, add the resource group you created.
2.6. Creating a function in Microsoft Azure to filter your data
Create the function that filters your data and adds it to the storage account that you created to share with Red Hat. You can use the example Python script to gather the cost data from your cost exports related to your Red Hat expenses and add it to the storage account.
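The example script's filter keeps Marketplace line items attributable to Red Hat. To see what that filter does, you can run the same pandas expression against a few invented rows; the data below is made up, and the column names follow the Azure cost export schema that the script expects:

```python
import pandas as pd

# Invented cost-export rows; columns follow the Azure cost export schema.
df = pd.DataFrame({
    "publisherType": ["Marketplace", "Marketplace", "Azure"],
    "publisherName": ["Red Hat", "Microsoft", "Contoso"],
    "meterSubCategory": ["", "Red Hat Enterprise Linux", ""],
    "serviceInfo2": ["", "", ""],
})

# Keep Marketplace rows published by Red Hat, or Microsoft/Azure rows
# whose meter or service info mentions Red Hat.
filtered = df.loc[
    (df["publisherType"] == "Marketplace")
    & (
        df["publisherName"].astype(str).str.contains("Red Hat")
        | (
            ((df["publisherName"] == "Microsoft") | (df["publisherName"] == "Azure"))
            & (
                df["meterSubCategory"].astype(str).str.contains("Red Hat")
                | df["serviceInfo2"].astype(str).str.contains("Red Hat")
            )
        )
    )
]
print(len(filtered))  # the first two rows match; the Contoso row is dropped
```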
Prerequisites
- You must have Visual Studio Code installed on your device.
- You must have the Microsoft Azure functions extension installed in Visual Studio Code.
Procedure
- Log in to your Microsoft Azure account.
- Enter functions in the search bar, select Functions, and click Create.
- Select a hosting option for your function and continue.
- On the Create Function App page, configure your function app by adding your resource group.
- In the Instance Details section, name your function app.
- In Runtime stack, select Python.
- In Version, select 3.10.
- Click Review + create.
- Click Create.
- Click Go to resource to configure the function.
- In the function app menu, click Functions to create a timer trigger function.
- Click Create.
- In the development environment field, select VSCode.
- Open Visual Studio Code and ensure that the Microsoft Azure Functions extension is installed. Microsoft recommends that you use Visual Studio Code to develop and deploy Azure functions. For more information about configuring Visual Studio Code, see Quickstart: Create a function in Azure with Python using Visual Studio Code.
- Click the Microsoft Azure tab in Visual Studio Code and sign in to Azure.
- In the workspaces tab in Visual Studio Code, click Create function.
- Follow the prompts to set a local location for your function and to select a language and version. In this example, select Python 3.9.
- In the Select a template for your project's first function dialog, select Timer trigger, name the function, and press Enter.
- Set the cron expression for when you want the function to run. Azure Functions timer triggers use a six-field NCRONTAB expression ({second} {minute} {hour} {day} {month} {day-of-week}). In this example, use 0 0 9 * * * to run the function daily at 9 AM.
- Click Create.
- After you create the function in your development environment, open the requirements.txt file, add the following requirements, and save the file:
azure-functions
pandas
requests
azure-identity
azure-storage-blob
- Open __init__.py and paste the following Python script. Change the values in the section marked # Required vars to update to the values for your environment. For the USER and PASS values, you can optionally use Key Vault Credentials to configure your username and password as environment variables.
import datetime
import logging
import os
import uuid

import requests
import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient, ContainerClient
import azure.functions as func


def main(mytimer: func.TimerRequest) -> None:
    utc_timestamp = datetime.datetime.utcnow().replace(
        tzinfo=datetime.timezone.utc).isoformat()

    default_credential = DefaultAzureCredential()

    now = datetime.datetime.now()
    year = now.strftime("%Y")
    month = now.strftime("%m")
    day = now.strftime("%d")
    output_blob_name = f"{year}/{month}/{day}/{uuid.uuid4()}.csv"

    # Required vars to update
    USER = os.getenv('UsernameFromVault')  # Cost management username
    PASS = os.getenv('PasswordFromVault')  # Cost management password
    integration_id = "<your_integration_id>"  # Cost management integration_id
    cost_export_store = "https://<your-cost-export-storage-account>.blob.core.windows.net"  # Cost export storage account url
    cost_export_container = "<your-cost-export-container>"  # Cost export container
    filtered_data_store = "https://<your_filtered_data_container-storage-account>.blob.core.windows.net"  # Filtered data storage account url
    filtered_data_container = "<your_filtered_data_container>"  # Filtered data container

    # Create the BlobServiceClient object
    blob_service_client = BlobServiceClient(filtered_data_store, credential=default_credential)
    container_client = ContainerClient(cost_export_store, credential=default_credential, container_name=cost_export_container)

    # Find the most recently modified cost export blob
    blob_list = container_client.list_blobs()
    latest_blob = None
    for blob in blob_list:
        if latest_blob:
            if blob.last_modified > latest_blob.last_modified:
                latest_blob = blob
        else:
            latest_blob = blob

    # Download the latest export to a temporary file
    bc = container_client.get_blob_client(blob=latest_blob)
    data = bc.download_blob()
    blobjct = "/tmp/blob.csv"
    with open(blobjct, "wb") as f:
        data.readinto(f)
    df = pd.read_csv(blobjct)

    # Keep only Marketplace rows attributable to Red Hat
    filtered_data = df.loc[
        ((df["publisherType"] == "Marketplace")
         & ((df["publisherName"].astype(str).str.contains("Red Hat"))
            | (((df["publisherName"] == "Microsoft") | (df["publisherName"] == "Azure"))
               & (df['meterSubCategory'].astype(str).str.contains("Red Hat")
                  | df['serviceInfo2'].astype(str).str.contains("Red Hat")))))]

    filtered_data_csv = filtered_data.to_csv(index_label="idx", encoding="utf-8")

    # Upload the filtered data to the shared storage account
    blob_client = blob_service_client.get_blob_client(container=filtered_data_container, blob=output_blob_name)
    blob_client.upload_blob(filtered_data_csv, overwrite=True)

    # Post results to console.redhat.com API
    url = "https://console.redhat.com/api/cost-management/v1/ingress/reports/"
    json_data = {"source": integration_id,
                 "reports_list": [f"{filtered_data_container}/{output_blob_name}"],
                 "bill_year": year,
                 "bill_month": month}
    resp = requests.post(url, json=json_data, auth=(USER, PASS))
    logging.info(f'Post result: {resp}')

    if mytimer.past_due:
        logging.info('The timer is past due!')

    logging.info('Python timer trigger function ran at %s', utc_timestamp)
- Save the file.
- Deploy the function to Microsoft Azure.
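Before deploying, you can sanity-check the newest-blob selection logic from the script locally with stand-in objects. The Blob dataclass below is an invented stand-in for the objects that the Azure SDK's list_blobs() returns, not the SDK type itself:

```python
from dataclasses import dataclass
from datetime import datetime

# Invented stand-in for the blob objects returned by list_blobs();
# only the attributes the selection loop uses are modeled.
@dataclass
class Blob:
    name: str
    last_modified: datetime

blob_list = [
    Blob("2024/01/13/a.csv", datetime(2024, 1, 13)),
    Blob("2024/01/15/c.csv", datetime(2024, 1, 15)),
    Blob("2024/01/14/b.csv", datetime(2024, 1, 14)),
]

# Same selection logic as the function: keep the most recently modified blob.
latest_blob = None
for blob in blob_list:
    if latest_blob:
        if blob.last_modified > latest_blob.last_modified:
            latest_blob = blob
    else:
        latest_blob = blob

print(latest_blob.name)  # the blob modified last, regardless of list order
```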
2.7. Configuring Microsoft Azure roles
Configure dedicated credentials to grant your function blob access to Microsoft Azure cost data so it can transfer the data from the original storage container to the filtered storage container.
Procedure
- In your Microsoft Azure account, type functions in the search bar.
- Find your function and select it.
- In the Settings menu, click Identity.
- On the Identity page, click Azure role assignments.
- On the Role assignments page, click Add role assignment.
- In the Scope field, select the Storage scope.
- In the Resource field, select the storage account that you created. In this example, use filtereddata.
- In the Role field, select Storage Blob Data Contributor.
- Click Save.
- Repeat these steps to create a role for Storage Queue Data Contributor.
- Repeat this process for the other storage account that you created. In this example, use billingexportdata.
- In the Add a cloud integration wizard in Red Hat Hybrid Cloud Console, click Next.
- Review the information you provided in the wizard and click Add.