Chapter 2. Filtering your Microsoft Azure data before integrating it into hybrid committed spend
To share a subset of your billing data with Red Hat, you can configure a function script in Microsoft Azure. This script copies your filtered exports to an object storage bucket that hybrid committed spend can then access.
To integrate your Microsoft Azure account:
- Create a storage account and resource group.
- Configure Storage Account Contributor and Reader roles for access.
- Create a function to filter the data you want to send to Red Hat.
- Schedule daily cost exports to a storage account accessible to Red Hat.
If you are using RHEL metering, after you integrate your data with cost management, go to Adding RHEL metering to a Microsoft Azure integration to finish configuring your integration for RHEL metering.
Because third-party products and documentation can change, the instructions for configuring these third-party integrations are general and correct at the time of publishing. For the most up-to-date information, see the Microsoft Azure documentation.
Add your Microsoft Azure integration to hybrid committed spend from the Integrations page.
2.1. Adding a Microsoft Azure account
Add your Microsoft Azure account as an integration so that hybrid committed spend can process the cost and usage data.
Prerequisites
- You must have a Red Hat user account with Cloud Administrator entitlements.
- You must have a service account.
- Your service account must have the correct roles assigned in Hybrid Cloud Console to enable cost management access. For more information, see the User Access Configuration Guide.
In cost management:
- Click Settings > Integrations.
- In the Cloud tab, click Add integration.
- In the Add a cloud integration wizard, select Microsoft Azure and click Next.
- Enter a name for your integration and click Next.
- In the Select application step, select Hybrid committed spend and click Next.
- In the Specify cost export scope step, select I wish to manually customize the data set sent to Cost Management.
- If you are registering RHEL usage billing, select Include RHEL usage. Otherwise, proceed to the next step.
- Click Next.
2.2. Creating a Microsoft Azure resource group and storage account
Create a storage account in Microsoft Azure to house your billing exports and a second storage account to house your filtered data.
In your Microsoft Azure account:
- In the search bar, enter "storage" and select Storage accounts.
- On the Storage accounts page, click Create.
- In the Resource group field, click Create new. Enter a name and click OK. In this example, use `filtered-data-group`.
- In the Instance details section, enter a name in the Storage account name field. For example, use `filtereddata`.
- Copy the names of the resource group and storage account so you can add them to the Add a cloud integration wizard in Red Hat Hybrid Cloud Console, and then click Review.
- Review the storage account and click Create.
In cost management:
- In the Add a cloud integration wizard, paste the resource group and storage account names that you copied into Resource group name and Storage account name.
You will continue using the wizard in the following sections.
2.3. Finding your Microsoft Azure subscription ID
Find your subscription_id in the Microsoft Azure Cloud Shell and add it to the Add a cloud integration wizard in hybrid committed spend.
In your Microsoft Azure account:
- Click Cloud Shell.
- Enter the following command to get your subscription ID:

    ```
    az account show --query "{subscription_id: id}"
    ```

- Copy the value that is generated for `subscription_id`.

    Example response:

    ```
    {
      "subscription_id": "00000000-0000-0000-0000-000000000000"
    }
    ```
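If you script this step instead of copying the value by hand, the command's JSON output can be parsed with Python's standard library. This is an illustrative sketch only; the ID below is the documentation placeholder, and in a real script you would capture the output of the `az` command (for example with `subprocess.run`) rather than hard-coding it:

```python
import json

# Placeholder for the JSON printed by:
#   az account show --query "{subscription_id: id}"
response = '{"subscription_id": "00000000-0000-0000-0000-000000000000"}'

# Parse the response and pull out the subscription ID field.
subscription_id = json.loads(response)["subscription_id"]
print(subscription_id)  # → 00000000-0000-0000-0000-000000000000
```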
In cost management:
- In the Subscription ID field of the Add a cloud integration wizard, paste the value that you copied in the previous step.
- Click .
You will continue using the wizard in the following sections.
2.4. Creating Microsoft Azure roles for Red Hat access
To grant Red Hat access to your data, you must configure dedicated roles in Microsoft Azure. If you have an additional resource under the same Azure subscription, you might not need to create a new service account.
In cost management:
- In the Roles section of the Add a cloud integration wizard, copy the `az ad sp create-for-rbac` command to create a service principal with the Cost Management Storage Account Contributor role.
In your Microsoft Azure account:
- Click Cloud Shell.
- In the cloud shell prompt, paste the command that you copied.
- Copy the values that are generated for the client ID, secret, and tenant:

    Example response:

    ```
    {
      "client_id": "00000000-0000-0000-0000-000000000000",
      "secret": "00000000-0000-0000-0000-000000000000",
      "tenant": "00000000-0000-0000-0000-000000000000"
    }
    ```
In cost management:
- Return to the Add a cloud integration wizard and paste the values that you copied into their corresponding fields on the Roles page.
- Click Next.
- Review your information and click Add to complete your integration.
- In the pop-up screen that appears, copy the Source UUID for your function script.
2.5. Creating a daily export in Microsoft Azure
Next, set up an automatic export of your cost data to your Microsoft Azure storage account before you filter it for cost management.
In your Microsoft Azure account:
- In the search bar, enter "cost exports" and click the result.
- Click Create.
- In Select a template, click Cost and usage (actual) to export your standard usage and purchase charges.
Follow the steps in the Azure wizard:
- You can either create a new resource group and storage account or select existing ones. In this example, we use `billingexportdata` for the storage account and `billinggroup` for the resource group.
- You must set Format to CSV.
- Set Compression type to None or Gzip.
- Review the information and click Create.
In cost management:
- Return to the Add a cloud integration wizard and complete the steps in Daily export.
- Click Next.
You will continue using the wizard in the following sections.
2.6. Creating a function in Microsoft Azure
Creating a function in Azure filters your data and adds it to the storage account that you created to share with Red Hat. You can use the example Python script in this section to gather and share the filtered cost data from your export.
Prerequisites
- You must have Visual Studio Code installed on your device.
- You must have the Microsoft Azure functions extension installed in Visual Studio Code. To create an Azure function, Microsoft recommends that you use their Microsoft Visual Studio Code IDE to develop and deploy code. For more information about configuring Visual Studio Code, see Quickstart: Create a function in Azure with Python using Visual Studio Code .
In your Microsoft Azure account:
- Enter "functions" in the search bar and select Function App.
- Click Create.
- Select a hosting option for your function and click .
- On the Create Function App page, add your resource group.
- In the Instance Details section, name your function app.
- In Runtime stack, select Python.
- In Version, select latest.
- Click Review + create.
- Click Create.
- Wait for the resource to be created and then click Go to resource to view it.
In Visual Studio Code:
- Click the Microsoft Azure tab and sign in to Azure.
- In the Workspaces drop-down, click the icon with the orange lightning bolt.
- Click Create Function.
- Follow the prompts to set a local location and select a language and version for your function. In this example, select Python, Model 2 and the latest version available.
- In Select a template for your function dialog, select Timer trigger, name the function, and then press enter.
- Set the cron expression to control when the function runs. In this example, use `0 0 9 * * *` to run the function daily at 9 AM. Azure timer triggers use six-field NCRONTAB expressions, which include a seconds field.
- Complete the remaining prompts to finish creating the function project.
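The NCRONTAB format used by Azure Functions timer triggers has six fields ({second} {minute} {hour} {day} {month} {day-of-week}), unlike five-field Unix cron. A small stdlib-only sketch of how the fields of the example schedule break down (the helper name is illustrative, not part of any Azure SDK):

```python
def describe_ncrontab(expr: str) -> dict:
    """Split a six-field NCRONTAB expression into named fields.

    Illustrative helper only -- not an Azure SDK function.
    """
    fields = expr.split()
    if len(fields) != 6:
        raise ValueError(f"NCRONTAB expressions have 6 fields, got {len(fields)}")
    names = ["second", "minute", "hour", "day", "month", "day_of_week"]
    return dict(zip(names, fields))

# "0 0 9 * * *" fires when second == 0, minute == 0, hour == 9: daily at 9 AM.
schedule = describe_ncrontab("0 0 9 * * *")
print(schedule["hour"])  # → 9
```

A five-field Unix-style expression such as `0 9 * * *` is rejected by this check, which mirrors why the seconds field must be included in the Azure trigger schedule.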
In your requirements.txt file:
- After you create the function in your development environment, open the requirements.txt file, add the following requirements, and save the file:
    ```
    azure-functions
    pandas
    requests
    azure-identity
    azure-storage-blob
    ```
In `__init__.py`:
- Copy the Python script and paste it into `__init__.py`.
- Change the values in the section marked `# Required vars to update` to the values that correspond to your environment.
- The example script uses secrets from Azure Key Vault to configure your service account `client_id` and `client_secret` as environment variables. You can alternatively enter your credentials directly into the script, although this is not best practice.
- The default script has built-in options for general data filtering and for RHEL subscription filtering. You must uncomment the type of filtering you want to use or write your own custom filtering. Remove the comment from one of the following, not both:
    - `filtered_data = hcs_filtering(df)`
    - `filtered_data = rhel_filtering(df)`
- If you want to write more customized filtering, you must include the following required columns:

    ```
    'additionalinfo', 'billingaccountid', 'billingaccountname', 'billingcurrencycode', 'billingperiodenddate', 'billingperiodstartdate', 'chargetype', 'consumedservice', 'costinbillingcurrency', 'date', 'effectiveprice', 'metercategory', 'meterid', 'metername', 'meterregion', 'metersubcategory', 'offerid', 'productname', 'publishername', 'publishertype', 'quantity', 'reservationid', 'reservationname', 'resourcegroup', 'resourceid', 'resourcelocation', 'resourcename', 'servicefamily', 'serviceinfo1', 'serviceinfo2', 'subscriptionid', 'tags', 'unitofmeasure', 'unitprice'
    ```

    Some of the columns differ depending on the report type. The example script normalizes these columns, and all filtered reports must follow this example:

    ```
    column_translation = {"billingcurrency": "billingcurrencycode", "currency": "billingcurrencycode", "instanceid": "resourceid", "instancename": "resourceid", "pretaxcost": "costinbillingcurrency", "product": "productname", "resourcegroupname": "resourcegroup", "subscriptionguid": "subscriptionid", "servicename": "metercategory", "usage_quantity": "quantity"}
    ```

- To filter the data, you must add dataframe filtering. For example:
    - Exact matching: `df.loc[(df["publishertype"] == "Marketplace")]` filters out all data that does not have a `publishertype` of Marketplace.
    - Contains: `df.loc[df["publishername"].astype(str).str.contains("Red Hat")]` filters out all data that does not contain "Red Hat" in the `publishername` column.
    - You can stack filters by combining conditions in your `df.loc` clause with `&` (for AND) and `|` (for OR).
- More useful filters:
    - `subscriptionid` - Filters specific subscriptions.
    - `resourcegroup` - Filters specific resource groups.
    - `resourcelocation` - Filters data in a specific region.
    - You can use `servicename`, `servicetier`, `metercategory`, and `metersubcategory` to filter specific service types.
- After you build your custom query, update the custom query in the example script under `# custom filtering basic example #`.
- Save the file.
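As a rough sketch of the filtering approach described above: normalize the report-specific column names with `column_translation`, then stack `df.loc` conditions. This is an illustrative stand-in for the example script, not the script itself, and the tiny DataFrame is made-up sample data:

```python
import pandas as pd

# Normalization mapping from the section above (abridged to the columns used here).
column_translation = {
    "subscriptionguid": "subscriptionid",
    "pretaxcost": "costinbillingcurrency",
}

def hcs_filtering(df: pd.DataFrame) -> pd.DataFrame:
    """Sketch: normalize column names, then keep only Red Hat Marketplace rows."""
    df = df.rename(columns=column_translation)
    # Stacked filters: publishertype must be "Marketplace" AND
    # publishername must contain "Red Hat".
    return df.loc[
        (df["publishertype"] == "Marketplace")
        & df["publishername"].astype(str).str.contains("Red Hat")
    ]

# Made-up sample export using report-specific column names.
df = pd.DataFrame({
    "subscriptionguid": ["s1", "s2", "s3"],
    "publishertype": ["Marketplace", "Marketplace", "Azure"],
    "publishername": ["Red Hat", "Other Vendor", "Red Hat"],
    "pretaxcost": [1.0, 2.0, 3.0],
})
filtered_data = hcs_filtering(df)
print(filtered_data["subscriptionid"].tolist())  # → ['s1']
```

Only the first row survives: the second fails the `publishername` condition and the third fails the `publishertype` condition.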
In Visual Studio Code:
- Right-click the function window and click Deploy to Function App.
- Select the function app that you created in the previous steps.
2.7. Configuring function roles in Microsoft Azure
Configure dedicated credentials to grant your function blob access to Microsoft Azure cost data. These credentials enable your function to access, filter, and transfer the data from the original storage container to the filtered storage container.
In your Microsoft Azure account:
- Enter "functions" in the search bar and select your function.
- In the Settings menu, click Identity.
Complete the following steps twice, once for each of the two storage accounts that you created in the section Creating a Microsoft Azure resource group and storage account:
- Click Azure role assignments.
- Click Add role assignment.
- In the Scope field, select Storage.
- In the Resource field, select one of your two storage accounts. Our examples used `filtereddata` and `billingexportdata`.
- In Role, select Storage Blob Data Contributor.
- Click Save.
- Click Add role assignment again.
- In the Scope field, select Storage.
- In the Resource field, select the same storage account again.
- This time, in Role, select Storage Queue Data Contributor.
- Click Save.
- Repeat this entire process for the other storage account that you created.
After completing these steps, you have successfully set up your Azure integration.