Chapter 2. Filtering your Microsoft Azure data before integrating it into hybrid committed spend


To share a subset of your billing data with Red Hat, you can configure a function script in Microsoft Azure. This script copies your cost exports to an object storage bucket that hybrid committed spend can then access and filter.

To integrate your Microsoft Azure account:

  1. Create a storage account and resource group.
  2. Configure Storage Account Contributor and Reader roles for access.
  3. Create a function to filter the data you want to send to Red Hat.
  4. Schedule daily cost exports to a storage account accessible to Red Hat.

If you are using RHEL metering, after you integrate your data with cost management, go to Adding RHEL metering to a Microsoft Azure integration to finish configuring your integration for RHEL metering.

Note

Because third-party products and documentation can change, the instructions provided for configuring third-party integrations are general and correct at the time of publishing. For the most up-to-date information, see Microsoft Azure's documentation.

Add your Microsoft Azure integration to hybrid committed spend from the Integrations page.

2.1. Adding a Microsoft Azure account

Add your Microsoft Azure account as an integration so that hybrid committed spend can process the cost and usage data.

Prerequisites

  • You must have a Red Hat user account with Cloud Administrator entitlements.
  • You must have a service account.
  • Your service account must have the correct roles assigned in Hybrid Cloud Console to enable cost management access. For more information, see the User Access Configuration Guide.

In cost management:

  1. Click the Settings icon > Integrations.
  2. In the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Microsoft Azure and click Next.
  4. Enter a name for your integration and click Next.
  5. In the Select application step, select Hybrid committed spend and click Next.
  6. In the Specify cost export scope step, select I wish to manually customize the data set sent to Cost Management.

    • If you are registering RHEL usage billing, select Include RHEL usage. Otherwise, proceed to the next step.
  7. Click Next.

2.2. Creating a Microsoft Azure resource group and storage account

Create a storage account in Microsoft Azure to house your billing exports and a second storage account to house your filtered data.

In your Microsoft Azure account:

  1. In the search bar, enter "storage" and click Storage accounts.
  2. On the Storage accounts page, click Create.
  3. In the Resource Group field, click Create new. Enter a name and click OK. In this example, use filtered-data-group.
  4. In the Instance details section, enter a name in the Storage account name field. For example, use filtereddata.
  5. Copy the names of the resource group and storage account so you can add them to the Add a cloud integration wizard in Red Hat Hybrid Cloud Console and click Review.
  6. Review the storage account and click Create.

In cost management:

  1. In the Add a cloud integration wizard, paste the resource group and storage account names that you copied into Resource group name and Storage account name.

You will continue using the wizard in the following sections.

2.3. Finding your Microsoft Azure subscription ID

Find your subscription_id in the Microsoft Azure Cloud Shell and add it to the Add a cloud integration wizard in hybrid committed spend.

In your Microsoft Azure account:

  1. Click Cloud Shell.
  2. Enter the following command to get your Subscription ID:

    az account show --query "{subscription_id: id }"
  3. Copy the value that is generated for subscription_id.

    Example response

    {
        "subscription_id": "00000000-0000-0000-000000000000"
    }

In cost management:

  1. In the Subscription ID field of the Add a cloud integration wizard, paste the value that you copied in the previous step.
  2. Click Next.

You will continue using the wizard in the following sections.
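If you script this step instead of copying the value by hand, the JSON that the az command returns can be parsed directly. A minimal sketch, using the placeholder subscription ID from the example response above:

```python
import json

# Hypothetical output of:
#   az account show --query "{subscription_id: id }"
response = '{"subscription_id": "00000000-0000-0000-000000000000"}'

# Parse the JSON and extract the value to paste into the wizard.
subscription_id = json.loads(response)["subscription_id"]
print(subscription_id)
```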

2.4. Creating Microsoft Azure roles for Red Hat access

To grant Red Hat access to your data, you must configure dedicated roles in Microsoft Azure. If you have an additional resource under the same Azure subscription, you might not need to create a new service account.

In cost management:

  1. In the Roles section of the Add a cloud integration wizard, copy the az ad sp create-for-rbac command to create a service principal with the Cost Management Storage Account Contributor role.

In your Microsoft Azure account:

  1. Click Cloud Shell.
  2. In the cloud shell prompt, paste the command that you copied.
  3. Copy the values that are generated for the client ID, secret, and tenant:

    Example response

    {
        "client_id": "00000000-0000-0000-000000000000",
        "secret": "00000000-0000-0000-000000000000",
        "tenant": "00000000-0000-0000-000000000000"
    }
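If you prefer to hand these values to a script rather than paste them manually, the response can be parsed and exposed as environment variables, similar to how the example function script reads its credentials. A sketch with placeholder values; the variable names here are assumptions, not the script's actual names:

```python
import json
import os

# Hypothetical response from the service principal command (placeholder values).
response = """{
    "client_id": "00000000-0000-0000-000000000000",
    "secret": "00000000-0000-0000-000000000000",
    "tenant": "00000000-0000-0000-000000000000"
}"""

creds = json.loads(response)

# Expose the credentials as environment variables. In production, store
# them in Azure Key Vault instead, as described in the function section.
os.environ["AZURE_CLIENT_ID"] = creds["client_id"]
os.environ["AZURE_CLIENT_SECRET"] = creds["secret"]
os.environ["AZURE_TENANT_ID"] = creds["tenant"]
```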

In cost management:

  1. Return to the Add a cloud integration wizard and paste the values that you copied into their corresponding fields on the Roles page.
  2. Click Next.
  3. Review your information and click Add to complete your integration.
  4. In the pop-up screen that appears, copy the Source UUID for your function script.

2.5. Creating a daily export in Microsoft Azure

Next, set up an automatic export of your cost data to your Microsoft Azure storage account before you filter it for cost management.

In your Microsoft Azure account:

  1. In the search bar, enter "cost exports" and click the result.
  2. Click Create.
  3. In Select a template, click Cost and usage (actual) to export your standard usage and purchase charges.
  4. Follow the steps in the Azure wizard:

    • You can either create a new resource group and storage account or select existing ones. In this example, we use billingexportdata for the storage account and billinggroup for the resource group.
    • You must set Format to CSV.
    • Set Compression type to None or Gzip.
  5. Review the information and click Create.

In cost management:

  1. Return to the Add a cloud integration wizard and complete the steps in Daily export.
  2. Click Next.

You will continue using the wizard in the following sections.

2.6. Creating a function in Microsoft Azure

Create a function in Microsoft Azure to filter your data and add it to the storage account that you created to share with Red Hat. You can use the example Python script in this section to gather and share the filtered cost data from your export.

Prerequisites

  • You must have Visual Studio Code installed on your device.
  • You must have the Microsoft Azure functions extension installed in Visual Studio Code. To create an Azure function, Microsoft recommends that you use Visual Studio Code to develop and deploy code. For more information about configuring Visual Studio Code, see Quickstart: Create a function in Azure with Python using Visual Studio Code.

In your Microsoft Azure account:

  1. Enter functions in the search bar and select Function App.
  2. Click Create.
  3. Select a hosting option for your function and click Select.
  4. On the Create Function App page, add your resource group.

    1. In the Instance Details section, name your function app.
    2. In Runtime stack, select Python.
    3. In Version, select latest.
  5. Click Review + create:

    1. Click Create.
    2. Wait for the resource to be created and then click Go to resource to view.

In Visual Studio Code:

  1. Click the Microsoft Azure tab and sign in to Azure.

    1. In the Workspaces drop-down, click Azure Functions, which appears as an icon with an orange lightning bolt.
    2. Click Create Function.
  2. Follow the prompts to set a local location and select a language and version for your function. In this example, select Python, Model 2 and the latest version available.
  3. In the Select a template for your function dialog, select Timer trigger, name the function, and then press Enter.
  4. Set the cron expression to control when the function runs. In this example, use 0 9 * * * to run the function daily at 9 AM:

    1. Click Create.
    2. Click Open in the current window.

In your requirements.txt file:

  1. After you create the function in your development environment, open the requirements.txt file, add the following requirements, and save the file:
azure-functions
pandas
requests
azure-identity
azure-storage-blob

In init.py:

  1. Copy the Python script and paste it into `init.py`.
  2. Change the values in the section marked # Required vars to update to the values that correspond to your environment.

    • The example script uses secrets from Azure Key Vaults to configure your service account client_id and client_secret as environment variables. You can alternatively enter your credentials directly into the script, although this is not best practice.
    • The default script has built-in options for general data filtering and for RHEL subscription filtering. Uncomment the type of filtering you want to use, or write your own custom filtering. Remove the comment from one of the following lines, not both:

      • filtered_data = hcs_filtering(df)
      • filtered_data = rhel_filtering(df)
    • If you want to write more customized filtering, you must include the following required columns:

      'additionalinfo', 'billingaccountid', 'billingaccountname', 'billingcurrencycode', 'billingperiodenddate', 'billingperiodstartdate', 'chargetype', 'consumedservice', 'costinbillingcurrency', 'date', 'effectiveprice', 'metercategory', 'meterid', 'metername', 'meterregion', 'metersubcategory', 'offerid', 'productname', 'publishername', 'publishertype', 'quantity', 'reservationid', 'reservationname', 'resourcegroup', 'resourceid', 'resourcelocation', 'resourcename', 'servicefamily', 'serviceinfo1', 'serviceinfo2', 'subscriptionid', 'tags', 'unitofmeasure', 'unitprice'
    • Some column names differ depending on the report type. The example script normalizes these columns as follows, and all filtered reports must follow this naming:

      column_translation = {
          "billingcurrency": "billingcurrencycode",
          "currency": "billingcurrencycode",
          "instanceid": "resourceid",
          "instancename": "resourceid",
          "pretaxcost": "costinbillingcurrency",
          "product": "productname",
          "resourcegroupname": "resourcegroup",
          "subscriptionguid": "subscriptionid",
          "servicename": "metercategory",
          "usage_quantity": "quantity",
      }
    • To filter the data, you must add dataframe filtering. For example:

      • Exact matching: df.loc[(df["publishertype"] == "Marketplace")] keeps only rows with a publishertype of Marketplace and filters out all other data.
      • Contains: df.loc[df["publishername"].astype(str).str.contains("Red Hat")] keeps only rows that contain Red Hat in the publishername.
      • You can stack filters by using & (for AND) and | (for OR) with your df.loc clause.
      • More useful filters:

        subscriptionid
        Filters specific subscriptions.
        resourcegroup
        Filters specific resource groups.
        resourcelocation
        Filters data in a specific region.
      • You can use servicename, servicetier, metercategory and metersubcategory to filter specific service types.
  3. After you build your custom query, update the custom query in the example script under # custom filtering basic example #.
  4. Save the file.
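The normalization and filtering steps above can be sketched with a small, self-contained pandas example. This is a minimal illustration under stated assumptions, not the full example script: the sample rows are invented, and only a subset of the required columns is shown.

```python
import pandas as pd

# Invented sample rows using a subset of the required columns.
df = pd.DataFrame(
    {
        "subscriptionguid": ["abc", "abc", "abc"],  # old name; varies by report type
        "publishertype": ["Marketplace", "Azure", "Marketplace"],
        "publishername": ["Red Hat", "Microsoft", "Canonical"],
        "costinbillingcurrency": [10.0, 5.0, 7.5],
    }
)

# Normalize column names that differ between report types,
# as in the example script's column_translation mapping.
column_translation = {"subscriptionguid": "subscriptionid"}
df = df.rename(columns=column_translation)

# Stack filters with & (AND): keep only Marketplace rows whose
# publishername contains "Red Hat".
filtered_data = df.loc[
    (df["publishertype"] == "Marketplace")
    & df["publishername"].astype(str).str.contains("Red Hat")
]

print(filtered_data)
```

The same pattern extends to the other useful filters: swap in subscriptionid, resourcegroup, or resourcelocation comparisons, or combine them with | (OR) in the same df.loc clause.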

In Visual Studio Code:

  1. Right-click the function window and click Deploy to Function App.
  2. Select the function app that you created in the previous steps.

2.7. Configuring function roles in Microsoft Azure

Configure dedicated credentials to grant your function blob access to Microsoft Azure cost data. These credentials enable your function to access, filter, and transfer the data from the original storage container to the filtered storage container.

In your Microsoft Azure account:

  1. Enter functions in the search bar and select your function.
  2. In the Settings menu, click Identity.

Complete the following steps twice, once for each of the two storage accounts that you created in the section Creating a Microsoft Azure resource group and storage account:

  1. Click Azure role assignments.
  2. Click Add role assignment.
  3. In the Scope field, select Storage.
  4. In the Resource field, select one of your two storage accounts. Our examples used filtereddata and billingexportdata.
  5. In Role, select Storage Blob Data Contributor.
  6. Click Save.
  7. Click Add role assignment again.
  8. In the Scope field, select Storage.
  9. In the Resource field, select the same storage account again.
  10. This time, in Role, select Storage Queue Data Contributor.
  11. Click Save.
  12. Repeat this entire process for the other storage account that you created.

After completing these steps, you have successfully set up your Azure integration.
