
Chapter 2. Filtering your Microsoft Azure data before integrating it into cost management


To share a subset of your billing data with Red Hat, you can configure a function script in Microsoft Azure. This script copies your filtered cost exports to an object storage bucket that cost management can then access.

To integrate your Microsoft Azure account:

  1. Create a storage account and resource group.
  2. Configure Storage Account Contributor and Reader roles for access.
  3. Create a function to filter the data you want to send to Red Hat.
  4. Schedule daily cost exports to a storage account accessible to Red Hat.
Note

Because third-party products and documentation can change, instructions for configuring the third-party integrations provided are general and correct at the time of publishing. For the most up-to-date information, see the Microsoft Azure documentation.

Add your Microsoft Azure integration to cost management from the Integrations page.

2.1. Adding a Microsoft Azure account and naming your integration

Add your Microsoft Azure account as an integration so cost management can process the cost and usage data.

Procedure

  1. From Red Hat Hybrid Cloud Console, click Settings > Integrations.
  2. On the Settings page, in the Cloud tab, click Add integration.
  3. In the Add a cloud integration wizard, select Microsoft Azure as the cloud provider type and click Next.
  4. Enter a name for your integration and click Next.
  5. In the Select application step, select Cost management and click Next.
  6. In the Specify cost export scope step, select I wish to manually customize the data set sent to Cost Management and click Next.

2.2. Creating a Microsoft Azure resource group and storage account

Create a storage account in Microsoft Azure to house your billing exports and a second storage account to house your filtered data so that cost management can collect the information. In the Add a cloud integration wizard in cost management, enter the resource group name and storage account name in the fields in the Resource group and storage account page.

Prerequisites

  • You must have a Red Hat user account with Cloud Administrator entitlements.

Procedure

  1. In your Microsoft Azure account, search for storage and click Storage accounts.

    1. On the Storage accounts page, click Create.
    2. In the Resource Group field, click Create new. Enter a name, and click OK. In this example, use filtered-data-group.
    3. In the Instance details section, enter a name in the Storage account name field. For example, use filtereddata.
    4. Copy the names of the resource group and storage account so that you can add them to the Add a cloud integration wizard in Red Hat Hybrid Cloud Console, and then click Review.
    5. Review the storage account and click Create.
  2. In the Red Hat Hybrid Cloud Console Add a cloud integration wizard, on the Resource group and storage account page, enter values in the Resource group name and Storage account name fields.
  3. Click Next.
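
If you prefer the command line, you can create an equivalent resource group and storage account from the Microsoft Azure Cloud Shell. The following is a minimal sketch that assumes the example names filtered-data-group and filtereddata and the eastus region; replace them with your own values.

    # Create the resource group that holds the filtered data storage account
    az group create --name filtered-data-group --location eastus

    # Create the storage account for the filtered data
    az storage account create --name filtereddata --resource-group filtered-data-group --location eastus --sku Standard_LRS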

2.3. Finding your Microsoft Azure subscription ID

Find your subscription_id in the Microsoft Azure Cloud Shell and add it to the Add a cloud integration wizard in cost management.

Procedure

  1. In your Microsoft Azure account, click Cloud Shell.
  2. Enter the following command to obtain your Subscription ID:

    az account show --query "{subscription_id: id }"
  3. Copy the value for the subscription_id from the returned data.

    Example response

    {
        "subscription_id": "00000000-0000-0000-000000000000"
    }

  4. Paste that value in the Subscription ID field on the Subscription ID page in the Add a cloud integration wizard.
  5. Click Next.
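
Optionally, to print only the subscription ID value so that it is easier to copy, you can pass a different query and output format to the same command:

    az account show --query id --output tsv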

2.4. Creating Microsoft Azure roles for your storage account

Use the Microsoft Azure Cloud Shell to find your Tenant (Directory) ID, Client (Application) ID, and Client secret.

Procedure

  1. In your Microsoft Azure account, click Cloud Shell.
  2. Enter the following command to get your tenant name, client ID, and client secret. Replace {subscriptionId} with your subscription ID from the previous step and {resourceGroup1} with the name of the resource group that you created earlier. In this example, use filtered-data-group.

    az ad sp create-for-rbac -n "CostManagement" --role "Storage Account Contributor"  --scope /subscriptions/{subscriptionId}/resourceGroups/{resourceGroup1} --query '{"tenant": tenant, "client_id": appId, "secret": password}'
  3. Copy the values from the returned data for the client_id, secret, and tenant.

    Example response

    {
        "client_id": "00000000-0000-0000-000000000000",
        "secret": "00000000-0000-0000-000000000000",
        "tenant": "00000000-0000-0000-000000000000"
    }

  4. Paste the values of client_id, secret, and tenant in the Roles step in the Add a cloud integration wizard in Red Hat Hybrid Cloud Console.
  5. Run the following command in the Cloud Shell to assign the Cost Management Reader role. Replace {Client_ID} with the client_id value from the previous step.

    az role assignment create --assignee {Client_ID} --role "Cost Management Reader"
  6. Click Next.
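
Optionally, before you continue, you can confirm that the service principal has both role assignments. This sketch assumes that {Client_ID} is the same value you used in the previous steps:

    az role assignment list --assignee {Client_ID} --output table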

2.5. Creating a daily export in Microsoft Azure

Create a function in Microsoft Azure to filter your data and export it on a regular schedule. Exports create a recurring task that sends your Microsoft Azure cost data regularly to a storage account, which exists within a resource group. Cost management must be able to access the resource group to read the Microsoft Azure cost data. This example uses a Python function to filter the data and post it to the storage account you created earlier.

Procedure

  1. To create the export, go to the Portal menu in Microsoft Azure and click Cost Management + Billing.
  2. On the Cost Management + Billing page, click Cost Management.
  3. In the Settings menu, on the Cost Management overview page, click Exports.
  4. To add an export, click Add.
  5. In the Export details section, name the export.
  6. In the Storage section, add the resource group you created.
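
If you prefer to script the export instead of using the portal, the Azure CLI costmanagement extension provides an export create command. The following is only a sketch with placeholder values for the subscription, resource group, storage account, and container; the available flags can vary between extension versions, so check az costmanagement export create --help before you run it.

    az costmanagement export create --name DailyCostExport \
        --type ActualCost \
        --scope "subscriptions/{subscriptionId}" \
        --storage-account-id "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroup}/providers/Microsoft.Storage/storageAccounts/{storageAccount}" \
        --storage-container {container} \
        --timeframe MonthToDate \
        --recurrence Daily \
        --recurrence-period from="2024-01-01T00:00:00Z" to="2024-12-31T00:00:00Z" \
        --schedule-status Active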

2.6. Creating a function in Microsoft Azure to filter your data

Create the function that filters your data and adds it to the storage account that you created to share with Red Hat. You can use the example Python script to gather the cost data from your cost exports related to your Red Hat expenses and add it to the storage account.

Prerequisites

  • You must have Visual Studio Code installed on your device.
  • You must have the Microsoft Azure functions extension installed in Visual Studio Code.

Procedure

  1. Log in to your Microsoft Azure account.
  2. Enter functions in the search bar, select Functions, and click Create.
  3. Select a hosting option for your function and click Select.
  4. On the Create Function App page, configure your function app by adding your resource group.

    1. In the Instance Details section, name your function app.
    2. In Runtime stack, select Python.
    3. In Version, select 3.10.
  5. Click Review + create:

    1. Click Create.
    2. Click Go to resource to configure the function.
  6. In the function app menu, click Functions to create a time trigger function:

    1. Click Create.
    2. In the development environment field, select VSCode.
  7. Open Visual Studio Code and ensure that the Microsoft Azure Functions Visual Studio Code extension is installed. To create an Azure function, Microsoft recommends that you use the Visual Studio Code IDE to develop and deploy code. For more information about configuring Visual Studio Code, see Quickstart: Create a function in Azure with Python using Visual Studio Code.

    1. Click the Microsoft Azure tab in Visual Studio Code and sign in to Azure.
    2. In the workspaces tab in Visual Studio Code, click Create function.
    3. Follow the prompts to set a local location for your function and to select a language and version. In this example, select Python and Python 3.9.
    4. In the Select a template for your project’s first function dialog, select Timer trigger, name the function, and press Enter.
    5. Set the cron expression for when you want the function to run. In this example, use 0 0 9 * * * to run the function daily at 9 AM.
    6. Click Create.
  8. After you create the function in your development environment, open the requirements.txt file, add the following requirements, and save the file:

    azure-functions
    pandas
    requests
    azure-identity
    azure-storage-blob
  9. Open __init__.py and paste the following Python script. Change the values in the section marked # Required vars to update to match your environment. For the USER and PASS values, you can optionally use Key Vault credentials to configure your username and password as environment variables.

    import datetime
    import logging
    import os
    import uuid
    import requests
    import pandas as pd
    from azure.identity import DefaultAzureCredential
    from azure.storage.blob import BlobServiceClient, ContainerClient
    
    import azure.functions as func
    
    
    def main(mytimer: func.TimerRequest) -> None:
        utc_timestamp = datetime.datetime.utcnow().replace(
            tzinfo=datetime.timezone.utc).isoformat()
    
        default_credential = DefaultAzureCredential()
    
        now = datetime.datetime.now()
        year = now.strftime("%Y")
        month = now.strftime("%m")
        day = now.strftime("%d")
        output_blob_name=f"{year}/{month}/{day}/{uuid.uuid4()}.csv"
    
        # Required vars to update
        USER = os.getenv('UsernameFromVault')                                   # Cost management username
        PASS = os.getenv('PasswordFromVault')                                   # Cost management password
        integration_id = "<your_integration_id>"                                          # Cost management integration_id
        cost_export_store = "https://<your-cost-export-storage-account>.blob.core.windows.net"           # Cost export storage account url
        cost_export_container = "<your-cost-export-container>"                                     # Cost export container
        filtered_data_store = "https://<your_filtered_data_container-storage-account>.blob.core.windows.net"         # Filtered data storage account url
        filtered_data_container = "<your_filtered_data_container>"                                   # Filtered data container
    
        # Create the BlobServiceClient object
        blob_service_client = BlobServiceClient(filtered_data_store, credential=default_credential)
        container_client = ContainerClient(cost_export_store, credential=default_credential, container_name=cost_export_container)
    
        blob_list = container_client.list_blobs()
        latest_blob = None
        for blob in blob_list:
            if latest_blob:
                if blob.last_modified > latest_blob.last_modified:
                    latest_blob = blob
            else:
                latest_blob = blob
    
        bc = container_client.get_blob_client(blob=latest_blob)
        data = bc.download_blob()
        blobjct = "/tmp/blob.csv"
        with open(blobjct, "wb") as f:
            data.readinto(f)
        df = pd.read_csv(blobjct)
    
        filtered_data = df.loc[
            (df["publisherType"] == "Marketplace")
            & (
                (df["publisherName"].astype(str).str.contains("Red Hat"))
                | (
                    ((df["publisherName"] == "Microsoft") | (df["publisherName"] == "Azure"))
                    & (
                        df["meterSubCategory"].astype(str).str.contains("Red Hat")
                        | df["serviceInfo2"].astype(str).str.contains("Red Hat")
                    )
                )
            )
        ]
    
        filtered_data_csv = filtered_data.to_csv(index_label="idx", encoding="utf-8")
    
        blob_client = blob_service_client.get_blob_client(container=filtered_data_container, blob=output_blob_name)
    
        blob_client.upload_blob(filtered_data_csv, overwrite=True)
    
        # Post results to console.redhat.com API
        url = "https://console.redhat.com/api/cost-management/v1/ingress/reports/"
        json_data = {"source": integration_id, "reports_list": [f"{filtered_data_container}/{output_blob_name}"], "bill_year": year, "bill_month": month}
        resp = requests.post(url, json=json_data, auth=(USER, PASS))
        logging.info(f'Post result: {resp}')
    
        if mytimer.past_due:
            logging.info('The timer is past due!')
    
        logging.info('Python timer trigger function ran at %s', utc_timestamp)
  10. Save the file.
  11. Deploy the function to Microsoft Azure.
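
You can deploy from the Azure tab in Visual Studio Code, or, if you have the Azure Functions Core Tools installed, from a terminal in the project directory. The function app name below is a placeholder for the name that you chose when you created the function app:

    func azure functionapp publish <your-function-app-name>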

2.7. Configuring Microsoft Azure roles

Configure dedicated credentials to grant your function blob access to Microsoft Azure cost data so it can transfer the data from the original storage container to the filtered storage container.

Procedure

  1. In your Microsoft Azure account, type functions in the search bar.
  2. Find your function and select it.
  3. In the Settings menu, click Identity.
  4. On the Identity page, click Azure role assignments.
  5. On the Role assignments page, click Add role assignment.
  6. In the Scope field, select the Storage scope.
  7. In the Resource field, select the storage account that you created. In this example, use filtereddata.
  8. In the Role field, select Storage Blob Data Contributor.
  9. Click Save.
  10. Repeat these steps to create a role for Storage Queue Data Contributor.
  11. Repeat this process for the other storage account that you created. In this example, use billingexportdata.
  12. In the Add a cloud integration wizard in Red Hat Hybrid Cloud Console, click Next.
  13. Review the information you provided in the wizard and click Add.
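
As an alternative to assigning the storage roles in the Azure portal, you can assign them to the function's system-assigned managed identity from the Cloud Shell. This is a sketch that assumes placeholder values for the function app name and its resource group, and uses the filtereddata storage account from this example; repeat the role assignments for the other storage account.

    # Get the principal ID of the function app's system-assigned managed identity
    principalId=$(az functionapp identity show --name <your-function-app-name> --resource-group <your-resource-group> --query principalId --output tsv)

    # Get the resource ID of the storage account to use as the assignment scope
    storageId=$(az storage account show --name filtereddata --query id --output tsv)

    az role assignment create --assignee $principalId --role "Storage Blob Data Contributor" --scope $storageId
    az role assignment create --assignee $principalId --role "Storage Queue Data Contributor" --scope $storageId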