Chapter 6. Deploying the Ansible Lightspeed intelligent assistant on OpenShift Container Platform


As a system administrator, you can deploy the Ansible Lightspeed intelligent assistant on Ansible Automation Platform 2.5 on OpenShift Container Platform.

6.1. Overview

The Ansible Lightspeed intelligent assistant is available on Ansible Automation Platform 2.5 on OpenShift Container Platform as a Technology Preview release. It is an intuitive chat interface embedded within the Ansible Automation Platform, using generative artificial intelligence (AI) to answer questions about the Ansible Automation Platform.

The Ansible Lightspeed intelligent assistant interacts with users through natural language prompts in English, and uses Large Language Models (LLMs) to generate quick, accurate, and personalized responses. These responses empower Ansible users to work more efficiently, thereby improving productivity and the overall quality of their work.

Ansible Lightspeed intelligent assistant requires the following configurations:

  • Installation of Ansible Automation Platform 2.5 on Red Hat OpenShift Container Platform
  • Deployment of an LLM served by either a Red Hat AI platform or a third-party AI platform. For the list of LLM providers that you can use, see LLM providers.
Important
  • Red Hat does not collect any telemetry data from your interactions with the Ansible Lightspeed intelligent assistant.
  • Ansible Lightspeed intelligent assistant is available as a Technology Preview feature only.

    Technology Preview features are not supported with Red Hat production service level agreements (SLAs) and might not be functionally complete. Red Hat does not recommend using them in production. These features provide early access to upcoming product features, enabling customers to test functionality and provide feedback during the development process.

    For more information about the support scope of Red Hat Technology Preview features, see Technology Preview Features Support Scope.

6.2. Ansible Automation Platform 2.5 requirements

  • You have installed Ansible Automation Platform 2.5 on your OpenShift Container Platform environment.
  • You have administrator privileges for the Ansible Automation Platform.
  • You have provisioned an OpenShift cluster with Operator Lifecycle Management installed.

6.3. Large Language Model (LLM) provider requirements

You must configure the LLM provider that you want to use before deploying the Ansible Lightspeed intelligent assistant.

An LLM is a type of machine learning model that can interpret and generate human-like language. When an LLM is used with the Ansible Lightspeed intelligent assistant, the LLM can interpret questions accurately and provide helpful answers in a conversational manner.

As part of the Technology Preview release, the Ansible Lightspeed intelligent assistant can use the following LLM providers:

Red Hat LLM providers

  • Red Hat Enterprise Linux AI

    Red Hat Enterprise Linux AI is OpenAI API-compatible and is configured in a similar manner to the OpenAI provider. You can configure Red Hat Enterprise Linux AI as the LLM provider. For more information, see the Red Hat Enterprise Linux AI product page.

  • Red Hat OpenShift AI

    Red Hat OpenShift AI is OpenAI API-compatible and is configured in a similar manner to the OpenAI provider. You can configure Red Hat OpenShift AI as the LLM provider. For more information, see the Red Hat OpenShift AI product page.

Note

For configurations with Red Hat Enterprise Linux AI or Red Hat OpenShift AI, you must host your own LLM provider instead of using a SaaS LLM provider.

Third-party LLM providers

  • IBM watsonx.ai

    To use IBM watsonx with the Ansible Lightspeed intelligent assistant, you need an account with IBM watsonx.ai.

  • OpenAI

    To use OpenAI with the Ansible Lightspeed intelligent assistant, you need access to the OpenAI API platform.

  • Microsoft Azure OpenAI

    To use Microsoft Azure with the Ansible Lightspeed intelligent assistant, you need access to Microsoft Azure OpenAI.

    Note

    Many self-hosted or self-managed model servers claim API compatibility with OpenAI. It is possible to configure the Ansible Lightspeed intelligent assistant OpenAI provider to point to an API-compatible model server. If the model server is truly API-compatible, especially with respect to authentication, then it might work. These configurations have not been tested by Red Hat, and issues related to their use are outside the scope of Technology Preview support.

6.4. Process for configuring and using the Ansible Lightspeed intelligent assistant

Perform the following tasks to set up and use the Ansible Lightspeed intelligent assistant in your Ansible Automation Platform instance on the OpenShift Container Platform environment:

Task: Deploy the Ansible Lightspeed intelligent assistant on OpenShift Container Platform

Description: An Ansible Automation Platform administrator who wants to deploy the Ansible Lightspeed intelligent assistant for all Ansible users in the organization performs the following tasks:

  • Create a chatbot configuration secret
  • Update the YAML file of the Ansible Automation Platform operator to use the secret

Task: Access and use the Ansible Lightspeed intelligent assistant

Description: All Ansible users who want to use the intelligent assistant to get answers to their questions about the Ansible Automation Platform. For more details, see Using the Ansible Lightspeed intelligent assistant.

6.5. Deploying the Ansible Lightspeed intelligent assistant

This section provides information about the procedures involved in deploying the Ansible Lightspeed intelligent assistant on OpenShift Container Platform.

6.5.1. Creating a chatbot configuration secret

Create a configuration secret for the Ansible Lightspeed intelligent assistant, so that you can connect the intelligent assistant to the Ansible Automation Platform operator.

Procedure

  1. Log in to Red Hat OpenShift Container Platform as an administrator.
  2. Navigate to Workloads → Secrets.
  3. From the Projects list, select the namespace that you created when you installed the Ansible Automation Platform operator.
  4. Click Create → Key/value secret.
  5. In the Secret name field, enter a unique name for the secret. For example, chatbot-configuration-secret.
  6. Add the following keys and their associated values individually:

    Settings for all LLM setups

    • chatbot_model: Enter the LLM model name that is configured on your LLM setup.

    • chatbot_url: Enter the inference API base URL on your LLM setup. For example, https://your_inference_api/v1.

    • chatbot_token: Enter the API token or the API key. This token is sent along with the authorization header when an inference API is called.

    • chatbot_llm_provider_type (Optional): Enter the provider type of your LLM setup by using one of the following values:

      • Red Hat Enterprise Linux AI: rhelai_vllm
      • Red Hat OpenShift AI: rhoai_vllm (default value)
      • IBM watsonx.ai: watsonx
      • OpenAI: openai
      • Microsoft Azure OpenAI: azure_openai

    • chatbot_context_window_size (Optional): Enter a value to configure the context window length for your LLM setup. Default: 128000.

    • chatbot_temperature_override (Optional): A lower temperature generates predictable results, while a higher temperature allows more diverse or creative responses. Enter one of the following values:

      • 0: Least creativity and randomness in the responses.
      • 1: Maximum creativity and randomness in the responses.
      • null: Override or disable the default temperature setting.

      Note: A few OpenAI o-series models (o1, o3-mini, and o4-mini) do not support the temperature settings. Therefore, you must set the value to null to use these OpenAI models.

    Additional setting for IBM watsonx.ai only

    • chatbot_llm_provider_project_id: Enter the project ID of your IBM watsonx setup.

    Additional settings for Microsoft Azure OpenAI only

    • chatbot_azure_deployment_name: Enter the deployment name of your Microsoft Azure OpenAI setup.

    • chatbot_azure_api_version (Optional): Enter the API version of your Microsoft Azure OpenAI setup.

  7. Click Create. The chatbot configuration secret is created.
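
As an alternative to the web console, you can create the same secret from the command line. The following is a minimal sketch, assuming a namespace named aap, the example secret name chatbot-configuration-secret, and placeholder values for the model, URL, and token; replace every value with the details of your own LLM setup.

    # Hypothetical example: the namespace and all values are placeholders for your own LLM setup.
    oc create secret generic chatbot-configuration-secret -n aap \
      --from-literal=chatbot_model=<model_name> \
      --from-literal=chatbot_url=https://your_inference_api/v1 \
      --from-literal=chatbot_token=<api_token_or_key> \
      --from-literal=chatbot_llm_provider_type=rhoai_vllm

Add further --from-literal pairs for any of the optional keys listed in the previous step, such as chatbot_context_window_size or the provider-specific keys.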

6.5.2. Updating the YAML file of the Ansible Automation Platform operator

After you create the chatbot configuration secret, you must update the YAML file of the Ansible Automation Platform operator to use the secret.

Procedure

  1. Log in to Red Hat OpenShift Container Platform as an administrator.
  2. Navigate to Operators → Installed Operators.
  3. From the list of installed operators, select the Ansible Automation Platform operator.
  4. Locate and select the Ansible Automation Platform custom resource, and then click the required app.
  5. Select the YAML tab.
  6. Scroll to the spec: section, and add the following details under it (a command-line alternative is sketched after this procedure):

    spec:
      lightspeed:
        disabled: false
        chatbot_config_secret_name: <name of your chatbot configuration secret>
  7. Click Save. The Ansible Lightspeed intelligent assistant service takes a few minutes to set up.
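
If you prefer to make this change from the command line, the following is a minimal sketch, assuming the custom resource type is ansibleautomationplatform, the instance is named myaap, the namespace is aap, and the secret is named chatbot-configuration-secret; adjust all of these to match your environment.

    # Hypothetical example: resource type, instance name, namespace, and secret name are assumptions.
    oc patch ansibleautomationplatform myaap -n aap --type merge \
      -p '{"spec": {"lightspeed": {"disabled": false, "chatbot_config_secret_name": "chatbot-configuration-secret"}}}'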

Verification

  1. Verify that the chat interface service is running successfully:

    1. Navigate to Workloads → Pods.
    2. Filter with the term api and ensure that the following pods are displayed in the Running status (for a command-line alternative, see the sketch after this verification procedure):

      • myaap-lightspeed-api-<version number>
      • myaap-lightspeed-chatbot-api-<version number>
  2. Verify that the chat interface is displayed on the Ansible Automation Platform:

    1. Access the Ansible Automation Platform:

      1. Navigate to Operators → Installed Operators.
      2. From the list of installed operators, click Ansible Automation Platform.
      3. Locate and select the Ansible Automation Platform custom resource, and then click the app that you created.
      4. From the Details tab, record the information available in the following fields:

        • URL: This is the URL of your Ansible Automation Platform instance.
        • Gateway Admin User: This is the username to log into your Ansible Automation Platform instance.
        • Gateway Admin password: This is the password to log into your Ansible Automation Platform instance.
      5. Log in to the Ansible Automation Platform using the URL, username, and password that you recorded earlier.
    2. Access the Ansible Lightspeed intelligent assistant:

      1. Click the Ansible Lightspeed intelligent assistant icon that is displayed at the top right corner of the taskbar.
      2. Verify that the chat interface is displayed, as shown in the following image:

        Ansible Lightspeed intelligent assistant
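
You can also check the pod status from the command line instead of the web console. A quick sketch, assuming the namespace aap and an instance named myaap as in the earlier examples:

    # List the Lightspeed-related pods and confirm that they are Running
    oc get pods -n aap | grep lightspeed

Both the myaap-lightspeed-api and myaap-lightspeed-chatbot-api pods should report the Running status.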

6.6. Using the Ansible Lightspeed intelligent assistant

After you deploy the Ansible Lightspeed intelligent assistant, all Ansible users within the organization can access and use the chat interface to ask questions and receive information about the Ansible Automation Platform.

6.6.1. Accessing the Ansible Lightspeed intelligent assistant

  1. Log in to the Ansible Automation Platform.
  2. Click the Ansible Lightspeed intelligent assistant icon that is displayed at the top right corner of the taskbar.

    The Ansible Lightspeed intelligent assistant window opens with a welcome message, as shown in the following image:

    Ansible Lightspeed intelligent assistant

6.6.2. Using the Ansible Lightspeed intelligent assistant

You can perform the following tasks:

  • Ask questions in the prompt field and get answers about the Ansible Automation Platform
  • View the chat history of all conversations in a chat session
  • Search the chat history using a user prompt or answer

    The chat history is deleted when you close an existing chat session or log out from the Ansible Automation Platform.

  • Restore a previous chat by clicking the relevant entry from the chat history
  • Provide feedback on the quality of the chat answers by clicking the Thumbs up or Thumbs down icon
  • Copy and record the answers by clicking the Copy icon
  • Change the virtual assistant to dark or light mode by clicking the Sun icon at the top right corner of the toolbar
  • Clear the context of an existing chat by using the New chat button in the chat history
  • Close the chat interface while working on the Ansible Automation Platform