Chapter 6. Deploying the Ansible Lightspeed intelligent assistant on OpenShift Container Platform
As a system administrator, you can deploy Ansible Lightspeed intelligent assistant on Ansible Automation Platform 2.5 on OpenShift Container Platform.
6.1. Overview
The Ansible Lightspeed intelligent assistant is available on Ansible Automation Platform 2.5 on OpenShift Container Platform as a Technology Preview release. It is an intuitive chat interface embedded within the Ansible Automation Platform, using generative artificial intelligence (AI) to answer questions about the Ansible Automation Platform.
The Ansible Lightspeed intelligent assistant interacts with users through natural language prompts in English, and uses Large Language Models (LLMs) to generate quick, accurate, and personalized responses. These responses empower Ansible users to work more efficiently, thereby improving productivity and the overall quality of their work.
Ansible Lightspeed intelligent assistant requires the following configurations:
- Installation of Ansible Automation Platform 2.5 on Red Hat OpenShift Container Platform
- Deployment of an LLM served by either a Red Hat AI platform or a third-party AI platform. For the LLM providers that you can use, see Section 6.3, Large Language Model (LLM) provider requirements.
Note: Red Hat does not collect any telemetry data from your interactions with the Ansible Lightspeed intelligent assistant.
Ansible Lightspeed intelligent assistant is available as a Technology Preview feature only.
Technology Preview features are not supported with Red Hat production service level agreements (SLAs) and might not be functionally complete. Red Hat does not recommend using them in production. These features provide early access to upcoming product features, enabling customers to test functionality and provide feedback during the development process.
For more information about the support scope of Red Hat Technology Preview features, see Technology Preview Features Support Scope.
6.2. Ansible Automation Platform 2.5 requirements
- You have installed Ansible Automation Platform 2.5 on your OpenShift Container Platform environment.
- You have administrator privileges for the Ansible Automation Platform.
- You have provisioned an OpenShift cluster with Operator Lifecycle Management installed.
6.3. Large Language Model (LLM) provider requirements
You must configure the LLM provider that you want to use before deploying the Ansible Lightspeed intelligent assistant.
An LLM is a type of machine learning model that can interpret and generate human-like language. When an LLM is used with the Ansible Lightspeed intelligent assistant, the LLM can interpret questions accurately and provide helpful answers in a conversational manner.
As part of the Technology Preview release, Ansible Lightspeed intelligent assistant can rely on the following Software as a Service (SaaS) LLM providers:
Red Hat LLM providers
Red Hat Enterprise Linux AI
Red Hat Enterprise Linux AI is OpenAI API-compatible and is configured in a similar manner to the OpenAI provider. You can configure Red Hat Enterprise Linux AI as the LLM provider. For more information, see the Red Hat Enterprise Linux AI product page.
Red Hat OpenShift AI
Red Hat OpenShift AI is OpenAI API-compatible and is configured in a similar manner to the OpenAI provider. You can configure Red Hat OpenShift AI as the LLM provider. For more information, see the Red Hat OpenShift AI product page.
For configurations with Red Hat Enterprise Linux AI or Red Hat OpenShift AI, you must host your own LLM provider instead of using a SaaS LLM provider.
Third-party LLM providers
IBM watsonx.ai
To use IBM watsonx with the Ansible Lightspeed intelligent assistant, you need an account with IBM watsonx.ai.
OpenAI
To use OpenAI with the Ansible Lightspeed intelligent assistant, you need access to the OpenAI API platform.
Microsoft Azure OpenAI
To use Microsoft Azure with the Ansible Lightspeed intelligent assistant, you need access to Microsoft Azure OpenAI.
Note: Many self-hosted or self-managed model servers claim API compatibility with OpenAI. It is possible to configure the Ansible Lightspeed intelligent assistant OpenAI provider to point to an API-compatible model server. If the model server is truly API-compatible, especially with respect to authentication, then it might work. These configurations have not been tested by Red Hat, and issues related to their use are outside the scope of Technology Preview support.
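As an illustration only, and not a tested or supported configuration, the chatbot configuration keys described in Section 6.5.1 might be set as follows to point the OpenAI provider type at a self-hosted, API-compatible model server. The URL, model name, and token shown here are hypothetical placeholders.

    # Hypothetical example: pointing the OpenAI provider type at a self-hosted,
    # OpenAI API-compatible model server. All values below are placeholders.
    chatbot_llm_provider_type: openai
    chatbot_url: https://models.example.internal/v1    # inference API base URL of your server
    chatbot_model: my-served-model-name                # model name as exposed by your server
    chatbot_token: <api-token-for-your-model-server>   # sent in the authorization header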
6.4. Process for configuring and using the Ansible Lightspeed intelligent assistant
Perform the following tasks to set up and use the Ansible Lightspeed intelligent assistant in your Ansible Automation Platform instance on the OpenShift Container Platform environment:
Task | Description |
---|---|
Deploy the Ansible Lightspeed intelligent assistant on OpenShift Container Platform | Performed by an Ansible Automation Platform administrator who wants to deploy the Ansible Lightspeed intelligent assistant for all Ansible users in the organization. This involves creating a chatbot configuration secret and updating the YAML file of the Ansible Automation Platform operator, as described in Section 6.5. |
Access and use the Ansible Lightspeed intelligent assistant | Performed by all Ansible users who want to use the intelligent assistant to get answers to their questions about the Ansible Automation Platform. For more details, see Section 6.6, Using the Ansible Lightspeed intelligent assistant. |
6.5. Deploying the Ansible Lightspeed intelligent assistant
This section describes the procedures for deploying the Ansible Lightspeed intelligent assistant on OpenShift Container Platform.
6.5.1. Creating a chatbot configuration secret
Create a configuration secret for the Ansible Lightspeed intelligent assistant, so that you can connect the intelligent assistant to the Ansible Automation Platform operator.
Procedure
- Log in to Red Hat OpenShift Container Platform as an administrator.
- Navigate to Workloads → Secrets.
- From the Projects list, select the namespace that you created when you installed the Ansible Automation Platform operator.
- Click Create → Key/value secret.
- In the Secret name field, enter a unique name for the secret. For example, chatbot-configuration-secret.
- Add the following keys and their associated values individually:
Key | Value |
---|---|
Settings for all LLM setups | |
chatbot_model | Enter the LLM model name that is configured on your LLM setup. |
chatbot_url | Enter the inference API base URL of your LLM setup. For example, https://your_inference_api/v1. |
chatbot_token | Enter the API token or the API key. This token is sent along with the authorization header when an inference API is called. |
chatbot_llm_provider_type (Optional) | Enter the provider type of your LLM setup by using one of the following values: Red Hat Enterprise Linux AI: rhelai_vllm; Red Hat OpenShift AI: rhoai_vllm (default value); IBM watsonx.ai: watsonx; OpenAI: openai; Microsoft Azure OpenAI: azure_openai. |
chatbot_context_window_size (Optional) | Enter a value to configure the context window length for your LLM setup. Default: 128000. |
chatbot_temperature_override (Optional) | A lower temperature generates predictable results, while a higher temperature allows more diverse or creative responses. Enter one of the following values: 0 for the least creativity and randomness in the responses; 1 for the maximum creativity and randomness in the responses; null to override or disable the default temperature setting. Note: A few OpenAI o-series models (o1, o3-mini, and o4-mini) do not support the temperature settings, so you must set the value to null to use these OpenAI models. |
Additional setting for IBM watsonx.ai only | |
chatbot_llm_provider_project_id | Enter the project ID of your IBM watsonx setup. |
Additional settings for Microsoft Azure OpenAI only | |
chatbot_azure_deployment_name | Enter the deployment name of your Microsoft Azure OpenAI setup. |
chatbot_azure_api_version (Optional) | Enter the API version of your Microsoft Azure OpenAI setup. |
- Click Create. The chatbot configuration secret is successfully created.
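As an alternative to the console, you could create an equivalent secret from a manifest and apply it with oc apply -f. The following is a minimal sketch under that assumption; the keys come from the table above, while the namespace and all values are placeholders that you would replace with your own.

    # Minimal sketch of the chatbot configuration secret as a manifest.
    # The keys match the table above; the namespace and values are placeholders.
    apiVersion: v1
    kind: Secret
    metadata:
      name: chatbot-configuration-secret           # example name from the procedure above
      namespace: <your-aap-namespace>              # namespace of your Ansible Automation Platform installation
    type: Opaque
    stringData:
      chatbot_model: <llm-model-name>              # model name configured on your LLM setup
      chatbot_url: https://your_inference_api/v1   # inference API base URL
      chatbot_token: <api-token-or-key>            # sent with the authorization header of inference API calls
      chatbot_llm_provider_type: rhoai_vllm        # optional; rhoai_vllm is the default value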
6.5.2. Updating the YAML file of the Ansible Automation Platform operator
After you create the chatbot configuration secret, you must update the YAML file of the Ansible Automation Platform operator to use the secret.
Procedure
- Log in to Red Hat OpenShift Container Platform as an administrator.
- Navigate to Operators → Installed Operators.
- From the list of installed operators, select the Ansible Automation Platform operator.
- Locate and select the Ansible Automation Platform custom resource, and then click the required app.
- Select the YAML tab.
- Scroll the text to find the spec: section, and add the following details under the spec: section:

      spec:
        lightspeed:
          disabled: false
          chatbot_config_secret_name: <name of your chatbot configuration secret>

- Click Save. The Ansible Lightspeed intelligent assistant service takes a few minutes to set up.
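For orientation only, the following sketch shows where the lightspeed block sits within a complete Ansible Automation Platform custom resource. The apiVersion, kind, resource name, and namespace shown here are assumptions for illustration, not values taken from this procedure; only the lightspeed block matches the step above.

    # Illustrative sketch: apiVersion, kind, name, and namespace are assumptions.
    apiVersion: aap.ansible.com/v1alpha1
    kind: AnsibleAutomationPlatform
    metadata:
      name: myaap
      namespace: <your-aap-namespace>
    spec:
      lightspeed:
        disabled: false
        chatbot_config_secret_name: chatbot-configuration-secret   # example secret name from Section 6.5.1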
Verification
Verify that the chat interface service is running successfully:
- Navigate to Workloads → Pods.
- Filter with the term api and ensure that the following APIs are displayed in Running status:
  - myaap-lightspeed-api-<version number>
  - myaap-lightspeed-chatbot-api-<version number>

Verify that the chat interface is displayed on the Ansible Automation Platform:
Access the Ansible Automation Platform:
- Navigate to Operators → Installed Operators.
- From the list of installed operators, click Ansible Automation Platform.
- Locate and select the Ansible Automation Platform custom resource, and then click the app that you created.
- From the Details tab, record the information available in the following fields:
  - URL: This is the URL of your Ansible Automation Platform instance.
  - Gateway Admin User: This is the username to log in to your Ansible Automation Platform instance.
  - Gateway Admin password: This is the password to log in to your Ansible Automation Platform instance.
- Log in to the Ansible Automation Platform using the URL, username, and password that you recorded earlier.

Access the Ansible Lightspeed intelligent assistant:
- Click the Ansible Lightspeed intelligent assistant icon that is displayed at the top right corner of the taskbar.
- Verify that the chat interface is displayed.
6.6. Using the Ansible Lightspeed intelligent assistant
After you deploy the Ansible Lightspeed intelligent assistant, all Ansible users within the organization can access and use the chat interface to ask questions and receive information about the Ansible Automation Platform.
6.6.1. Accessing the Ansible Lightspeed intelligent assistant
- Log in to the Ansible Automation Platform.
- Click the Ansible Lightspeed intelligent assistant icon that is displayed at the top right corner of the taskbar.
The Ansible Lightspeed intelligent assistant window opens with a welcome message.
6.6.2. Using the Ansible Lightspeed intelligent assistant
You can perform the following tasks:
- Ask questions in the prompt field and get answers about the Ansible Automation Platform
- View the chat history of all conversations in a chat session
- Search the chat history using a user prompt or answer
  Note: The chat history is deleted when you close an existing chat session or log out from the Ansible Automation Platform.
- Restore a previous chat by clicking the relevant entry from the chat history
- Provide feedback on the quality of the chat answers by clicking the Thumbs up or Thumbs down icon
- Copy and record the answers by clicking the Copy icon
- Change the mode of the virtual assistant to dark or light mode by clicking the Sun icon at the top right corner of the toolbar
- Clear the context of an existing chat by using the New chat button in the chat history
- Close the chat interface while working on the Ansible Automation Platform