Install


Red Hat OpenShift Lightspeed 1.0

Installing OpenShift Lightspeed

Red Hat OpenShift Documentation Team

Abstract

This documentation provides information about installing OpenShift Lightspeed.

Chapter 1. Installing OpenShift Lightspeed

The installation process for Red Hat OpenShift Lightspeed consists of two main tasks: installing the Lightspeed Operator and configuring the Lightspeed Service to interact with the large language model (LLM) provider.

1.1. Large Language Model (LLM) overview

A large language model (LLM) is a type of artificial intelligence program trained on vast quantities of data. The OpenShift Lightspeed Service interacts with the LLM to generate answers to questions.

You can configure Red Hat Enterprise Linux AI or Red Hat OpenShift AI as the LLM provider for the OpenShift Lightspeed Service. Either provider can serve the model through an inference server or inference service that processes queries.

Alternatively, you can connect the OpenShift Lightspeed Service to a publicly available LLM provider, such as IBM watsonx, OpenAI, or Microsoft Azure OpenAI.

Note

Configure the LLM provider before you install the OpenShift Lightspeed Operator. Installing the Operator does not install an LLM provider.
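After the Operator is installed, you describe the connection between the Lightspeed Service and the LLM provider in a configuration resource. The following is a minimal sketch, assuming an OLSConfig custom resource named cluster and the field names shown here; the schema, provider type values, and placeholder names are illustrative, so verify them against the configuration documentation for your provider and Lightspeed version.

  # Minimal sketch of a Lightspeed configuration; the schema and values
  # shown here are illustrative, not a definitive example.
  apiVersion: ols.openshift.io/v1alpha1
  kind: OLSConfig
  metadata:
    name: cluster
  spec:
    llm:
      providers:
      - name: my-llm-provider                # hypothetical provider name
        type: openai                         # or watsonx, azure_openai, and so on
        url: https://api.openai.com/v1       # endpoint of your LLM provider
        credentialsSecretRef:
          name: llm-credentials              # Secret in the openshift-lightspeed namespace that holds the API key
        models:
        - name: <model-name>                 # model that the provider serves
    ols:
      defaultProvider: my-llm-provider
      defaultModel: <model-name>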

1.1.1. Red Hat Enterprise Linux AI with OpenShift Lightspeed

You can use Red Hat Enterprise Linux AI to host an LLM.

For more information, see Generating a custom LLM using RHEL AI.

1.1.2. Red Hat OpenShift AI with OpenShift Lightspeed

You can use Red Hat OpenShift AI to host an LLM.

For more information, see Single-model serving platform.
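When the model is hosted on Red Hat Enterprise Linux AI or Red Hat OpenShift AI, the Lightspeed Service is pointed at the inference endpoint that the serving platform exposes, which for vLLM-based serving is an OpenAI-compatible API. The following sketch of a provider entry makes that assumption; the type value, URL, and credentials are placeholders, not the definitive configuration.

  # Sketch of an entry under spec.llm.providers in the OLSConfig custom resource
  # for a self-hosted model; all values are placeholders for illustration.
  - name: my-self-hosted-model
    type: <provider-type>                    # use the type value documented for RHEL AI or OpenShift AI serving
    url: https://<inference-endpoint>/v1     # OpenAI-compatible endpoint of the model server
    credentialsSecretRef:
      name: llm-credentials                  # token for the inference endpoint, if one is required
    models:
    - name: <model-name>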

1.1.3. IBM watsonx with OpenShift Lightspeed

To configure IBM watsonx as the LLM provider, you need an IBM Cloud project with access to IBM watsonx. You also need your IBM watsonx API key.

For more information, see the official IBM watsonx product documentation.
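The API key is not placed directly in the configuration; it is stored in a secret in the namespace where the Lightspeed Service runs. The following command is a sketch for an IBM watsonx key, assuming the openshift-lightspeed namespace and an apitoken key name; match both to what your configuration references.

  # Store the IBM watsonx API key in a Secret; the secret name and the
  # "apitoken" key are assumptions and must match your configuration.
  oc create secret generic watsonx-api-keys \
    --from-literal=apitoken=<your-watsonx-api-key> \
    -n openshift-lightspeed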

1.1.4. OpenAI with OpenShift Lightspeed

To configure OpenAI as the LLM provider with OpenShift Lightspeed, you need either the OpenAI API key or the OpenAI project name during the configuration process.

OpenAI supports projects and service accounts. Using a service account in a dedicated project lets you track OpenShift Lightspeed usage precisely.

For more information, see the official OpenAI product documentation.
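As a sketch, an OpenAI provider entry in the configuration resource described earlier might look like the following; the field names, secret name, and model name are illustrative. The API key stored in the referenced secret can be a service account key from a dedicated project if you want usage tracked that way.

  # Illustrative entry under spec.llm.providers for OpenAI; names and
  # values are assumptions for this sketch.
  - name: openai
    type: openai
    url: https://api.openai.com/v1           # public OpenAI API endpoint
    credentialsSecretRef:
      name: openai-api-keys                  # Secret that contains the API key or service account key
    models:
    - name: <model-name>                     # a model that your key is entitled to use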

1.1.5. Microsoft Azure OpenAI with OpenShift Lightspeed

To configure Microsoft Azure OpenAI as the LLM provider, you need a Microsoft Azure OpenAI Service instance. You must have at least one model deployment in Microsoft Azure OpenAI Studio for that instance.

For more information, see the official Microsoft Azure OpenAI product documentation.
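For Microsoft Azure OpenAI, the endpoint is specific to your Service instance and requests are routed to a named model deployment. The following sketch assumes an azure_openai provider type and a deploymentName field; both are assumptions, so verify them against the configuration documentation.

  # Illustrative entry under spec.llm.providers for Azure OpenAI; the type
  # and deploymentName fields are assumptions for this sketch.
  - name: azure-openai
    type: azure_openai
    url: https://<your-resource-name>.openai.azure.com   # endpoint of your Azure OpenAI Service instance
    deploymentName: <deployment-name>                     # model deployment created in Azure OpenAI Studio
    credentialsSecretRef:
      name: azure-api-keys                                # Secret that holds the Azure OpenAI API key
    models:
    - name: <model-name>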

1.2. About subscription requirements

Red Hat OpenShift Lightspeed requires an active and valid subscription to one of the following products:

  • Red Hat OpenShift Kubernetes Engine (supported for virtual machines only)
  • Red Hat OpenShift Virtualization Engine
  • OpenShift Container Platform
  • Red Hat OpenShift Platform Plus

1.3. Installing the OpenShift Lightspeed Operator

Prerequisites

  • You have deployed OpenShift Container Platform 4.15 or later. The cluster must be connected to the Internet and have telemetry enabled.
  • You are logged in to the OpenShift Container Platform web console as a user with the cluster-admin role.
  • You have access to the OpenShift CLI (oc).
  • You have configured your large language model (LLM) provider so that OpenShift Lightspeed can communicate with it.

Procedure

  1. In the OpenShift Container Platform web console, navigate to the Operators → OperatorHub page.
  2. Search for Lightspeed.
  3. Locate the Lightspeed Operator, and click to select it.
  4. When the prompt that discusses the community operator appears, click Continue.
  5. Click Install.
  6. Use the default installation settings presented, and click Install to continue.
  7. Click Operators → Installed Operators to verify that the Lightspeed Operator is installed. Succeeded should appear in the Status column. You can also verify the installation from the CLI, as shown after this procedure.
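You can also verify the installation from the command line with the OpenShift CLI (oc). The following commands are a sketch that assumes the Operator was installed into the openshift-lightspeed namespace, which is the namespace suggested by the default installation settings.

  # Confirm that the Operator's ClusterServiceVersion reports Succeeded.
  oc get csv -n openshift-lightspeed

  # Confirm that the Operator pods are running.
  oc get pods -n openshift-lightspeed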

Legal Notice

Copyright © 2025 Red Hat, Inc.
The text of and illustrations in this document are licensed by Red Hat under a Creative Commons Attribution–Share Alike 3.0 Unported license ("CC-BY-SA"). An explanation of CC-BY-SA is available at http://creativecommons.org/licenses/by-sa/3.0/. In accordance with CC-BY-SA, if you distribute this document or an adaptation of it, you must provide the URL for the original version.
Red Hat, as the licensor of this document, waives the right to enforce, and agrees not to assert, Section 4d of CC-BY-SA to the fullest extent permitted by applicable law.
Red Hat, Red Hat Enterprise Linux, the Shadowman logo, the Red Hat logo, JBoss, OpenShift, Fedora, the Infinity logo, and RHCE are trademarks of Red Hat, Inc., registered in the United States and other countries.
Linux® is the registered trademark of Linus Torvalds in the United States and other countries.
Java® is a registered trademark of Oracle and/or its affiliates.
XFS® is a trademark of Silicon Graphics International Corp. or its subsidiaries in the United States and/or other countries.
MySQL® is a registered trademark of MySQL AB in the United States, the European Union and other countries.
Node.js® is an official trademark of Joyent. Red Hat is not formally related to or endorsed by the official Joyent Node.js open source or commercial project.
The OpenStack® Word Mark and OpenStack logo are either registered trademarks/service marks or trademarks/service marks of the OpenStack Foundation, in the United States and other countries and are used with the OpenStack Foundation's permission. We are not affiliated with, endorsed or sponsored by the OpenStack Foundation, or the OpenStack community.
All other trademarks are the property of their respective owners.