Chapter 8. Installing Red Hat OpenShift AI components by using the web console

The following procedure shows how to use the OpenShift Dedicated web console to install specific components of Red Hat OpenShift AI on your cluster.

Important

The following procedure describes how to create and configure a DataScienceCluster object to install Red Hat OpenShift AI components as part of a new installation. However, if you upgraded from version 1 of OpenShift AI (previously OpenShift Data Science), the upgrade process automatically created a default DataScienceCluster object. If you upgraded from a previous minor version, the upgrade process used the settings from the previous version’s DataScienceCluster object. To inspect the DataScienceCluster object and change the installation status of Red Hat OpenShift AI components, see Updating the installation status of Red Hat OpenShift AI components by using the web console.

Prerequisites

  • To support the KServe component, you installed dependent Operators, including the Red Hat OpenShift Serverless and Red Hat OpenShift Service Mesh Operators. For more information, see Serving large language models.
  • Red Hat OpenShift AI is installed as an Add-on to your Red Hat OpenShift cluster.
  • You have cluster administrator privileges for your OpenShift Dedicated cluster.

Procedure

  1. Log in to the OpenShift Dedicated web console as a cluster administrator.
  2. In the web console, click Operators → Installed Operators and then click the Red Hat OpenShift AI Operator.
  3. Create a DataScienceCluster object to install OpenShift AI components by performing the following actions:

    1. Click the Data Science Cluster tab.
    2. Click Create DataScienceCluster.
    3. For Configure via, select YAML view.

      An embedded YAML editor opens showing a default custom resource (CR) for the DataScienceCluster object.

    4. In the spec.components section of the CR, for each OpenShift AI component shown, set the value of the managementState field to either Managed or Removed (a sample CR is shown after this procedure). These values are defined as follows:

      Managed
      The Operator actively manages the component, installs it, and tries to keep it active. The Operator will upgrade the component only if it is safe to do so.
      Removed
      The Operator actively manages the component but does not install it. If the component is already installed, the Operator will try to remove it.
      Important
      • To learn how to install the KServe component, which is used by the single model serving platform to serve large language models, see Serving large language models.
      • The CodeFlare and KubeRay components are Technology Preview features only. Technology Preview features are not supported with Red Hat production service level agreements (SLAs) and might not be functionally complete. Red Hat does not recommend using them in production. These features provide early access to upcoming product features, enabling customers to test functionality and provide feedback during the development process. For more information about the support scope of Red Hat Technology Preview features, see Technology Preview Features Support Scope.
      • To learn how to configure the distributed workloads feature that uses the CodeFlare and KubeRay components, see Configuring distributed workloads.
  4. Click Create.
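
The following YAML is a minimal sketch of what the DataScienceCluster CR might look like after you edit the managementState values, with the Technology Preview components removed. The apiVersion and component field names shown here (for example, dashboard, modelmeshserving, ray) reflect recent OpenShift AI releases and can differ between versions, so always start from the default CR that the embedded YAML editor displays.

apiVersion: datasciencecluster.opendatahub.io/v1
kind: DataScienceCluster
metadata:
  name: default-dsc
spec:
  components:
    dashboard:
      managementState: Managed      # OpenShift AI dashboard
    workbenches:
      managementState: Managed      # workbench (notebook) environments
    datasciencepipelines:
      managementState: Managed      # data science pipelines
    modelmeshserving:
      managementState: Managed      # multi-model serving platform
    kserve:
      managementState: Managed      # single model serving platform; requires the dependent Operators listed in the prerequisites
    codeflare:
      managementState: Removed      # Technology Preview
    ray:
      managementState: Removed      # Technology Preview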

Verification

  • On the DataScienceClusters page, click the default-dsc object and then perform the following actions:

    • Select the YAML tab.
    • In the installedComponents section, confirm that the components you installed have a status value of true (an example status section appears after these verification steps).
  • In the OpenShift Dedicated web console, click Workloads → Pods and then perform the following actions:

    • In the Project list at the top of the page, select the redhat-ods-applications project.
    • In the project, confirm that there are running pods for each of the OpenShift AI components that you installed.
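
For reference, the status section of the DataScienceCluster object typically reports installed components as boolean values similar to the following sketch; the exact component names and status layout can vary between OpenShift AI versions:

status:
  installedComponents:
    codeflare: false
    dashboard: true
    datasciencepipelines: true
    kserve: true
    modelmeshserving: true
    ray: false
    workbenches: true

If you prefer the command line, you can also run oc get pods -n redhat-ods-applications to list the pods for the installed components instead of browsing them in the web console.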