Chapter 3. Technology Preview features

Important

This section describes Technology Preview features in Red Hat OpenShift AI 2.14. Technology Preview features are not supported with Red Hat production service level agreements (SLAs) and might not be functionally complete. Red Hat does not recommend using them in production. These features provide early access to upcoming product features, enabling customers to test functionality and provide feedback during the development process.

For more information about the support scope of Red Hat Technology Preview features, see Technology Preview Features Support Scope.

RStudio Server notebook image

With the RStudio Server notebook image, you can access the RStudio IDE, an integrated development environment for R. The R programming language is used for statistical computing and graphics to support data analysis and predictions.

To use the RStudio Server notebook image, you must first build it by creating a secret and triggering the BuildConfig, and then enable it in the OpenShift AI UI by editing the rstudio-rhel9 image stream. For more information, see Building the RStudio Server workbench images.
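
The following is a minimal sketch of that command-line flow with placeholder names in angle brackets; the exact secret contents and BuildConfig name are defined in Building the RStudio Server workbench images, so take those values from that procedure, and adjust the namespace if your deployment does not use the default OpenShift AI applications namespace:

$ oc create secret generic <rstudio-build-secret> -n redhat-ods-applications \
    --from-file=<file-required-by-the-build>            # secret consumed by the build
$ oc start-build <rstudio-server-build-config> -n redhat-ods-applications --follow
$ oc edit imagestream rstudio-rhel9 -n redhat-ods-applications   # enable the built image for the OpenShift AI UI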

Important

Disclaimer: Red Hat supports managing workbenches in OpenShift AI. However, Red Hat does not provide support for the RStudio software. RStudio Server is available through rstudio.org and is subject to their licensing terms. You should review their licensing terms before you use this sample workbench.

Data drift monitoring (TrustyAI)

With data drift monitoring, data scientists can detect significant changes in input data distributions for their deployed models, helping to ensure that model predictions remain accurate and reliable over time.

TrustyAI data drift monitoring metrics compare the latest real-world data to the original training data and provide a quantitative measure of the alignment between the training data and the inference data.

To use data drift monitoring, see Monitoring data drift in the Open Data Hub documentation.
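
The TrustyAI service exposes drift metrics over a REST API: you tag a set of stored inferences as the reference (training) distribution and then request a drift metric, such as meanshift, that compares subsequent inference data to that reference. The following curl sketch is illustrative only; the route, token, endpoint path, and payload fields shown here are assumptions, so take the exact API details from Monitoring data drift in the Open Data Hub documentation:

$ curl -s -X POST https://<trustyai-service-route>/metrics/drift/meanshift \
    -H "Authorization: Bearer <token>" \
    -H "Content-Type: application/json" \
    -d '{"modelId": "<deployed-model-name>", "referenceTag": "<training-data-tag>"}'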

CUDA - RStudio Server notebook image

With the CUDA - RStudio Server notebook image, you can access the RStudio IDE and NVIDIA CUDA Toolkit. The RStudio IDE is an integrated development environment for the R programming language for statistical computing and graphics. With the NVIDIA CUDA toolkit, you can enhance your work by using GPU-accelerated libraries and optimization tools.

To use the CUDA - RStudio Server notebook image, you must first build it by creating a secret and triggering the BuildConfig, and then enable it in the OpenShift AI UI by editing the rstudio-rhel9 image stream. For more information, see Building the RStudio Server workbench images.

Important

Disclaimer: Red Hat supports managing workbenches in OpenShift AI. However, Red Hat does not provide support for the RStudio software. RStudio Server is available through rstudio.org and is subject to their licensing terms. You should review their licensing terms before you use this sample workbench.

The CUDA - RStudio Server notebook image contains NVIDIA CUDA technology. CUDA licensing information is available in the CUDA Toolkit documentation. You should review their licensing terms before you use this sample workbench.

code-server workbench image

Red Hat OpenShift AI now includes the code-server workbench image. See code-server in GitHub for more information.

With the code-server workbench image, you can customize your workbench environment by using a variety of extensions to add new languages, themes, and debuggers, and to connect to additional services. You can also enhance the efficiency of your data science work with syntax highlighting, auto-indentation, and bracket matching.
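
Extensions can be installed from the code-server UI, or from a terminal inside the workbench by using the code-server command-line interface. The following is a minimal sketch; the extension identifier is only an example, and availability depends on the extension registry that the workbench image is configured to use:

$ code-server --install-extension ms-python.python   # example extension ID
$ code-server --list-extensions                      # verify the installed extensions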

Note

Elyra-based pipelines are not available with the code-server workbench image.

The code-server workbench image is currently available in Red Hat OpenShift AI 2.14 as a Technology Preview feature. This feature was first introduced in OpenShift AI 2.6.

Support for AMD GPUs

The AMD ROCm workbench image adds support for the AMD GPU Operator, significantly boosting the processing performance of compute-intensive workloads. The AMD GPU Operator provides access to the drivers, development tools, and APIs that support AI workloads and a wide range of models. Additionally, the AMD ROCm workbench image includes machine learning libraries to support AI frameworks such as TensorFlow and PyTorch. This Technology Preview release also provides access to images that you can use to explore serving, training, and tuning use cases with AMD GPUs. For more information, see Using AMD GPUs with workbenches in OpenShift AI.
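
Under the hood, workloads request AMD GPUs through the amd.com/gpu extended resource that the AMD GPU Operator's device plugin exposes. The following pod is a minimal sketch of that mechanism; the name, namespace, and container image are placeholders, not values from this documentation, and it is not an OpenShift AI workbench definition:

$ oc apply -f - <<EOF
apiVersion: v1
kind: Pod
metadata:
  name: rocm-smoke-test               # hypothetical name
  namespace: my-data-science-project  # hypothetical namespace
spec:
  restartPolicy: Never
  containers:
  - name: rocm
    image: docker.io/rocm/pytorch:latest  # example ROCm image, not an OpenShift AI image
    command: ["rocm-smi"]                 # list the AMD GPUs visible to the container
    resources:
      limits:
        amd.com/gpu: 1                    # one AMD GPU, as exposed by the AMD GPU Operator
EOF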

Model Registry

OpenShift AI now supports the Model Registry Operator. The Model Registry Operator is not installed by default in Technology Preview mode. The model registry is a central repository that contains metadata related to machine learning models from inception to deployment.
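
The Model Registry Operator is managed as a component of the DataScienceCluster custom resource. The following is a minimal sketch of enabling the component from the command line; it assumes the component key is modelregistry and that your DataScienceCluster object is named default-dsc, so verify both names against your cluster before running it:

$ oc patch datasciencecluster default-dsc --type merge \
    -p '{"spec": {"components": {"modelregistry": {"managementState": "Managed"}}}}'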

NVIDIA NIM model serving platform

With the NVIDIA NIM model serving platform, you can deploy NVIDIA-optimized models by using NVIDIA NIM inference services in OpenShift AI. NVIDIA NIM, part of NVIDIA AI Enterprise, is a set of easy-to-use microservices designed for secure, reliable deployment of high-performance AI model inferencing across the cloud, data centers, and workstations. It supports a wide range of AI models, including open-source community models and NVIDIA AI Foundation models, and provides seamless, scalable AI inferencing, on premises or in the cloud, through industry-standard APIs. You need an NVIDIA AI Enterprise license key to enable the NVIDIA NIM model serving platform in OpenShift AI.
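
NIM large language model microservices expose an OpenAI-compatible HTTP API. The following is a minimal sketch of querying a deployed NIM inference endpoint with curl; the route URL, token, and model name are placeholders that depend on your deployment:

$ curl -s https://<nim-inference-route>/v1/chat/completions \
    -H "Authorization: Bearer <token>" \
    -H "Content-Type: application/json" \
    -d '{
          "model": "<model-name>",
          "messages": [{"role": "user", "content": "Summarize NVIDIA NIM in one sentence."}],
          "max_tokens": 64
        }'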