Chapter 2. New features and enhancements
This section describes new features and enhancements in Red Hat OpenShift AI 2.24.
2.1. New features
- Support added to TrustyAI for KServe InferenceLogger integration
TrustyAI now provides support for KServe inference deployments through automatic InferenceLogger configuration. Both KServe Raw and Serverless deployment modes are supported, and the deployment mode is automatically detected from the InferenceService annotations.
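As an illustration, the deployment mode detection relies on the standard KServe deployment-mode annotation, and the logger that TrustyAI configures corresponds to the KServe predictor logger field. The following is a minimal sketch only; the resource name, model details, and logger URL are hypothetical, and in practice TrustyAI sets the logger configuration for you:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: example-model                 # hypothetical name
  annotations:
    serving.kserve.io/deploymentMode: RawDeployment   # detected automatically by TrustyAI
spec:
  predictor:
    logger:                           # InferenceLogger settings, configured automatically
      mode: all                       # log both requests and responses
      url: http://trustyai-service.example-project.svc.cluster.local   # illustrative URL
    model:
      modelFormat:
        name: sklearn                 # hypothetical model format
      storageUri: s3://models/example # hypothetical model location
```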
- Enhanced workload scheduling with Kueue
OpenShift AI now provides enhanced workload scheduling with the Red Hat build of Kueue. Kueue is a job-queuing system that provides resource-aware scheduling for workloads, improving GPU utilization and ensuring fair resource sharing with intelligent, quota-based scheduling across AI workloads.
This feature expands Kueue’s workload support in OpenShift AI to include workbenches (Notebook) and model serving (InferenceService), in addition to the previously supported distributed training jobs (RayJob, RayCluster, PyTorchJob).
A validating webhook now handles queue enforcement. This webhook ensures that in any project enabled for Kueue management (with the kueue.openshift.io/managed=true label), all supported workloads must specify a target LocalQueue (with the kueue.x-k8s.io/queue-name label). This replaces the Validating Admission Policy used in previous versions. For more information, see Managing workloads with Kueue.
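The enforcement described above can be sketched with the two labels involved. This is an illustrative example only; the namespace, workbench, and queue names are hypothetical:

```yaml
# Hypothetical project opted in to Kueue management
apiVersion: v1
kind: Namespace
metadata:
  name: data-science-project                  # hypothetical name
  labels:
    kueue.openshift.io/managed: "true"        # enables webhook enforcement
---
# A workbench in that project must name a target LocalQueue,
# otherwise the validating webhook rejects it
apiVersion: kubeflow.org/v1
kind: Notebook
metadata:
  name: my-workbench                          # hypothetical name
  namespace: data-science-project
  labels:
    kueue.x-k8s.io/queue-name: default-queue  # hypothetical LocalQueue name
```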
- Support added to view git commit hash in an image
- You can now view the git commit hash in an image. This feature allows you to determine if the image has changed, even if the version number stays the same. You can also trace the image back to the source code if needed.
- Support added for data science pipelines with Elyra
- When using data science pipelines with Elyra, you now have the option to use a service-based URL rather than a route-based URL. You can access your data science pipeline directly through the service by including the port number.
- Workbench images mirrored by default
The latest version of workbench images is mirrored by default when you mirror images to a private registry for a disconnected installation. As an administrator, you can mirror older versions of workbench images through the additionalImages field in the Disconnected Helper configuration.
Important: Only the latest version of workbench images is supported with bug fixes and security updates. Older versions of workbench images are available, but they do not receive bug fixes or security updates.
2.2. Enhancements
- Support for serving models from a Persistent Volume Claim (PVC)
- Red Hat OpenShift AI now supports serving models directly from existing cluster storage. With this feature, you can serve models from pre-existing persistent volume claim (PVC) locations and create new PVCs for model storage within the interface.
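In KServe terms, serving from existing cluster storage corresponds to pointing the model storage URI at a PVC. The following is a minimal sketch, assuming a hypothetical claim name and path; the pvc://<claim-name>/<path> URI scheme is the standard KServe convention for PVC-backed models:

```yaml
apiVersion: serving.kserve.io/v1beta1
kind: InferenceService
metadata:
  name: pvc-served-model                            # hypothetical name
spec:
  predictor:
    model:
      modelFormat:
        name: onnx                                  # hypothetical model format
      storageUri: pvc://my-model-pvc/models/example # pvc://<claim-name>/<path>
```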
- New option to disable caching for all pipelines in a project
You can now disable caching for all data science pipelines in the pipeline server, which overrides all pipeline and task-level caching settings. This global setting is useful for scenarios such as debugging, development, or cases that require deterministic re-execution.
This option is configurable with the Allow caching to be configured per pipeline and task checkbox when you create or edit a pipeline server. Cluster administrators can also configure this behavior with the spec.apiServer.cacheEnabled option. By default, this field is set to true. To disable caching cluster-wide, set this field to false. For more information, see Overview of data science pipelines caching.
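For administrators working directly with the custom resource, the setting above lives on the pipeline server's DataSciencePipelinesApplication resource. A minimal sketch follows; the resource name is hypothetical and the apiVersion may differ by release:

```yaml
apiVersion: datasciencepipelinesapplications.opendatahub.io/v1
kind: DataSciencePipelinesApplication
metadata:
  name: dspa                  # hypothetical name
spec:
  apiServer:
    cacheEnabled: false       # overrides all pipeline- and task-level caching settings
```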
- Migration of production images from Quay to Red Hat Registry
- RHOAI production images that fall under the current support model have been migrated from Quay to Red Hat Registry. They will continue to receive updates as defined by their release channel. Previously released images will remain in Quay.
- JupyterLab version updated
- JupyterLab is updated from version 4.2 to 4.4. This update includes a Move to Trash option when you right-click a folder, as well as other bug fixes and enhancements.
- Updated vLLM component versions
OpenShift AI supports the following vLLM versions for each listed component:
- vLLM CUDA: v0.10.0.2
- vLLM ROCm: v0.10.0.2
- vLLM Gaudi: v0.10.0.2
- vLLM Power/Z: v0.10.0.2
- Openvino Model Server: v2025.2.1
For more information, see vllm in GitHub.