Chapter 6. Resolved issues


The following notable issues are resolved in Red Hat OpenShift AI 2.25. Security updates, bug fixes, and enhancements for Red Hat OpenShift AI 2.25 are released as asynchronous errata. All OpenShift AI errata advisories are published on the Red Hat Customer Portal.

6.1. Issues resolved in Red Hat OpenShift AI 2.25

RHOAIENG-9418 - Elyra raises error when you use parameters in uppercase

Previously, Elyra raised an error when you tried to run a pipeline that used parameters in uppercase. This issue is now resolved.

RHOAIENG-30493 - Error creating a workbench in a Kueue-enabled project

Previously, when using the dashboard to create a workbench in a Kueue-enabled project, the creation failed if Kueue was disabled on the cluster or if the selected hardware profile was not associated with a LocalQueue. In this case, the required LocalQueue could not be referenced, the admission webhook validation failed, and an error message was shown. This issue has been resolved.
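
A minimal sketch of the association described above, assuming the standard Kueue LocalQueue API (kueue.x-k8s.io/v1beta1); the project, queue, and ClusterQueue names are placeholders:

    # Illustrative sketch: create a Kueue LocalQueue in a data science project so that
    # workloads in that namespace (including workbenches) have a queue they can be
    # admitted to. All resource names here are hypothetical.
    from kubernetes import client, config

    config.load_kube_config()
    api = client.CustomObjectsApi()

    local_queue = {
        "apiVersion": "kueue.x-k8s.io/v1beta1",
        "kind": "LocalQueue",
        "metadata": {"name": "team-queue", "namespace": "my-project"},
        "spec": {"clusterQueue": "team-cluster-queue"},
    }

    api.create_namespaced_custom_object(
        group="kueue.x-k8s.io",
        version="v1beta1",
        namespace="my-project",
        plural="localqueues",
        body=local_queue,
    )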

RHOAIENG-32942 - Elyra requires unsupported filters on the REST API when pipeline store is Kubernetes

Before this update, when the pipeline store was configured to use Kubernetes, Elyra required equality (eq) filters that were not supported by the REST API. Only substring filters were supported in this mode. As a result, pipelines created and submitted through Elyra from a workbench could not run successfully. This issue has been resolved.
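
As a sketch of the filter semantics involved, assuming the Kubeflow Pipelines SDK's Client.list_pipelines method and the v2 REST API predicate format (the predicate field names are assumptions and can differ between server versions):

    # Illustrative sketch: list pipelines by exact display name using an equality
    # predicate, the kind of filter described above. The predicate field names
    # ("operation", "key", "string_value") are assumptions based on the Kubeflow
    # Pipelines v2 REST API; verify them against your pipeline server version.
    import json
    from kfp import Client

    kfp_client = Client(host="https://<your-pipeline-route>")  # placeholder endpoint

    name_filter = json.dumps({
        "predicates": [
            {"operation": "EQUALS", "key": "display_name", "string_value": "my-pipeline"}
        ]
    })

    pipelines = kfp_client.list_pipelines(filter=name_filter)
    print(pipelines)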

RHOAIENG-32897 - Pipelines defined with the Kubernetes API and invalid platformSpec do not appear in the UI or run

Before this update, when a pipeline version defined with the Kubernetes API included an empty or invalid spec.platformSpec field (for example, {} or missing the kubernetes key), the system misidentified the field as the pipeline specification. As a result, the REST API omitted the pipelineSpec, which prevented the pipeline version from being displayed in the UI and from running. This issue is now resolved.
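
A hedged way to check for the condition described above, assuming the pipeline version is stored as a Kubernetes custom resource; the API group, version, and plural used here are assumptions and should be checked against the CRDs installed on your cluster:

    # Illustrative sketch: read a pipeline version object created through the
    # Kubernetes pipeline store and warn if spec.platformSpec is present but empty
    # or missing the "kubernetes" key, the condition described above. The group,
    # version, and plural names are assumptions; confirm them on your cluster.
    from kubernetes import client, config

    config.load_kube_config()
    api = client.CustomObjectsApi()

    pv = api.get_namespaced_custom_object(
        group="pipelines.kubeflow.org",
        version="v2beta1",
        namespace="my-project",
        plural="pipelineversions",
        name="my-pipeline-version",
    )

    platform_spec = pv.get("spec", {}).get("platformSpec")
    if platform_spec is not None and "kubernetes" not in platform_spec:
        print("platformSpec is set but empty or missing the 'kubernetes' key")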

RHOAIENG-31386 - Error deploying an Inference Service with authenticationRef

Before this update, when deploying an InferenceService with authenticationRef under external metrics, the authenticationRef field was removed. This issue is now resolved.
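
A minimal sketch for verifying that the field survives deployment, assuming the KServe InferenceService API (serving.kserve.io/v1beta1); the recursive search avoids hard-coding the exact autoscaling field path, which depends on your release:

    # Illustrative sketch: after deploying an InferenceService that configures an
    # external metric with an authenticationRef, read the object back and confirm
    # that the field was not stripped. Resource names are placeholders.
    from kubernetes import client, config

    config.load_kube_config()
    api = client.CustomObjectsApi()

    isvc = api.get_namespaced_custom_object(
        group="serving.kserve.io",
        version="v1beta1",
        namespace="my-project",
        plural="inferenceservices",
        name="my-model",
    )

    def find_keys(obj, key):
        """Yield every value stored under the given key, at any depth."""
        if isinstance(obj, dict):
            for k, v in obj.items():
                if k == key:
                    yield v
                yield from find_keys(v, key)
        elif isinstance(obj, list):
            for item in obj:
                yield from find_keys(item, key)

    refs = list(find_keys(isvc, "authenticationRef"))
    print("authenticationRef preserved:", bool(refs), refs)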

RHOAIENG-33914 - LM-Eval Tier2 task test failures

Previously, LM-Eval Tier2 task tests could fail because the Massive Multitask Language Understanding Symbol Replacement (MMLUSR) tasks were broken. This issue is resolved with the latest version of the trustyai-service-operator.

RHOAIENG-35532 - Unable to deploy models with HardwareProfiles and GPU

Before this update, model deployments that used a HardwareProfile to request GPUs had stopped working. This issue is now resolved.
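
A hedged sketch of a GPU-enabled hardware profile, assuming a HardwareProfile custom resource under infrastructure.opendatahub.io/v1alpha1; the group, version, and field layout differ between OpenShift AI releases, so verify them against the CRD on your cluster before applying:

    # Illustrative sketch: a HardwareProfile that exposes NVIDIA GPUs to model
    # deployments. The API group/version and field layout are assumptions, and the
    # namespace and profile name are placeholders.
    from kubernetes import client, config

    config.load_kube_config()
    api = client.CustomObjectsApi()

    hardware_profile = {
        "apiVersion": "infrastructure.opendatahub.io/v1alpha1",
        "kind": "HardwareProfile",
        "metadata": {"name": "gpu-profile", "namespace": "redhat-ods-applications"},
        "spec": {
            "identifiers": [
                {
                    "identifier": "nvidia.com/gpu",
                    "displayName": "NVIDIA GPU",
                    "minCount": 1,
                    "defaultCount": 1,
                }
            ],
        },
    }

    api.create_namespaced_custom_object(
        group="infrastructure.opendatahub.io",
        version="v1alpha1",
        namespace="redhat-ods-applications",
        plural="hardwareprofiles",
        body=hardware_profile,
    )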

RHOAIENG-4570 - Existing Argo Workflows installation conflicts with install or upgrade

Previously, installing or upgrading OpenShift AI on a cluster that already included an existing Argo Workflows instance could cause conflicts with the embedded Argo components deployed by Data Science Pipelines. This issue has been resolved. You can now configure OpenShift AI to use an existing Argo Workflows instance, enabling clusters that already run Argo Workflows to integrate with Data Science Pipelines without conflicts.
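
A hedged sketch of the kind of configuration change involved, assuming the DataScienceCluster API (datasciencecluster.opendatahub.io/v1); the argoWorkflowsControllers field name is an assumption, so confirm the exact option in the OpenShift AI 2.25 documentation before applying it:

    # Illustrative sketch: patch the DataScienceCluster so that Data Science Pipelines
    # does not manage its own embedded Argo workflow controllers, allowing an existing
    # Argo Workflows installation to be used. The "argoWorkflowsControllers" field name
    # is an assumption, and "default-dsc" is a placeholder resource name.
    from kubernetes import client, config

    config.load_kube_config()
    api = client.CustomObjectsApi()

    patch = {
        "spec": {
            "components": {
                "datasciencepipelines": {
                    "managementState": "Managed",
                    "argoWorkflowsControllers": {"managementState": "Removed"},
                }
            }
        }
    }

    api.patch_cluster_custom_object(
        group="datasciencecluster.opendatahub.io",
        version="v1",
        plural="datascienceclusters",
        name="default-dsc",
        body=patch,
    )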

RHOAIENG-35623 - Model deployment fails when using hardware profiles

Previously, model deployments that used hardware profiles failed because the Red Hat OpenShift AI Operator did not inject the tolerations, nodeSelector, or identifiers from the hardware profile into manually created InferenceService resources. As a result, the model deployment pods could not be scheduled onto suitable nodes and the deployment failed to enter a ready state. This issue is now resolved.
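
A minimal sketch for confirming the injection after deployment, assuming the KServe InferenceService API (serving.kserve.io/v1beta1); resource names are placeholders:

    # Illustrative sketch: after deploying a model with a hardware profile, read the
    # InferenceService and print the scheduling fields that are expected to be injected
    # into the predictor spec. Namespace and model name are placeholders.
    from kubernetes import client, config

    config.load_kube_config()
    api = client.CustomObjectsApi()

    isvc = api.get_namespaced_custom_object(
        group="serving.kserve.io",
        version="v1beta1",
        namespace="my-project",
        plural="inferenceservices",
        name="my-model",
    )

    predictor = isvc.get("spec", {}).get("predictor", {})
    print("nodeSelector:", predictor.get("nodeSelector"))
    print("tolerations:", predictor.get("tolerations"))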
