Chapter 4. Configuring pipelines with your own Argo Workflows instance


You can configure OpenShift AI to use an existing Argo Workflows instance instead of the embedded one included with AI pipelines. This configuration is useful if your OpenShift cluster already includes a managed Argo Workflows instance and you want to integrate it with OpenShift AI pipelines without conflicts. Disabling the embedded Argo Workflows controller allows cluster administrators to manage the lifecycles of OpenShift AI and Argo Workflows independently.

Note

You cannot enable both the embedded Argo Workflows instance and your own Argo Workflows instance on the same cluster.

Prerequisites

  • You have cluster administrator privileges for your OpenShift cluster.
  • You have installed Red Hat OpenShift AI.

Procedure

  1. Log in to the OpenShift web console as a cluster administrator.
  2. In the OpenShift console, click Operators → Installed Operators.
  3. Search for the Red Hat OpenShift AI Operator, and then click the Operator name to open the Operator details page.
  4. Click the Data Science Cluster tab.
  5. Click the default instance name (for example, default-dsc) to open the instance details page.
  6. Click the YAML tab to show the instance specifications.
  7. Disable the embedded Argo Workflows controllers that are managed by the OpenShift AI Operator:

    1. In the spec.components section, set the value of the managementState field for the aipipelines component to Managed.
    2. In the spec.components.aipipelines section, set the value of the managementState field for argoWorkflowsControllers to Removed, as shown in the following example:

      Example aipipelines specification

      # ...
      spec:
        components:
          aipipelines:
            argoWorkflowsControllers:
              managementState: Removed
            managementState: Managed
      # ...

  8. Click Save to apply your changes.
  9. Install and configure a compatible version of Argo Workflows on your cluster. For compatible version information, see Supported Configurations for 3.x. For installation information, see the Argo Workflows Installation documentation.
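  If you prefer the command line, you can make the same configuration change with the OpenShift CLI. The following is a sketch, not a verbatim supported procedure: it assumes the DataScienceCluster instance is named default-dsc, as in the console example above.

      Example command (CLI alternative to steps 2-8)

      # Disable the embedded Argo Workflows controllers on the
      # default-dsc instance (adjust the name if yours differs):
      oc patch datasciencecluster default-dsc --type merge -p '
      spec:
        components:
          aipipelines:
            managementState: Managed
            argoWorkflowsControllers:
              managementState: Removed
      '

  As in the console procedure, the aipipelines component itself remains Managed; only its argoWorkflowsControllers field is set to Removed.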

Verification

  1. On the Details tab of the DataScienceCluster instance (for example, default-dsc), verify that AIPipelinesReady has a Status of True.
  2. Verify that the ds-pipeline-workflow-controller pod does not exist:

    1. Go to Workloads → Pods.
    2. Search for the ds-pipeline-workflow-controller pod.
    3. Verify that this pod does not exist. The absence of this pod confirms that the embedded Argo Workflows controller is disabled.
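  You can also run these checks from the command line. The following is a sketch that assumes the default-dsc instance name used earlier; pod namespaces vary by installation, so the pod check searches all namespaces.

      Example verification commands

      # Check the AIPipelinesReady condition on the DataScienceCluster;
      # a healthy configuration reports a status of True:
      oc get datasciencecluster default-dsc \
        -o jsonpath='{.status.conditions[?(@.type=="AIPipelinesReady")].status}'

      # Confirm that no embedded workflow controller pod is running
      # in any namespace:
      oc get pods -A | grep ds-pipeline-workflow-controller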