Chapter 3. Configuring pipelines with your own Argo Workflows instance
You can configure OpenShift AI to use an existing Argo Workflows instance instead of the embedded one included with Data Science Pipelines. This configuration is useful if your OpenShift cluster already includes a managed Argo Workflows instance and you want to integrate it with OpenShift AI pipelines without conflicts. Disabling the embedded Argo Workflows controller allows cluster administrators to manage the lifecycles of OpenShift AI and Argo Workflows independently.
You cannot enable both the embedded Argo Workflows instance and your own Argo Workflows instance on the same cluster.
Prerequisites
- You have cluster administrator privileges for your OpenShift cluster.
- You have installed Red Hat OpenShift AI.
Procedure
- Log in to the OpenShift web console as a cluster administrator.
- In the OpenShift console, click Operators → Installed Operators.
- Search for the Red Hat OpenShift AI Operator, and then click the Operator name to open the Operator details page.
- Click the Data Science Cluster tab.
- Click the default instance name (for example, default-dsc) to open the instance details page.
- Click the YAML tab to show the instance specifications.
- Disable the embedded Argo Workflows controllers that are managed by the OpenShift AI Operator:
  - In the spec.components section, set the value of the managementState field for the datasciencepipelines component to Managed.
  - In the spec.components.datasciencepipelines section, set the value of the managementState field for argoWorkflowsControllers to Removed, as shown in the following example:
  Example datasciencepipelines specification
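  A minimal sketch of the relevant fields, reconstructed from the two settings described above:

    spec:
      components:
        datasciencepipelines:
          managementState: Managed       # keep the Data Science Pipelines component enabled
          argoWorkflowsControllers:
            managementState: Removed     # disable the embedded Argo Workflows controllers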
- Click Save to apply your changes.
- Install and configure a compatible version of Argo Workflows on your cluster. For compatible version information, see Supported Configurations. For installation information, see the Argo Workflows Installation documentation.
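  For reference, one common upstream installation path is to apply the release manifest with oc. The following is a sketch only: the argo namespace and the install.yaml manifest follow the upstream Argo Workflows documentation, and you must replace the <version> placeholder with a release listed in Supported Configurations:

    # Create a namespace for Argo Workflows and apply the upstream release manifest
    oc create namespace argo
    oc apply -n argo -f https://github.com/argoproj/argo-workflows/releases/download/v<version>/install.yaml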
Verification
- On the Details tab of the DataScienceCluster instance (for example, default-dsc), verify that DataSciencePipelinesReady has a Status of True.
- Verify that the ds-pipeline-workflow-controller pod does not exist:
  - Go to Workloads → Pods.
  - Search for the ds-pipeline-workflow-controller pod.
  - Verify that this pod does not exist. The absence of this pod confirms that the embedded Argo Workflows controller is disabled.
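You can also perform the same checks from the command line. This sketch assumes the example instance name default-dsc used above:

    # Check the DataSciencePipelinesReady condition on the DataScienceCluster instance
    oc get datasciencecluster default-dsc -o jsonpath='{.status.conditions[?(@.type=="DataSciencePipelinesReady")].status}'
    # Expected output: True

    # Confirm that no embedded workflow controller pod is running in any namespace
    oc get pods --all-namespaces | grep ds-pipeline-workflow-controller
    # Expected output: none (the pod does not exist)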