
Chapter 4. Configuring pipelines with your own Argo Workflows instance


You can configure OpenShift AI to use an existing Argo Workflows instance instead of the embedded one included with Data Science Pipelines. This configuration is useful if your OpenShift cluster already includes a managed Argo Workflows instance and you want to integrate it with OpenShift AI pipelines without conflicts. Disabling the embedded Argo Workflows controller allows cluster administrators to manage the lifecycles of OpenShift AI and Argo Workflows independently.

Note

You cannot enable both the embedded Argo Workflows instance and your own Argo Workflows instance on the same cluster.

Prerequisites

  • You have cluster administrator privileges for your OpenShift cluster.
  • You have installed Red Hat OpenShift AI.

Procedure

  1. Log in to the OpenShift web console as a cluster administrator.
  2. In the OpenShift console, click Operators → Installed Operators.
  3. Search for the Red Hat OpenShift AI Operator, and then click the Operator name to open the Operator details page.
  4. Click the Data Science Cluster tab.
  5. Click the default instance name (for example, default-dsc) to open the instance details page.
  6. Click the YAML tab to show the instance specifications.
  7. Disable the embedded Argo Workflows controllers that are managed by the OpenShift AI Operator:

    1. In the spec.components section, set the value of the managementState field for the datasciencepipelines component to Managed.
    2. In the spec.components.datasciencepipelines section, set the value of the managementState field for argoWorkflowsControllers to Removed, as shown in the following example:

      Example datasciencepipelines specification

      # ...
      spec:
        components:
          datasciencepipelines:
            argoWorkflowsControllers:
              managementState: Removed
            managementState: Managed
      # ...

  8. Click Save to apply your changes. Alternatively, you can apply the same change from the command line, as shown in the sketch after this procedure.
  9. Install and configure a compatible version of Argo Workflows on your cluster. For compatible version information, see Supported Configurations. For installation information, see the Argo Workflows Installation documentation, or see the installation sketch after this procedure.
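
If you prefer to apply the change from steps 7 and 8 without the web console, you can patch the DataScienceCluster instance directly. The following command is a minimal sketch, assuming your instance is named default-dsc as in the earlier example; adjust the instance name to match your cluster.

  Example oc patch command

  # Keep the Data Science Pipelines component managed while disabling
  # the embedded Argo Workflows controllers.
  oc patch datasciencecluster default-dsc --type merge \
    -p '{"spec": {"components": {"datasciencepipelines": {"managementState": "Managed", "argoWorkflowsControllers": {"managementState": "Removed"}}}}}'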
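
For step 9, the exact installation method depends on how you manage your cluster. As one illustrative option, the upstream Argo Workflows project publishes an install.yaml manifest with each release; the following sketch applies that manifest into a dedicated namespace, with <version> standing in for a compatible release listed in Supported Configurations.

  Example Argo Workflows installation

  # Create a namespace for Argo Workflows, then apply the upstream release
  # manifest. Replace <version> with a compatible Argo Workflows release.
  oc create namespace argo
  oc apply -n argo -f https://github.com/argoproj/argo-workflows/releases/download/<version>/install.yaml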

Verification

  1. On the Details tab of the DataScienceCluster instance (for example, default-dsc), verify that DataSciencePipelinesReady has a Status of True.
  2. Verify that the ds-pipeline-workflow-controller pod does not exist:

    1. Go to Workloads → Pods.
    2. Search for the ds-pipeline-workflow-controller pod.
    3. Verify that this pod does not exist. The absence of this pod confirms that the embedded Argo Workflows controller is disabled.
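
You can also run these checks from the command line. The following sketch assumes the instance name default-dsc; the pod search spans all namespaces because the namespace that hosts the workflow controller depends on where your pipeline servers are deployed.

  Example command-line verification

  # Confirm that the DataSciencePipelinesReady condition reports True.
  oc get datasciencecluster default-dsc \
    -o jsonpath='{.status.conditions[?(@.type=="DataSciencePipelinesReady")].status}'

  # Search for the embedded workflow controller pod in all namespaces.
  # No output confirms that the embedded controller is disabled.
  oc get pods --all-namespaces | grep ds-pipeline-workflow-controller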