
Chapter 4. Configuring pipelines with your own Argo Workflows instance


You can configure OpenShift AI to use an existing Argo Workflows instance instead of the embedded one included with Data Science Pipelines. This configuration is useful if your OpenShift cluster already includes a managed Argo Workflows instance and you want to integrate it with OpenShift AI pipelines without conflicts. Disabling the embedded Argo Workflows controller allows cluster administrators to manage the lifecycles of OpenShift AI and Argo Workflows independently.

Note

You cannot enable both the embedded Argo Workflows instance and your own Argo Workflows instance on the same cluster.

Prerequisites

  • You have cluster administrator privileges for your OpenShift cluster.
  • You have installed Red Hat OpenShift AI.

Procedure

  1. Log in to the OpenShift web console as a cluster administrator.
  2. In the OpenShift console, click Operators → Installed Operators.
  3. Search for the Red Hat OpenShift AI Operator, and then click the Operator name to open the Operator details page.
  4. Click the Data Science Cluster tab.
  5. Click the default instance name (for example, default-dsc) to open the instance details page.
  6. Click the YAML tab to show the instance specifications.
  7. Disable the embedded Argo Workflows controllers that are managed by the OpenShift AI Operator:

    1. In the spec.components section, set the value of the managementState field for the datasciencepipelines component to Managed.
    2. In the spec.components.datasciencepipelines section, set the value of the managementState field for argoWorkflowsControllers to Removed, as shown in the following example:

      Example datasciencepipelines specification

      # ...
      spec:
        components:
          datasciencepipelines:
            argoWorkflowsControllers:
              managementState: Removed
            managementState: Managed
      # ...

  8. Click Save to apply your changes.
  9. Install and configure a compatible version of Argo Workflows on your cluster. For compatible version information, see Supported Configurations. For installation information, see the Argo Workflows Installation documentation.
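If you prefer the CLI, steps 2 through 8 can be performed with a single `oc patch` command. The following is a minimal sketch, assuming the default DataScienceCluster instance is named `default-dsc` and you are logged in as a cluster administrator:

```shell
# Disable the embedded Argo Workflows controllers while keeping the
# datasciencepipelines component itself managed by the Operator.
oc patch datasciencecluster default-dsc --type merge \
  -p '{"spec":{"components":{"datasciencepipelines":{"managementState":"Managed","argoWorkflowsControllers":{"managementState":"Removed"}}}}}'
```

A merge patch only touches the fields shown, so the rest of your DataScienceCluster specification is left unchanged.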

Verification

  1. On the Details tab of the DataScienceCluster instance (for example, default-dsc), verify that DataSciencePipelinesReady has a Status of True.
  2. Verify that the ds-pipeline-workflow-controller pod does not exist:

    1. Go to Workloads → Pods.
    2. Search for the ds-pipeline-workflow-controller pod.
    3. Verify that this pod does not exist. The absence of this pod confirms that the embedded Argo Workflows controller is disabled.
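The same verification can be sketched from the CLI. This example assumes the default instance name `default-dsc` and that OpenShift AI components run in the `redhat-ods-applications` namespace; adjust both if your installation differs:

```shell
# Check that the DataSciencePipelinesReady condition reports True.
oc get datasciencecluster default-dsc \
  -o jsonpath='{.status.conditions[?(@.type=="DataSciencePipelinesReady")].status}'

# Confirm the embedded workflow controller pod is gone; no output
# from grep means the embedded controller is disabled.
oc get pods -n redhat-ods-applications | grep ds-pipeline-workflow-controller
```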