Chapter 3. Migrating from Jenkins to OpenShift Pipelines or Tekton
You can migrate your CI/CD workflows from Jenkins to Red Hat OpenShift Pipelines, a cloud-native CI/CD experience based on the Tekton project.
3.1. Comparison of Jenkins and OpenShift Pipelines concepts
You can review and compare the following equivalent terms used in Jenkins and OpenShift Pipelines.
3.1.1. Jenkins terminology
Jenkins offers declarative and scripted pipelines that are extensible using shared libraries and plugins. Some basic terms in Jenkins are as follows:
- Pipeline: Automates the entire process of building, testing, and deploying applications by using Groovy syntax.
- Node: A machine capable of either orchestrating or executing a scripted pipeline.
- Stage: A conceptually distinct subset of tasks performed in a pipeline. Plugins or user interfaces often use this block to display the status or progress of tasks.
- Step: A single task that specifies the exact action to be taken, either by using a command or a script.
3.1.2. OpenShift Pipelines terminology
OpenShift Pipelines uses YAML syntax for declarative pipelines and consists of tasks. Some basic terms in OpenShift Pipelines are as follows:
- Pipeline: A set of tasks in a series, in parallel, or both.
- Task: A sequence of steps as commands, binaries, or scripts.
- PipelineRun: Execution of a pipeline with one or more tasks.
- TaskRun: Execution of a task with one or more steps.
Note: You can initiate a PipelineRun or a TaskRun with a set of inputs such as parameters and workspaces, and the execution results in a set of outputs and artifacts (see the minimal PipelineRun sketch after this list).
- Workspace: In OpenShift Pipelines, workspaces are conceptual blocks that serve the following purposes:
- Storage of inputs, outputs, and build artifacts.
- Common space to share data among tasks.
- Mount points for credentials held in secrets, configurations held in config maps, and common tools shared by an organization.
Note: In Jenkins, there is no direct equivalent of OpenShift Pipelines workspaces. You can think of the control node as a workspace, as it stores the cloned code repository, build history, and artifacts. When a job is assigned to a different node, the cloned code and the generated artifacts are stored in that node, but the control node maintains the build history.
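For illustration, the following minimal PipelineRun sketch shows how a run supplies such inputs: a parameter value and a workspace bound to a persistent volume claim. The pipeline name example-pipeline, the revision parameter, and the claim name example-pvc are hypothetical placeholders, not part of the examples in this chapter.

apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: example-run-
spec:
  pipelineRef:
    name: example-pipeline
  # Inputs supplied at run time: parameters and workspaces
  params:
  - name: revision
    value: main
  workspaces:
  - name: shared-data
    persistentVolumeClaim:
      claimName: example-pvc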
3.1.3. Mapping of concepts
The building blocks of Jenkins and OpenShift Pipelines are not equivalent, and a specific comparison does not provide a technically accurate mapping. The following terms and concepts in Jenkins and OpenShift Pipelines correlate in general:
| Jenkins | OpenShift Pipelines |
| --- | --- |
| Pipeline | Pipeline and PipelineRun |
| Stage | Task |
| Step | A step in a task |
3.2. Migrating a sample pipeline from Jenkins to OpenShift Pipelines
You can use the following equivalent examples to help migrate your build, test, and deploy pipelines from Jenkins to OpenShift Pipelines.
3.2.1. Jenkins pipeline
Consider a Jenkins pipeline written in Groovy for building, testing, and deploying:
pipeline {
   agent any
   stages {
       stage('Build') {
           steps {
               sh 'make'
           }
       }
       stage('Test') {
           steps {
               sh 'make check'
               junit 'reports/**/*.xml'
           }
       }
       stage('Deploy') {
           steps {
               sh 'make publish'
           }
       }
   }
}
3.2.2. OpenShift Pipelines pipeline
To create a pipeline in OpenShift Pipelines that is equivalent to the preceding Jenkins pipeline, you create the following three tasks:
Example build task YAML definition file
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: myproject-build
spec:
  workspaces:
  - name: source
  steps:
  - image: my-ci-image
    command: ["make"]
    workingDir: $(workspaces.source.path)
Example test task YAML definition file
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: myproject-test
spec:
  workspaces:
  - name: source
  steps:
  - image: my-ci-image
    command: ["make", "check"]
    workingDir: $(workspaces.source.path)
  - image: junit-report-image
    script: |
      #!/usr/bin/env bash
      junit-report reports/**/*.xml
    workingDir: $(workspaces.source.path)
Example deploy task YAML definition file
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: myproject-deploy
spec:
  workspaces:
  - name: source
  steps:
  - image: my-deploy-image
    command: ["make", "deploy"]
    workingDir: $(workspaces.source.path)
You can combine the three tasks to form a pipeline in OpenShift Pipelines. To run the tasks sequentially, the test and deploy tasks reference the preceding task in the runAfter field:
Example: OpenShift Pipelines pipeline for building, testing, and deployment
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: myproject-pipeline
spec:
  workspaces:
  - name: shared-dir
  tasks:
  - name: build
    taskRef:
      name: myproject-build
    workspaces:
    - name: source
      workspace: shared-dir
  - name: test
    taskRef:
      name: myproject-test
    runAfter:
    - build
    workspaces:
    - name: source
      workspace: shared-dir
  - name: deploy
    taskRef:
      name: myproject-deploy
    runAfter:
    - test
    workspaces:
    - name: source
      workspace: shared-dir
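To execute this pipeline, create a PipelineRun that binds the shared-dir workspace to storage that all three tasks can share. The following is a minimal sketch; the persistent volume claim name myproject-pvc is an assumption and must already exist in your namespace.

apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: myproject-pipeline-run-
spec:
  pipelineRef:
    name: myproject-pipeline
  workspaces:
  # Bind the pipeline workspace to an existing claim so that
  # build, test, and deploy operate on the same files
  - name: shared-dir
    persistentVolumeClaim:
      claimName: myproject-pvc

Because the manifest uses generateName, create it with oc create -f rather than oc apply -f.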
3.3. Migrating from Jenkins plugins to Tekton Hub tasks
You can extend the capability of Jenkins by using plugins. To achieve similar extensibility in OpenShift Pipelines, use any of the tasks available from Tekton Hub.
For example, consider the git-clone task in Tekton Hub, which corresponds to the git plugin for Jenkins.
Example: git-clone task from Tekton Hub
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: demo-pipeline
spec:
  params:
  - name: repo_url
  - name: revision
  workspaces:
  - name: source
  tasks:
  - name: fetch-from-git
    taskRef:
      name: git-clone
    params:
    - name: url
      value: $(params.repo_url)
    - name: revision
      value: $(params.revision)
    workspaces:
    - name: output
      workspace: source
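Before the pipeline can reference git-clone, the task must be installed in your namespace, for example by applying its definition from Tekton Hub. The following PipelineRun sketch starts demo-pipeline; the repository URL, revision, and storage size are placeholder values.

apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: demo-pipeline-run-
spec:
  pipelineRef:
    name: demo-pipeline
  params:
  - name: repo_url
    value: https://github.com/example/myproject.git
  - name: revision
    value: main
  workspaces:
  # git-clone needs a writable workspace to check out the sources
  - name: source
    volumeClaimTemplate:
      spec:
        accessModes:
        - ReadWriteOnce
        resources:
          requests:
            storage: 1Gi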
3.4. Extending OpenShift Pipelines capabilities using custom tasks and scripts
If you do not find the right task in Tekton Hub, or if you need greater control over tasks, you can create custom tasks and scripts to extend the capabilities of OpenShift Pipelines.
Example: A custom task for running the maven test command
apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: maven-test
spec:
  workspaces:
  - name: source
  steps:
  - image: my-maven-image
    command: ["mvn", "test"]
    workingDir: $(workspaces.source.path)
Example: Run a custom shell script by providing its path
...
steps:
- image: ubuntu
  script: |
    #!/usr/bin/env bash
    /workspace/my-script.sh
...
Example: Run a custom Python script by writing it in the YAML file
...
steps:
- image: python
  script: |
    #!/usr/bin/env python3
    print("hello from python!")
...
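You can also run a custom task on its own by creating a TaskRun. The following minimal sketch runs the maven-test task shown earlier; the claim name source-pvc is an assumption, and the claim is expected to already contain the checked-out sources.

apiVersion: tekton.dev/v1beta1
kind: TaskRun
metadata:
  generateName: maven-test-run-
spec:
  taskRef:
    name: maven-test
  workspaces:
  # The task reads the project sources from this workspace
  - name: source
    persistentVolumeClaim:
      claimName: source-pvc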
3.5. Comparison of Jenkins and OpenShift Pipelines execution models
Jenkins and OpenShift Pipelines offer similar functions but are different in architecture and execution.
| Jenkins | OpenShift Pipelines |
| --- | --- |
| Jenkins has a controller node. Jenkins runs pipelines and steps centrally, or orchestrates jobs running in other nodes. | OpenShift Pipelines is serverless and distributed, and there is no central dependency for execution. |
| Containers are launched by the Jenkins controller node through the pipeline. | OpenShift Pipelines adopts a 'container-first' approach, where every step runs as a container in a pod (equivalent to nodes in Jenkins). |
| Extensibility is achieved by using plugins. | Extensibility is achieved by using tasks in Tekton Hub or by creating custom tasks and scripts. |
3.6. Examples of common use cases
Both Jenkins and OpenShift Pipelines offer capabilities for common CI/CD use cases, such as:
- Compiling, building, and deploying images using Apache Maven
- Extending the core capabilities by using plugins
- Reusing shareable libraries and custom scripts
3.6.1. Running a Maven pipeline in Jenkins and OpenShift Pipelines
You can use Maven in both Jenkins and OpenShift Pipelines workflows for compiling, building, and deploying images. To map your existing Jenkins workflow to OpenShift Pipelines, consider the following examples:
Example: Compile and build an image and deploy it to OpenShift using Maven in Jenkins
#!/usr/bin/groovy
node('maven') {
    stage 'Checkout'
    checkout scm

    stage 'Build'
    sh 'cd helloworld && mvn clean'
    sh 'cd helloworld && mvn compile'

    stage 'Run Unit Tests'
    sh 'cd helloworld && mvn test'

    stage 'Package'
    sh 'cd helloworld && mvn package'

    stage 'Archive artifact'
    sh 'mkdir -p artifacts/deployments && cp helloworld/target/*.war artifacts/deployments'
    archive 'helloworld/target/*.war'

    stage 'Create Image'
    sh 'oc login https://kubernetes.default -u admin -p admin --insecure-skip-tls-verify=true'
    sh 'oc new-project helloworldproject'
    sh 'oc project helloworldproject'
    sh 'oc process -f helloworld/jboss-eap70-binary-build.json | oc create -f -'
    sh 'oc start-build eap-helloworld-app --from-dir=artifacts/'

    stage 'Deploy'
    sh 'oc new-app helloworld/jboss-eap70-deploy.json'
}
Example: Compile and build an image and deploy it to OpenShift using Maven in OpenShift Pipelines
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: maven-pipeline
spec:
  workspaces:
  - name: shared-workspace
  - name: maven-settings
  - name: kubeconfig-dir
    optional: true
  params:
  - name: repo-url
  - name: revision
  - name: context-path
  tasks:
  - name: fetch-repo
    taskRef:
      name: git-clone
    workspaces:
    - name: output
      workspace: shared-workspace
    params:
    - name: url
      value: "$(params.repo-url)"
    - name: subdirectory
      value: ""
    - name: deleteExisting
      value: "true"
    - name: revision
      value: $(params.revision)
  - name: mvn-build
    taskRef:
      name: maven
    runAfter:
    - fetch-repo
    workspaces:
    - name: source
      workspace: shared-workspace
    - name: maven-settings
      workspace: maven-settings
    params:
    - name: CONTEXT_DIR
      value: "$(params.context-path)"
    - name: GOALS
      value: ["-DskipTests", "clean", "compile"]
  - name: mvn-tests
    taskRef:
      name: maven
    runAfter:
    - mvn-build
    workspaces:
    - name: source
      workspace: shared-workspace
    - name: maven-settings
      workspace: maven-settings
    params:
    - name: CONTEXT_DIR
      value: "$(params.context-path)"
    - name: GOALS
      value: ["test"]
  - name: mvn-package
    taskRef:
      name: maven
    runAfter:
    - mvn-tests
    workspaces:
    - name: source
      workspace: shared-workspace
    - name: maven-settings
      workspace: maven-settings
    params:
    - name: CONTEXT_DIR
      value: "$(params.context-path)"
    - name: GOALS
      value: ["package"]
  - name: create-image-and-deploy
    taskRef:
      name: openshift-client
    runAfter:
    - mvn-package
    workspaces:
    - name: manifest-dir
      workspace: shared-workspace
    - name: kubeconfig-dir
      workspace: kubeconfig-dir
    params:
    - name: SCRIPT
      value: |
        cd "$(params.context-path)"
        mkdir -p ./artifacts/deployments && cp ./target/*.war ./artifacts/deployments
        oc new-project helloworldproject
        oc project helloworldproject
        oc process -f jboss-eap70-binary-build.json | oc create -f -
        oc start-build eap-helloworld-app --from-dir=artifacts/
        oc new-app jboss-eap70-deploy.json
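To run this pipeline, create a PipelineRun that supplies the parameters and binds the workspaces. The following is a minimal sketch; the repository URL, revision, context path, and storage size are placeholder values, the optional kubeconfig-dir workspace is omitted, and the git-clone, maven, and openshift-client tasks are expected to be installed in the namespace.

apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  generateName: maven-pipeline-run-
spec:
  pipelineRef:
    name: maven-pipeline
  params:
  - name: repo-url
    value: https://github.com/example/helloworld.git
  - name: revision
    value: main
  - name: context-path
    value: helloworld
  workspaces:
  # Shared volume that carries the cloned sources through all tasks
  - name: shared-workspace
    volumeClaimTemplate:
      spec:
        accessModes:
        - ReadWriteOnce
        resources:
          requests:
            storage: 1Gi
  # No custom Maven settings are provided in this sketch
  - name: maven-settings
    emptyDir: {}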
3.6.2. Extending the core capabilities of Jenkins and OpenShift Pipelines by using plugins
Jenkins has the advantage of a large ecosystem of plugins developed over the years by its extensive user base. You can search and browse the plugins in the Jenkins Plugin Index.
OpenShift Pipelines also has many tasks developed and contributed by the community and enterprise users. A publicly available catalog of reusable OpenShift Pipelines tasks is available in the Tekton Hub.
In addition, many functions that require plugins in Jenkins are part of the core capabilities of OpenShift Pipelines. For example, authorization is a critical function in both Jenkins and OpenShift Pipelines. While Jenkins provides authorization through the Role-based Authorization Strategy plugin, OpenShift Pipelines uses the built-in role-based access control (RBAC) system of OpenShift.
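For example, the following RoleBinding sketch grants a group permission to work with pipelines in a project; the group name ci-developers and the namespace helloworldproject are assumptions, and the standard edit cluster role typically covers OpenShift Pipelines resources when the Operator is installed.

apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: pipeline-editors
  namespace: helloworldproject
subjects:
# Hypothetical group of CI developers
- kind: Group
  name: ci-developers
  apiGroup: rbac.authorization.k8s.io
roleRef:
  # Built-in edit cluster role, scoped to the project by this binding
  kind: ClusterRole
  name: edit
  apiGroup: rbac.authorization.k8s.io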
3.6.3. Sharing reusable code in Jenkins and OpenShift Pipelines
Jenkins shared libraries provide reusable code for parts of Jenkins pipelines. The libraries are shared between Jenkinsfiles to create highly modular pipelines without code repetition.
Although there is no direct equivalent of Jenkins shared libraries in OpenShift Pipelines, you can achieve similar workflows by using tasks from the Tekton Hub in combination with custom tasks and scripts.
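For example, a parameterized custom task can play a role similar to a shared library step: you define it once in a namespace and reference it from any pipeline by name with a taskRef. The following task is an illustrative sketch; its name, image, and parameter are hypothetical.

apiVersion: tekton.dev/v1beta1
kind: Task
metadata:
  name: shared-notify
spec:
  params:
  # Message text passed in by each pipeline that reuses the task
  - name: message
    type: string
  steps:
  - name: notify
    image: ubuntu
    script: |
      #!/usr/bin/env bash
      echo "$(params.message)"

Any pipeline in the namespace can then call this task with a taskRef and its own parameter value, much like invoking a shared library function from a Jenkinsfile.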