Chapter 7. Pipelines as Code command reference


You can use the tkn pac CLI tool to control Pipelines as Code. You can also configure Pipelines as Code logging with the TektonConfig custom resource and use the oc command to view Pipelines as Code logs.

7.1. Pipelines as Code command reference

The tkn pac CLI tool offers the following capabilities:

  • Bootstrap Pipelines as Code installation and configuration.
  • Create a new Pipelines as Code repository.
  • List all Pipelines as Code repositories.
  • Describe a Pipelines as Code repository and the associated runs.
  • Generate a simple pipeline run to get started.
  • Resolve a pipeline run as if it were executed by Pipelines as Code.
Tip

You can use these commands for testing and experimentation without making changes to the Git repository that contains the application source code.
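For example, if you edit a pipeline run template locally, you can test it without pushing the change. A minimal sketch, assuming a template at .tekton/pull-request.yaml and an active login to a cluster where Pipelines as Code is installed:

# Resolve the local template as Pipelines as Code would, then start it on the cluster
$ tkn pac resolve -f .tekton/pull-request.yaml | oc apply -f -

# Check the repository and the status of its runs
$ tkn pac list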

7.1.1. Basic syntax

$ tkn pac [command or options] [arguments]
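For example, the following invocations all follow this pattern:

$ tkn pac list                                  # a command with no arguments
$ tkn pac bootstrap --nightly                   # a command with an option
$ tkn pac resolve -f .tekton/pull-request.yaml  # a command with an option and its argument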

7.1.2. Global options

$ tkn pac --help
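The --help flag is also available on each subcommand, for example:

# List the options that are specific to the resolve subcommand
$ tkn pac resolve --help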

7.1.3. Utility commands

7.1.3.1. bootstrap

Table 7.1. Bootstrapping Pipelines as Code installation and configuration
Command | Description

tkn pac bootstrap

Installs and configures Pipelines as Code for Git repository hosting service providers, such as GitHub and GitHub Enterprise.

tkn pac bootstrap --nightly

Installs the nightly build of Pipelines as Code.

tkn pac bootstrap --route-url <public_url_to_ingress_spec>

Overrides the OpenShift route URL.

By default, tkn pac bootstrap detects the OpenShift route, which is automatically associated with the Pipelines as Code controller service.

If you do not have an OpenShift Container Platform cluster, it asks you for the public URL that points to the ingress endpoint.

tkn pac bootstrap github-app

Creates a GitHub application and secrets in the openshift-pipelines namespace.
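After bootstrapping, you can verify the result on the cluster. A minimal sketch, assuming the default openshift-pipelines namespace; the secret name pipelines-as-code-secret is an assumption and may differ in your installation:

# Check the route that bootstrap associates with the Pipelines as Code controller
$ oc get route -n openshift-pipelines

# Check that the GitHub App secret exists (the name is an assumption)
$ oc get secret pipelines-as-code-secret -n openshift-pipelines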

7.1.3.2. repository

Table 7.2. Managing Pipelines as Code repositories
Command | Description

tkn pac create repository

Creates a new Pipelines as Code repository and a namespace based on the pipeline run template.

tkn pac list

Lists all the Pipelines as Code repositories and displays the last status of the associated runs.

tkn pac repo describe

Describes a Pipelines as Code repository and the associated runs.
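For example, a typical sequence when onboarding a repository might look like the following; the create command typically prompts for details such as the Git repository URL:

# Create a Repository resource and a namespace for it
$ tkn pac create repository

# List all repositories and the status of their last runs
$ tkn pac list

# Show details and the associated runs for a repository
$ tkn pac repo describe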

7.1.3.3. generate

Table 7.3. Generating pipeline runs using Pipelines as Code
Command | Description

tkn pac generate

Generates a simple pipeline run.

When executed from the directory containing the source code, it automatically detects current Git information.

In addition, it uses a basic language detection capability and adds extra tasks depending on the detected language.

For example, if it detects a setup.py file at the repository root, the pylint task is automatically added to the generated pipeline run.
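For example, a typical invocation from a cloned repository might look like the following; the file name .tekton/pull-request.yaml is an assumption, because the command typically prompts you for the event type and output location:

# Run from the repository root so that Git and language detection work
$ cd my-app
$ tkn pac generate

# Review the generated template before committing it
$ cat .tekton/pull-request.yaml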

7.1.3.4. resolve

Table 7.4. Resolving and executing pipeline runs using Pipelines as Code
Command | Description

tkn pac resolve

Executes a pipeline run as if it were owned by the Pipelines as Code service.

tkn pac resolve -f .tekton/pull-request.yaml | oc apply -f -

Creates a live pipeline run on the cluster from the template in .tekton/pull-request.yaml.

When combined with a Kubernetes installation running on your local machine, this lets you observe the pipeline run without generating a new commit.

If you run the command from a source code repository, it attempts to detect the current Git information and automatically resolves parameters such as the current revision or branch.

tkn pac resolve -f .tekton/pr.yaml -p revision=main -p repo_name=<repository_name>

Executes a pipeline run by overriding default parameter values derived from the Git repository.

The -f option can also accept a directory path; in that case, the tkn pac resolve command processes all .yaml or .yml files in that directory. You can also use the -f flag multiple times in the same command.

You can override the default information gathered from the Git repository by specifying parameter values using the -p option. For example, you can use a Git branch as a revision and a different repository name.
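For example, the following sketch resolves every template in the .tekton directory, overrides the revision and repository name detected from Git, and applies the result to the current cluster; the parameter values are placeholders:

$ tkn pac resolve -f .tekton/ \
    -p revision=main \
    -p repo_name=<repository_name> | oc apply -f -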

7.2. Configuring Pipelines as Code logging

You can configure Pipelines as Code logging by editing the pac-config-logging config map in the TektonConfig custom resource (CR).

Prerequisites

  • You have Pipelines as Code installed on your cluster.

Procedure

  1. In the Administrator perspective of the web console, go to Administration → CustomResourceDefinitions.
  2. Use the Search by name field to search for the tektonconfigs.operator.tekton.dev custom resource definition (CRD) and click TektonConfig to view the CRD Details page.
  3. Click the Instances tab.
  4. Click the config instance to view the TektonConfig CR details.
  5. Click the YAML tab.
  6. Edit the fields with the loglevel. prefix under the .options.configMaps.pac-config-logging.data parameter based on your requirements.

    Example TektonConfig CR with the Pipelines as Code log level fields set to warn

    apiVersion: operator.tekton.dev/v1alpha1
    kind: TektonConfig
    metadata:
      name: config
    spec:
      platforms:
        openshift:
          pipelinesAsCode:
            options:
              configMaps:
                pac-config-logging:
                  data:
                    loglevel.pac-watcher: warn # 1
                    loglevel.pipelines-as-code-webhook: warn # 2
                    loglevel.pipelinesascode: warn # 3
                    zap-logger-config: |
                      {
                        "level": "info",
                        "development": false,
                        "sampling": {
                          "initial": 100,
                          "thereafter": 100
                        },
                        "outputPaths": ["stdout"],
                        "errorOutputPaths": ["stderr"],
                        "encoding": "json",
                        "encoderConfig": {
                          "timeKey": "ts",
                          "levelKey": "level",
                          "nameKey": "logger",
                          "callerKey": "caller",
                          "messageKey": "msg",
                          "stacktraceKey": "stacktrace",
                          "lineEnding": "",
                          "levelEncoder": "",
                          "timeEncoder": "iso8601",
                          "durationEncoder": "",
                          "callerEncoder": ""
                        }
                      }

    1. The log level for the pipelines-as-code-watcher component.
    2. The log level for the pipelines-as-code-webhook component.
    3. The log level for the pipelines-as-code-controller component.
  7. Optional: Create a custom logging config map for the Pipelines as Code components by changing the value of the CONFIG_LOGGING_NAME environment variable (.env.value) for each component under the .options.deployments field. The following example shows the configuration with a custom config map called custom-pac-config-logging.

    Example TektonConfig CR with the Pipelines as Code custom logging config map

    apiVersion: operator.tekton.dev/v1alpha1
    kind: TektonConfig
    metadata:
      name: config
    spec:
      platforms:
        openshift:
          pipelinesAsCode:
            enable: true
            options:
              configMaps:
                custom-pac-config-logging:
                  data:
                    loglevel.pac-watcher: warn
                    loglevel.pipelines-as-code-webhook: warn
                    loglevel.pipelinesascode: warn
                    zap-logger-config: |
                      {
                        "level": "info",
                        "development": false,
                        "sampling": {
                          "initial": 100,
                          "thereafter": 100
                        },
                        "outputPaths": ["stdout"],
                        "errorOutputPaths": ["stderr"],
                        "encoding": "json",
                        "encoderConfig": {
                          "timeKey": "ts",
                          "levelKey": "level",
                          "nameKey": "logger",
                          "callerKey": "caller",
                          "messageKey": "msg",
                          "stacktraceKey": "stacktrace",
                          "lineEnding": "",
                          "levelEncoder": "",
                          "timeEncoder": "iso8601",
                          "durationEncoder": "",
                          "callerEncoder": ""
                        }
                      }
              deployments:
                pipelines-as-code-controller:
                  spec:
                    template:
                      spec:
                        containers:
                        - name: pac-controller
                          env:
                          - name: CONFIG_LOGGING_NAME
                            value: custom-pac-config-logging
                pipelines-as-code-watcher:
                  spec:
                    template:
                      spec:
                        containers:
                        - name: pac-watcher
                          env:
                          - name: CONFIG_LOGGING_NAME
                            value: custom-pac-config-logging
                pipelines-as-code-webhook:
                  spec:
                    template:
                      spec:
                        containers:
                        - name: pac-webhook
                          env:
                          - name: CONFIG_LOGGING_NAME
                            value: custom-pac-config-logging
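After you save the change, you can confirm that the config map contains the new levels. A minimal verification sketch, assuming the default pac-config-logging config map in the openshift-pipelines namespace:

# Print the log level that is set for the Pipelines as Code controller
$ oc get configmap pac-config-logging -n openshift-pipelines \
    -o jsonpath='{.data.loglevel\.pipelinesascode}'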

7.3. Splitting Pipelines as Code logs by namespace

Pipelines as Code logs include namespace information, so you can filter the logs or split them by a particular namespace. For example, to view the Pipelines as Code logs related to the mynamespace namespace, enter the following command:

$ oc logs pipelines-as-code-controller-<unique-id> -n openshift-pipelines | grep mynamespace

Replace pipelines-as-code-controller-<unique-id> with the Pipelines as Code controller name.
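The same approach works for the other Pipelines as Code components. For example, the following sketch writes the filtered logs of the watcher component to a file; the deployment name matches the pipelines-as-code-watcher deployment shown in the previous section:

# Capture only the log lines that mention mynamespace from the watcher component
$ oc logs deployment/pipelines-as-code-watcher -n openshift-pipelines | grep mynamespace > mynamespace-watcher.log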

7.4. Additional resources
