Chapter 4. Working with pipeline logs


4.1. About pipeline logs

You can review and analyze the log for each step in a triggered pipeline run.

To help you troubleshoot and audit your pipelines, you can review and analyze these step logs by using the log viewer in the OpenShift AI dashboard. From here, you can search for specific log messages, view the log for each step, and download the step logs to your local machine.

If a step log exceeds the size that the log viewer can display, a warning appears above the log viewer stating that the log window displays partial content. Expanding the warning displays further information, for example, that the log viewer refreshes every three seconds and that each step log displays the last 500 lines of log messages received. In addition, you can click Download all step logs to download all of the step logs to your local machine.

Each step has a set of container logs. You can view these container logs by selecting a container from the Steps list in the log viewer. The step-main container log contains the log output for the step. The step-copy-artifact container log contains output relating to artifact data sent to S3-compatible storage. If the data transferred between the steps in your pipeline is larger than 3 KB, five container logs are typically available. These logs contain output relating to data transferred between your persistent volume claims (PVCs).
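If you prefer to inspect these container logs outside the dashboard, the following minimal sketch shows one possible approach with the Kubernetes Python client (installed with pip install kubernetes). The pod name and namespace are placeholders for illustration; only the container names, such as step-main, come from the description above.

    # Minimal sketch: read one container log for a pipeline step pod.
    # The pod name and namespace are placeholders; substitute values from your own run.
    from kubernetes import client, config

    config.load_kube_config()          # or config.load_incluster_config() inside a cluster
    v1 = client.CoreV1Api()

    log_text = v1.read_namespaced_pod_log(
        name="example-pipeline-step-pod",           # placeholder step pod name
        namespace="example-data-science-project",   # placeholder project namespace
        container="step-main",                      # container name shown in the Steps list
        tail_lines=500,                             # mirrors the log viewer's 500-line window
    )
    print(log_text)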

4.2. Viewing pipeline step logs

To help you troubleshoot and audit your pipelines, you can review and analyze the log of each pipeline step using the log viewer. From here, you can search for specific log messages and download the logs for each step in your pipeline. If the pipeline is running, you can also pause and resume the log from the log viewer.

Note

From OpenShift AI version 2.11, logs for Python scripts that run in Elyra pipelines are no longer stored in S3-compatible storage. Instead, you can view these logs in the pipeline step log viewer.

For this change to take effect, you must use the Elyra runtime images provided in workbench images at version 2024.1 or later.

If you have an older workbench image version, update the Version selection field to a compatible workbench image version, for example, 2024.1, as described in Updating a project workbench.

Updating your workbench image version clears any existing runtime image selections for your pipeline. After you update your workbench image version, open your workbench IDE and update the properties of your pipeline to select a runtime image.

Prerequisites

  • You have logged in to Red Hat OpenShift AI.
  • If you are using OpenShift AI groups, you are part of the user group or admin group (for example, rhoai-users or rhoai-admins) in OpenShift.
  • You have previously created a data science project that is available and contains a pipeline server.
  • You have imported a pipeline to an active pipeline server.
  • You have previously triggered a pipeline run.

Procedure

  1. From the OpenShift AI dashboard, click Data science pipelines → Runs.
  2. On the Runs page, from the Project drop-down list, select the project that you want to view pipeline step logs for.
  3. On the Runs page, click the name of the run that you want to view logs for.
  4. On the run details page, on the Graph tab, click the pipeline step that you want to view logs for.
  5. Click the Logs tab.
  6. To view the logs of another pipeline step, from the Steps list, select the step that you want to view logs for.
  7. Analyze the log using the log viewer.

    • To search for a specific log message, enter at least part of the message in the search bar.
    • To view the full log in a separate browser window, click the action menu (⋮) and select View raw logs. Alternatively, to expand the size of the log viewer, click the action menu (⋮) and select Expand.
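If you save a raw log to your local machine, for example through View raw logs or the Download button, you can also search it with a short script. The following is a minimal sketch; the file name and search term are placeholders.

    # Minimal sketch: search a saved step log for a specific message.
    # "step-main.log" and the search term are placeholders.
    search_term = "error"

    with open("step-main.log", encoding="utf-8") as log_file:
        for line_number, line in enumerate(log_file, start=1):
            if search_term.lower() in line.lower():
                print(f"{line_number}: {line.rstrip()}")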

Verification

  • You can view the logs for each step in your pipeline.

4.3. Downloading pipeline step logs

Instead of viewing the step logs of a pipeline run by using the log viewer on the OpenShift AI dashboard, you can download them for further analysis. You can choose to download the logs for all steps in your pipeline run, or you can download only the log for the step that is currently displayed in the log viewer.

Prerequisites

  • You have logged in to Red Hat OpenShift AI.
  • If you are using OpenShift AI groups, you are part of the user group or admin group (for example, rhoai-users or rhoai-admins) in OpenShift.
  • You have previously created a data science project that is available and contains a pipeline server.
  • You have imported a pipeline to an active pipeline server.
  • You have previously triggered a pipeline run.

Procedure

  1. From the OpenShift AI dashboard, click Data science pipelines → Runs.
  2. On the Runs page, from the Project drop-down list, select the project that you want to download logs for.
  3. On the Runs page, click the name of the run that you want to download logs for.
  4. On the run details page, on the Graph tab, click the pipeline step that you want to download logs for.
  5. Click the Logs tab.
  6. In the log viewer, click the Download button.

    1. Select Download current step log to download the log for the current pipeline step.
    2. Select Download all step logs to download the logs for all steps in your pipeline run.
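As an alternative to downloading logs from the dashboard, the following minimal sketch saves each container log of a step pod to a separate file by using the Kubernetes Python client. The pod name and namespace are placeholders; substitute values from your own pipeline run.

    # Minimal sketch: save every container log of a step pod to its own file.
    # The pod name and namespace are placeholders.
    from kubernetes import client, config

    config.load_kube_config()
    v1 = client.CoreV1Api()

    pod_name = "example-pipeline-step-pod"          # placeholder
    namespace = "example-data-science-project"      # placeholder

    pod = v1.read_namespaced_pod(name=pod_name, namespace=namespace)
    for container in pod.spec.containers:
        log_text = v1.read_namespaced_pod_log(
            name=pod_name,
            namespace=namespace,
            container=container.name,
        )
        with open(f"{pod_name}-{container.name}.log", "w", encoding="utf-8") as out_file:
            out_file.write(log_text)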

Verification

  • The step logs download to your browser’s default directory for downloaded files.