Chapter 8. Configuring OpenShift Serverless Functions

To simplify the deployment of your application code, you can use OpenShift Serverless to deploy stateless, event-driven functions as a Knative service on OpenShift Container Platform. Before you can develop functions, you must complete the setup steps.

8.1. Prerequisites

To enable the use of OpenShift Serverless Functions on your cluster, you must ensure that the following prerequisites are met:

  • The OpenShift Serverless Operator and Knative Serving are installed on your cluster.

    Note

    Functions are deployed as a Knative service. If you want to use event-driven architecture with your functions, you must also install Knative Eventing.

  • You have the oc CLI installed.
  • You have the Knative (kn) CLI installed. Installing the Knative CLI enables the use of kn func commands, which you can use to create and manage functions (see the workflow sketch after this list).
  • You have installed Docker Container Engine or Podman version 3.4.7 or higher.
  • You have access to an available image registry, such as the OpenShift Container Registry.
  • If you are using Quay.io as the image registry, you must ensure either that the repository is not private or that you have followed the OpenShift Container Platform documentation on Allowing pods to reference images from other secured registries.
  • If you are using the OpenShift Container Registry, a cluster administrator must expose the registry (a sketch of one documented approach follows this list).
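
As an illustration of the kn func workflow referenced in the prerequisites above, the following commands create a new function project and deploy it as a Knative service. This is a minimal sketch; the project name my-function and the choice of the node language are examples only, not part of the documented prerequisites:

    $ kn func create my-function --language node
    $ cd my-function
    $ kn func deploy

On the first deployment you typically also need to point the build at your image registry, for example by using the --registry flag.

If you are using the OpenShift Container Registry, one documented way for a cluster administrator to expose it is to enable the default route on the Image Registry Operator configuration:

    $ oc patch configs.imageregistry.operator.openshift.io/cluster --type merge -p '{"spec":{"defaultRoute":true}}'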

8.2. Setting up Podman

To use advanced container management features, you might want to use Podman with OpenShift Serverless Functions. To do so, you need to start the Podman service and configure the Knative (kn) CLI to connect to it.

Procedure

  1. Start the Podman service that serves the Docker API on a UNIX socket at ${XDG_RUNTIME_DIR}/podman/podman.sock:

    $ systemctl start --user podman.socket
    Note

    On most systems, this socket is located at /run/user/$(id -u)/podman/podman.sock.

  2. Set the environment variable that is used to build a function:

    $ export DOCKER_HOST="unix://${XDG_RUNTIME_DIR}/podman/podman.sock"
  3. Run the build command inside your function project directory with the -v flag to see verbose output. You should see a connection to your local UNIX socket (an optional verification sketch follows this procedure):

    $ kn func build -v
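
Before you run the build, you can optionally confirm that the Podman socket is active. This is a minimal check, not part of the documented procedure; it assumes systemd and curl are available and uses the Docker-compatible _ping endpoint that Podman serves on the socket:

    $ systemctl status --user podman.socket
    $ curl -s --unix-socket ${XDG_RUNTIME_DIR}/podman/podman.sock http://d/_ping

If the service is running, the curl command returns OK.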

8.3. Setting up Podman on macOS

To use advanced container management features, you might want to use Podman with OpenShift Serverless Functions. To do so on macOS, you need to start the Podman machine and configure the Knative (kn) CLI to connect to it.

Procedure

  1. Create the Podman machine:

    $ podman machine init --memory=8192 --cpus=2 --disk-size=20
  2. Start the Podman machine, which serves the Docker API on a UNIX socket:

    $ podman machine start
    Starting machine "podman-machine-default"
    Waiting for VM ...
    Mounting volume... /Users/myuser:/Users/user
    
    [...truncated output...]
    
    You can still connect Docker API clients by setting DOCKER_HOST using the
    following command in your terminal session:
    
    	export DOCKER_HOST='unix:///Users/myuser/.local/share/containers/podman/machine/podman-machine-default/podman.sock'
    
    Machine "podman-machine-default" started successfully
    Note

    On most macOS systems, this socket is located at /Users/myuser/.local/share/containers/podman/machine/podman-machine-default/podman.sock.

  3. Set the environment variable that is used to build a function:

    $ export DOCKER_HOST='unix:///Users/myuser/.local/share/containers/podman/machine/podman-machine-default/podman.sock'
  4. Run the build command inside your function project directory with the -v flag to see verbose output. You should see a connection to your local UNIX socket (an optional verification sketch follows this procedure):

    $ kn func build -v
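
Before you run the build, you can optionally confirm that the Podman machine is running. This is a brief check, not part of the documented procedure:

    $ podman machine list
    $ podman system connection list

The first command shows whether podman-machine-default is currently running; the second lists the connections that podman machine start configured.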

8.4. Next steps
