Chapter 3. Reviewing AI Inference Server Python packages
You can review the Python packages installed in the Red Hat AI Inference Server container image by running the container with Podman and reviewing the output of the pip list command.
Prerequisites
- You have installed Podman or Docker.
- You are logged in as a user with sudo access.
- You have access to registry.redhat.io and have logged in.
Procedure
Run the Red Hat AI Inference Server container image with the pip list command to view all installed Python packages. For example:

$ podman run --rm --entrypoint=/bin/bash \
    registry.redhat.io/rhaiis/vllm-cuda-rhel9:3.2.3 \
    -c "pip list"
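The full package list is long. If you only need to confirm that a particular package is present, you can filter the command output on the host. A minimal sketch, assuming the same image tag as above; the "vllm" search pattern is only an example:

```shell
# List the installed Python packages in the container image and
# filter the output on the host for a package name of interest.
# The image tag and the "vllm" pattern are examples; substitute your own.
podman run --rm --entrypoint=/bin/bash \
    registry.redhat.io/rhaiis/vllm-cuda-rhel9:3.2.3 \
    -c "pip list" | grep -i vllm
```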
To view detailed information about a specific package, run the Podman command with pip show <package_name>. For example:

$ podman run --rm --entrypoint=/bin/bash \
    registry.redhat.io/rhaiis/vllm-cuda-rhel9:3.2.3 \
    -c "pip show vllm"
Example output

Name: vllm
Version: v0.11.0
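If you need a record of the exact package versions, for example to compare two image releases, you can redirect pip freeze output to a file on the host. A hedged sketch, assuming the same image tag as above; the output filename rhaiis-packages.txt is only an example:

```shell
# Capture pinned package versions (name==version format) from the
# container image into a file on the host for later comparison.
# rhaiis-packages.txt is an example filename.
podman run --rm --entrypoint=/bin/bash \
    registry.redhat.io/rhaiis/vllm-cuda-rhel9:3.2.3 \
    -c "pip freeze" > rhaiis-packages.txt
```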