Chapter 3. Supported deployment environments
Red Hat AI Inference Server is supported in the following deployment environments.
| Environment | Supported versions | Deployment notes |
|---|---|---|
| OpenShift Container Platform (self‑managed) | 4.14 – 4.18 | Deploy on bare‑metal hosts or virtual machines. |
| Red Hat OpenShift Service on AWS (ROSA) | 4.14 – 4.18 | Requires ROSA STS cluster with GPU‑enabled P5 or G5 node types. |
| Red Hat Enterprise Linux (RHEL) | 9.2 – 10.0 | Deploy on bare‑metal hosts or virtual machines. |
| Linux (not RHEL) | - | Supported under third‑party policy deployed on bare‑metal hosts or virtual machines. OpenShift Container Platform Operators are not required. |
| Kubernetes (not OpenShift Container Platform) | - | Supported under third‑party policy deployed on bare‑metal hosts or virtual machines. |
Red Hat AI Inference Server is available only as a container image. The host operating system and kernel must support the required accelerator drivers. For more information, see Supported AI accelerators.
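Because the product ships only as a container image, a typical RHEL deployment runs it directly with a container engine such as Podman. The sketch below assumes an NVIDIA accelerator exposed through CDI; the image path, tag, and model name are illustrative placeholders, so substitute the exact values published for your release.

```shell
# Log in to the Red Hat container registry (valid credentials required).
podman login registry.redhat.io

# Run the inference server with GPU access.
# NOTE: image path, tag, and model name below are placeholders --
# check the product documentation for the values matching your release.
podman run --rm -it \
  --device nvidia.com/gpu=all \
  --shm-size=4g \
  -p 8000:8000 \
  -v ~/.cache/huggingface:/opt/app-root/src/.cache/huggingface:Z \
  registry.redhat.io/rhaiis/vllm-cuda-rhel9:latest \
  --model RedHatAI/Llama-3.2-1B-Instruct-FP8
```

The `--device nvidia.com/gpu=all` flag uses Podman's CDI support to pass the host GPUs into the container, which is why the host kernel and drivers must support the accelerator even though the server itself is containerized.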