Preface
You can run inference on large language models with Red Hat AI Inference Server without any connection to the external internet by installing OpenShift Container Platform and configuring a mirrored container image registry in the disconnected environment.
Important
Currently, only NVIDIA accelerators are supported in disconnected environments on OpenShift Container Platform.
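Mirroring a container image into the disconnected registry can be sketched with `skopeo`, as shown below. The image path and mirror registry host here are placeholders for illustration, not the actual Red Hat AI Inference Server image location; consult the product documentation for the correct image reference.

```shell
# Copy an image (all architectures) from a connected registry
# to the mirror registry inside the disconnected environment.
# Both image references below are placeholder examples.
skopeo copy --all \
  --authfile ./pull-secret.json \
  docker://registry.example.com/rhaiis/example-image:latest \
  docker://mirror.internal.example.com/rhaiis/example-image:latest
```

The `--all` flag preserves the full manifest list so that multi-architecture images remain usable after mirroring.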