Inference serving language models in OCI-compliant model containers
Red Hat AI Inference Server 3.2
Inferencing OCI-compliant models in Red Hat AI Inference Server
Abstract
Use OCI container mounts to move language models from a local or public registry to OpenShift clusters on a fully supported, GPU-accelerated path.