How to run and deploy LLMs using Red Hat OpenShift AI on a Red Hat OpenShift Service on AWS cluster

Learn how to install the Red Hat® OpenShift® AI (RHOAI) operator and Jupyter notebook, create an Amazon S3 bucket, and run an LLM on a Red Hat OpenShift Service on AWS (ROSA) cluster.

Disclaimer: this content is authored by Red Hat experts, but has not yet been tested on every supported configuration.

This learning path is for operations teams or system administrators.

Developers might want to check out how to create a natural language processing (NLP) application using Red Hat OpenShift AI on developers.redhat.com.


Prerequisites for deploying LLMs using Red Hat OpenShift AI on a Red Hat OpenShift Service on AWS cluster

2 mins

Before beginning, you’ll need a Red Hat® OpenShift® Service on AWS (ROSA) cluster. If you don’t have one already, see our learning path Getting Started with Red Hat OpenShift Service on AWS (ROSA) for instructions on deploying a cluster through the console interface.

What will you learn?

  • How to deploy a ROSA cluster
  • How to access the cluster console

What do you need before starting?

Steps for meeting the prerequisites

  1. Deploy your ROSA cluster.
    1. In this learning path, we’ll use a single-AZ ROSA 4.15.10 cluster with m5.4xlarge nodes and auto-scaling enabled up to 10 nodes. The cluster has 64 vCPUs and ~278Gi of memory.
    2. Note that ROSA and RHOAI also support GPUs; however, for the sake of simplicity, we’ll use CPU-only compute in this guide.
    3. Be sure that you have admin access to the cluster.
  2. Access the cluster console.
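The steps above can be sketched with the `rosa` CLI as an alternative to the console interface. This is a minimal, untested sketch: the cluster name `rosa-rhoai` and the region are placeholder values, and you should verify the flags against your installed `rosa` CLI version before running it.

```shell
# Log in with your Red Hat offline token (obtained from the Red Hat console)
rosa login

# Create a single-AZ ROSA 4.15.10 cluster with m5.4xlarge worker nodes
# and auto-scaling enabled up to 10 nodes.
# "rosa-rhoai" and "us-east-1" are placeholder values.
rosa create cluster \
  --cluster-name rosa-rhoai \
  --region us-east-1 \
  --version 4.15.10 \
  --compute-machine-type m5.4xlarge \
  --enable-autoscaling \
  --min-replicas 2 \
  --max-replicas 10 \
  --sts --mode auto --yes

# Follow the installation logs until the cluster is ready
rosa logs install --cluster rosa-rhoai --watch

# Create a cluster-admin user so you have admin access to the cluster
rosa create admin --cluster rosa-rhoai

# Print cluster details, including the console URL for step 2
rosa describe cluster --cluster rosa-rhoai
```

Once `rosa describe cluster` reports the cluster as ready, open the console URL it prints and log in with the admin credentials generated above.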

You are now ready to install Red Hat OpenShift AI (RHOAI) and Jupyter notebook in the next resource.
