How to run and deploy LLMs using Red Hat OpenShift AI on a Red Hat OpenShift Service on AWS cluster

Learn how to install the Red Hat® OpenShift® AI (RHOAI) operator and a Jupyter notebook, create an Amazon S3 bucket, and run an LLM on a Red Hat OpenShift Service on AWS (ROSA) cluster.

Disclaimer: This content is authored by Red Hat experts but has not yet been tested on every supported configuration.

This learning path is for operations teams or system administrators.

Developers might want to check out how to create a natural language processing (NLP) application using Red Hat OpenShift AI on developers.redhat.com.

Get started on developers.redhat.com

Next steps after running and deploying an LLM using Red Hat OpenShift AI

2 mins

Congratulations! You have deployed and trained an LLM using Red Hat® OpenShift® AI (RHOAI) on a Red Hat OpenShift Service on AWS (ROSA) cluster.

After completing this tutorial, you now have experience:

  • Installing RHOAI and a Jupyter notebook
  • Creating and granting access to an S3 bucket
  • Training an LLM

Future research

  • Performing hyperparameter tuning
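As a pointer for that future research, the sketch below illustrates the core idea behind hyperparameter tuning: exhaustively trying combinations from a search space and keeping the best-scoring one. The parameter names, candidate values, and the `evaluate` stand-in are purely illustrative assumptions; in practice the score would come from a real training-and-validation run in your RHOAI notebook.

```python
from itertools import product

# Hypothetical search space: learning rate and batch size (illustrative values only).
search_space = {
    "learning_rate": [1e-5, 3e-5, 5e-5],
    "batch_size": [8, 16],
}

def evaluate(params):
    # Placeholder for a real training-and-validation run in the notebook;
    # this toy score simply favors lower learning rates and larger batches.
    return 1.0 / (params["learning_rate"] * 1e5) + params["batch_size"] / 16

def grid_search(space, score_fn):
    """Try every parameter combination and return the best-scoring one."""
    keys = list(space)
    best_params, best_score = None, float("-inf")
    for values in product(*(space[k] for k in keys)):
        params = dict(zip(keys, values))
        score = score_fn(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best, score = grid_search(search_space, evaluate)
print(best)  # → {'learning_rate': 1e-05, 'batch_size': 16}
```

Grid search is the simplest strategy; for larger spaces, random search or Bayesian optimization typically finds good settings with far fewer training runs.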

What comes next?

Next, watch a demonstration of a typical RHOAI workflow that includes text-to-image generation, creating a project, launching a Jupyter notebook with appropriate cluster resources, and training a foundation model from Hugging Face with your own data. Once the model is fine-tuned, the demonstration also automates the build using a data science pipeline and serves the model for use in an AI-enabled application.
