Preface
You can deploy Mistral 3 models with Red Hat AI Inference Server, including the Mistral Large 3 Mixture-of-Experts model and the Ministral 3 dense model family, which is optimized for edge deployments. The Mistral 3 family includes models released under the Apache 2.0 license with open weights, making them suitable for on-premises and hybrid cloud deployments. All models provide native multimodal capabilities, tool calling support, and large context windows.