Chapter 1. Understand how AI assets map to the Red Hat Developer Hub Catalog


Important

This section describes Developer Preview features in the OpenShift AI Connector for Red Hat Developer Hub plugin. Developer Preview features are not supported by Red Hat in any way and are not functionally complete or production-ready. Do not use Developer Preview features for production or business-critical workloads. Developer Preview features provide early access to functionality in advance of possible inclusion in a Red Hat product offering. Customers can use these features to test functionality and provide feedback during the development process. Developer Preview features might not have any documentation, are subject to change or removal at any time, and have received limited testing. Red Hat might provide ways to submit feedback on Developer Preview features without an associated SLA.

For more information about the support scope of Red Hat Developer Preview features, see Developer Preview Support Scope.

The OpenShift AI Connector for Red Hat Developer Hub (OpenShift AI Connector for RHDH) serves as a crucial link, enabling the discovery and accessibility of AI assets managed within the Red Hat OpenShift AI offering directly within your RHDH instance.

For more information on model registry components, see Overview of model registries and model catalog.

1.1. Model-to-Entity mapping

Model-to-Entity mapping connects the OpenShift AI Connector for RHDH with the model catalog and KServe-based model deployments (InferenceServices). This integration automatically converts your AI/ML artifacts into familiar Backstage entities, simplifying management and giving your developer teams a unified view of the available AI models.

| RHOAI Artifact | RHDH/Backstage Entity Kind | RHDH/Backstage Entity Type | Purpose |
|---|---|---|---|
| Model Server (InferenceService) | Component | model-server | Represents a running, accessible AI model endpoint. See Configuring your model-serving platform. |
| AI Model (Model Registry Version) | Resource | ai-model | Represents the specific AI model artifact, for example, Llama-3-8B. |
| Model Server API Details | API | openapi (default) | Provides the OpenAPI/Swagger specification for the REST endpoint of the model. See Red Hat OpenShift AI: API Tiers. |
| Model Cards | TechDocs | N/A | Model cards from the RHOAI model catalog are associated with the Component and Resource entities. See Registering a model from the model catalog. |
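To make the mapping concrete, the following sketch shows what a model server and its model might look like as catalog entities. The entity names, owner, and descriptions are hypothetical illustrations based on the kinds and types in the table above, not the connector's exact output:

```yaml
# Hypothetical sketch of entities the connector could register.
# Names, owners, and descriptions are illustrative, not documented output.
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: llama-3-8b-server            # example name
  description: KServe InferenceService serving Llama-3-8B
spec:
  type: model-server                 # entity type from the mapping table
  lifecycle: production
  owner: ai-platform-team            # example owner
  dependsOn:
    - resource:llama-3-8b            # links the server to its model Resource
---
apiVersion: backstage.io/v1alpha1
kind: Resource
metadata:
  name: llama-3-8b
  description: Llama-3-8B model version from the RHOAI model registry
spec:
  type: ai-model                     # entity type from the mapping table
  owner: ai-platform-team
```

Modeling the running endpoint as a Component that `dependsOn` the model Resource mirrors standard Backstage practice, so existing catalog views and relations work unchanged for AI assets.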

After the OpenShift AI Connector for RHDH is installed and connected to RHOAI, information transfer begins automatically.

The connector propagates the following key data:

  • InferenceServices (Component type model-server):

    • URL of the OpenShift Route (if exposed).
    • URL of the Kubernetes Service.
    • Authentication requirement status.
  • Model registry (Resource type ai-model):

    • Model description, artifact URIs, and author/owner information.
  • Model catalog:

    • Links to the Model Card (as RHDH TechDocs).
    • Model license URL.
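The propagated details above typically surface in entity metadata. The sketch below shows one plausible shape for the Route URL, Service URL, and license link on a model-server Component; the URLs and link titles are hypothetical examples, not documented connector output:

```yaml
# Hypothetical example of propagated data on a model-server Component.
# All URLs and titles are placeholders for illustration.
apiVersion: backstage.io/v1alpha1
kind: Component
metadata:
  name: llama-3-8b-server
  links:
    - url: https://llama-3-8b.apps.example.com           # OpenShift Route URL (if exposed)
      title: Model endpoint (Route)
    - url: http://llama-3-8b.ai-demo.svc.cluster.local   # Kubernetes Service URL
      title: Model endpoint (Service)
    - url: https://example.com/llama-3-license           # license URL from the model catalog
      title: Model license
spec:
  type: model-server
  lifecycle: production
  owner: ai-platform-team
```

Because the data lands in standard fields such as `metadata.links`, it renders in the default RHDH entity pages without extra plugin configuration.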