Chapter 1. Getting started overview
Use Red Hat Streams for Apache Kafka to create and set up Kafka clusters, then connect your applications and services to those clusters.
This guide describes how to install and start using Streams for Apache Kafka on OpenShift Container Platform. You can install the Streams for Apache Kafka operator directly from the OperatorHub in the OpenShift web console. The Streams for Apache Kafka operator understands how to install and manage Kafka components. Installing from the OperatorHub provides a standard configuration of Streams for Apache Kafka that allows you to take advantage of automatic updates.
When the Streams for Apache Kafka operator is installed, it provides the resources to install instances of Kafka components. After installing a Kafka cluster, you can start producing and consuming messages.
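For example, once a Kafka cluster and a topic are running, you can try producing and consuming messages from temporary pods that run the console clients shipped in the Kafka container image. The following is only a minimal sketch: my-cluster, my-topic, and <kafka-image> are placeholders for your cluster name, topic name, and the Kafka image available to your cluster, and it assumes the default plain listener on port 9092.

# Run a temporary producer pod; type messages and press Enter to send them
oc run kafka-producer -ti --rm=true --restart=Never --image=<kafka-image> -- bin/kafka-console-producer.sh --bootstrap-server my-cluster-kafka-bootstrap:9092 --topic my-topic

# In another terminal, run a temporary consumer pod to read the messages back
oc run kafka-consumer -ti --rm=true --restart=Never --image=<kafka-image> -- bin/kafka-console-consumer.sh --bootstrap-server my-cluster-kafka-bootstrap:9092 --topic my-topic --from-beginning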
If you require more flexibility with your deployment, you can use the installation artifacts provided with Streams for Apache Kafka. For more information on using the installation artifacts, see Deploying and Managing Streams for Apache Kafka on OpenShift.
1.1. Prerequisites
The following prerequisites are required for getting started with Streams for Apache Kafka.
- You have a Red Hat account.
- JDK 11 or later is installed.
- An OpenShift 4.14 or later cluster is available.
- The OpenShift oc command-line tool is installed and configured to connect to the running cluster.
The steps to get started are based on using the OperatorHub in the OpenShift web console, but you’ll also use the OpenShift oc CLI tool to perform certain operations. You’ll need to connect to your OpenShift cluster using the oc tool.
- You can install the oc CLI tool from the web console by clicking the '?' help menu, then Command Line Tools.
- You can copy the required oc login details from the web console by clicking your profile name, then Copy login command.
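For example, after copying the login command from the web console, connecting from a terminal looks similar to the following, where the token and server URL are placeholders for the values shown for your cluster:

oc login --token=<token> --server=https://<api-server-host>:6443

# Confirm that you are connected as the expected user
oc whoami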