Chapter 5. Sending and receiving messages from a topic
Send messages to and receive messages from a Kafka cluster installed on OpenShift.
This procedure describes how to use Kafka clients to produce and consume messages. You can deploy clients to OpenShift or connect local Kafka clients to the OpenShift cluster. You can use either or both options to test your Kafka cluster installation. For the local clients, you access the Kafka cluster using an OpenShift route connection.
You will use the oc command-line tool to deploy and run the Kafka clients.
Prerequisites
- You have created a Kafka cluster on OpenShift.
For a local producer and consumer:
- You have created a route for external access to the Kafka cluster running in OpenShift.
- You can access the latest Kafka client binaries from the AMQ Streams software downloads page.
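For the route prerequisite, the bootstrap address used by the local clients is the host of the route that AMQ Streams creates for the external listener. As a minimal sketch, assuming an external route listener named listener1 on a Kafka cluster named my-cluster in the amq-streams-kafka namespace (the names used in the examples below), you can look up the host as follows:
Looking up the route bootstrap host (sketch)
oc get route my-cluster-kafka-listener1-bootstrap -n amq-streams-kafka -o jsonpath='{.status.ingress[0].host}'
Use the returned hostname with port 443 as the bootstrap server for the local producer and consumer.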
Sending and receiving messages from Kafka clients deployed to the OpenShift cluster
Deploy producer and consumer clients to the OpenShift cluster. You can then use the clients to send and receive messages from the Kafka cluster in the same namespace. The deployment uses the AMQ Streams container image for running Kafka.
- Use the oc command-line interface to deploy a Kafka producer.
  This example deploys a Kafka producer that connects to the Kafka cluster my-cluster. A topic named my-topic is created.
  Deploying a Kafka producer to OpenShift
  oc run kafka-producer -ti \
    --image=registry.redhat.io/amq-streams/kafka-34-rhel8:2.4.0 \
    --rm=true \
    --restart=Never \
    -- bin/kafka-console-producer.sh \
    --bootstrap-server my-cluster-kafka-bootstrap:9092 \
    --topic my-topic
Note: If the connection fails, check that the Kafka cluster is running and the correct cluster name is specified as the bootstrap-server.
- From the command prompt, enter a number of messages.
- Navigate in the OpenShift web console to the Home > Projects page and select the amq-streams-kafka project you created.
- From the list of pods, click kafka-producer to view the producer pod details.
- Select the Logs page to check that the messages you entered are present (a command-line alternative is sketched below).
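As an alternative to the web console, you can check the producer output from the command line while the pod is running. This is a sketch that assumes the kafka-producer pod name from the oc run command above; because the pod is started with --rm=true, its logs are only available until the pod exits:
Checking the producer pod logs from the command line (sketch)
oc logs -f kafka-producer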
- Use the oc command-line interface to deploy a Kafka consumer.
  Deploying a Kafka consumer to OpenShift
  oc run kafka-consumer -ti \
    --image=registry.redhat.io/amq-streams/kafka-34-rhel8:2.4.0 \
    --rm=true \
    --restart=Never \
    -- bin/kafka-console-consumer.sh \
    --bootstrap-server my-cluster-kafka-bootstrap:9092 \
    --topic my-topic \
    --from-beginning
The consumer consumes the messages produced to my-topic.
- From the command prompt, confirm that you see the incoming messages in the consumer console.
- Navigate in the OpenShift web console to the Home > Projects page and select the amq-streams-kafka project you created.
- From the list of pods, click kafka-consumer to view the consumer pod details.
- Select the Logs page to check that the messages you consumed are present.
Sending and receiving messages from Kafka clients running locally
Use a command-line interface to run a Kafka producer and consumer on a local machine.
- Download and extract the AMQ Streams <version> binaries from the AMQ Streams software downloads page.
- Unzip the amq-streams-<version>-bin.zip file to any destination.
- Open a command-line interface, and start the Kafka console producer with the topic my-topic and the authentication properties for TLS.
  Add the properties that are required for accessing the Kafka broker with an OpenShift route:
  - Use the hostname and port 443 for the OpenShift route you are using.
  - Use the password and a reference to the truststore you created for the broker certificate (a sketch for creating the truststore follows this list).
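If you have not yet created the truststore, here is a minimal sketch, assuming a Kafka cluster named my-cluster in the amq-streams-kafka namespace and the truststore password used in the examples below. The cluster CA certificate is extracted from the secret that AMQ Streams generates for the cluster and imported into a new client.truststore.jks file:
Creating a client truststore from the cluster CA certificate (sketch)
# Extract the cluster CA certificate from the <cluster-name>-cluster-ca-cert secret
oc extract secret/my-cluster-cluster-ca-cert -n amq-streams-kafka --keys=ca.crt --to=- > ca.crt
# Import the certificate into a new truststore protected with the password used by the clients
keytool -keystore client.truststore.jks -alias CARoot -import -file ca.crt -storepass password -noprompt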
Starting a local Kafka producer
kafka-console-producer.sh \
  --bootstrap-server my-cluster-kafka-listener1-bootstrap-amq-streams-kafka.apps.ci-ln-50kcyvt-72292.origin-ci-int-gce.dev.rhcloud.com:443 \
  --producer-property security.protocol=SSL \
  --producer-property ssl.truststore.password=password \
  --producer-property ssl.truststore.location=client.truststore.jks \
  --topic my-topic
- Type your message into the command-line interface where the producer is running.
- Press Enter to send the message.
- Open a new command-line interface tab or window, and start the Kafka console consumer to receive the messages. Use the same connection details as the producer.
Starting a local Kafka consumer
kafka-console-consumer.sh \
  --bootstrap-server my-cluster-kafka-listener1-bootstrap-amq-streams-kafka.apps.ci-ln-50kcyvt-72292.origin-ci-int-gce.dev.rhcloud.com:443 \
  --consumer-property security.protocol=SSL \
  --consumer-property ssl.truststore.password=password \
  --consumer-property ssl.truststore.location=client.truststore.jks \
  --topic my-topic \
  --from-beginning
- Confirm that you see the incoming messages in the consumer console.
- Press Ctrl+C to exit the Kafka console producer and consumer.