
Chapter 5. Sending and receiving messages from a topic


Send messages to and receive messages from a Kafka cluster installed on OpenShift.

This procedure describes how to use Kafka clients to produce and consume messages. You can deploy clients to OpenShift or connect local Kafka clients to the OpenShift cluster. You can use either or both options to test your Kafka cluster installation. For the local clients, you access the Kafka cluster using an OpenShift route connection.

You will use the oc command-line tool to deploy and run the Kafka clients.

Prerequisites

  • A Kafka cluster is running on the OpenShift cluster.

For a local producer and consumer:

  • An OpenShift route exposes the Kafka cluster for external client access.
  • A client truststore that contains the broker certificate, and its password, for TLS connections.

Sending and receiving messages from Kafka clients deployed to the OpenShift cluster

Deploy producer and consumer clients to the OpenShift cluster. You can then use the clients to send and receive messages from the Kafka cluster in the same namespace. The deployment uses the Streams for Apache Kafka container image for running Kafka.

  1. Use the oc command-line interface to deploy a Kafka producer.

    This example deploys a Kafka producer that connects to the Kafka cluster my-cluster. A topic named my-topic is created.

    Deploying a Kafka producer to OpenShift

    oc run kafka-producer -ti \
    --image=registry.redhat.io/amq-streams/kafka-39-rhel9:2.9.3 \
    --rm=true \
    --restart=Never \
    -- bin/kafka-console-producer.sh \
    --bootstrap-server my-cluster-kafka-bootstrap:9092 \
    --topic my-topic

    Note

    If the connection fails, check that the Kafka cluster is running and that the correct cluster name is specified in the --bootstrap-server address.
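    The cluster state can also be checked from the command line. This is a sketch; the resource names assume the my-cluster example used above, and the strimzi.io/cluster pod label assumes a Streams for Apache Kafka (Strimzi) deployment:

    ```shell
    # Check that the Kafka custom resource reports a Ready condition
    oc get kafka my-cluster -o jsonpath='{.status.conditions[?(@.type=="Ready")].status}'

    # Confirm the broker pods are running and the bootstrap service exists
    oc get pods -l strimzi.io/cluster=my-cluster
    oc get service my-cluster-kafka-bootstrap
    ```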

  2. From the command prompt, enter a number of messages.
  3. Navigate in the OpenShift web console to the Home > Projects page and select the streams-kafka project you created.
  4. From the list of pods, click kafka-producer to view the producer pod details.
  5. Select the Logs page to check that the messages you entered are present.
  6. Use the oc command-line interface to deploy a Kafka consumer.

    Deploying a Kafka consumer to OpenShift

    oc run kafka-consumer -ti \
    --image=registry.redhat.io/amq-streams/kafka-39-rhel9:2.9.3 \
    --rm=true \
    --restart=Never \
    -- bin/kafka-console-consumer.sh \
    --bootstrap-server my-cluster-kafka-bootstrap:9092 \
    --topic my-topic \
    --from-beginning

    The consumer consumes the messages produced to my-topic.

  7. From the command prompt, confirm that you see the incoming messages in the consumer console.
  8. Navigate in the OpenShift web console to the Home > Projects page and select the streams-kafka project you created.
  9. From the list of pods, click kafka-consumer to view the consumer pod details.
  10. Select the Logs page to check the messages you consumed are present.
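As an alternative to the web console steps above, the pod logs can be checked with the oc CLI while the clients are still running (a sketch; the pods are removed when they exit because of --rm=true):

```shell
# From a second terminal, while the client pods are running
oc logs kafka-producer
oc logs kafka-consumer

# Follow the consumer log as new messages arrive
oc logs -f kafka-consumer
```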

Sending and receiving messages from Kafka clients running locally

Use a command-line interface to run a Kafka producer and consumer on a local machine.

  1. Download and extract the Streams for Apache Kafka <version> binaries from the Streams for Apache Kafka software downloads page.

    Unzip the amq-streams-<version>-bin.zip file to any destination.
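    For example, extraction might look like the following. The kafka_<scala>-<version> directory name is an assumption about the archive layout; adjust the paths to match your download:

    ```shell
    # Hypothetical paths: adjust to the actual archive and directory names
    unzip amq-streams-<version>-bin.zip -d ~/amq-streams
    cd ~/amq-streams/kafka_<scala>-<version>

    # The console producer and consumer scripts live under bin/
    ls bin/kafka-console-producer.sh bin/kafka-console-consumer.sh
    ```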

  2. Open a command-line interface, and start the Kafka console producer with the topic my-topic and the authentication properties for TLS.

    Add the properties that are required for accessing the Kafka broker with an OpenShift route.

    • Use the hostname and port 443 for the OpenShift route you are using.
    • Use the password and reference to the truststore you created for the broker certificate.

      Starting a local Kafka producer

      kafka-console-producer.sh \
      --bootstrap-server my-cluster-kafka-listener1-bootstrap-streams-kafka.apps.ci-ln-50kcyvt-72292.origin-ci-int-gce.dev.rhcloud.com:443 \
      --producer-property security.protocol=SSL \
      --producer-property ssl.truststore.password=password \
      --producer-property ssl.truststore.location=client.truststore.jks \
      --topic my-topic
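      The client.truststore.jks file referenced above must contain the cluster CA certificate so the client trusts the route's TLS certificate. One way to create it, assuming the Streams for Apache Kafka naming convention for the CA secret (my-cluster-cluster-ca-cert), is:

      ```shell
      # Extract the cluster CA certificate from the CA secret (assumed name)
      oc extract secret/my-cluster-cluster-ca-cert --keys=ca.crt --to=- > ca.crt

      # Import it into a new truststore; the password must match ssl.truststore.password
      keytool -keystore client.truststore.jks -alias CARoot \
        -import -file ca.crt -storepass password -noprompt
      ```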

  3. Type your message into the command-line interface where the producer is running.
  4. Press Enter to send the message.
  5. Open a new command-line interface tab or window, and start the Kafka console consumer to receive the messages.

    Use the same connection details as the producer.

    Starting a local Kafka consumer

    kafka-console-consumer.sh \
    --bootstrap-server my-cluster-kafka-listener1-bootstrap-streams-kafka.apps.ci-ln-50kcyvt-72292.origin-ci-int-gce.dev.rhcloud.com:443 \
    --consumer-property security.protocol=SSL \
    --consumer-property ssl.truststore.password=password \
    --consumer-property ssl.truststore.location=client.truststore.jks \
    --topic my-topic --from-beginning
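    The --from-beginning flag replays the topic from the earliest retained offset; omit it to receive only messages produced after the consumer starts. If you run several consumers, a shared consumer group splits the topic's partitions between them. A sketch, where the group name my-group is illustrative and <route-hostname> stands for the route host used above:

    ```shell
    kafka-console-consumer.sh \
    --bootstrap-server <route-hostname>:443 \
    --consumer-property security.protocol=SSL \
    --consumer-property ssl.truststore.password=password \
    --consumer-property ssl.truststore.location=client.truststore.jks \
    --group my-group \
    --topic my-topic
    ```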

  6. Confirm that you see the incoming messages in the consumer console.
  7. Press Ctrl+C to exit the Kafka console producer and consumer.