
Chapter 5. Sending and receiving messages from a topic


Send messages to and receive messages from a Kafka cluster installed on OpenShift.

This procedure describes how to use Kafka clients to produce and consume messages. You can deploy clients to OpenShift or connect local Kafka clients to the OpenShift cluster. You can use either or both options to test your Kafka cluster installation. For the local clients, you access the Kafka cluster using an OpenShift route connection.

You will use the oc command-line tool to deploy and run the Kafka clients.

Prerequisites

For a local producer and consumer:
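The local client steps below reference a truststore you created for the broker certificate. If you still need to create one, this is a minimal sketch, assuming the cluster is named my-cluster and the Java keytool is available; the oc and keytool commands need a live cluster and a JDK, so they are shown as comments.

```shell
# Sketch: build client.truststore.jks from the cluster CA certificate.
# Assumes the cluster name my-cluster; the commands below need a live
# cluster and a JDK, so they are shown as comments.
#   oc get secret my-cluster-cluster-ca-cert \
#     -o jsonpath='{.data.ca\.crt}' | base64 -d > ca.crt
#   keytool -keystore client.truststore.jks -alias CARoot \
#     -import -file ca.crt -storepass password -noprompt
TRUSTSTORE=client.truststore.jks   # path the local clients reference later
```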

Sending and receiving messages from Kafka clients deployed to the OpenShift cluster

Deploy producer and consumer clients to the OpenShift cluster. You can then use the clients to send and receive messages from the Kafka cluster in the same namespace. The deployment uses the Streams for Apache Kafka container image for running Kafka.

  1. Use the oc command-line interface to deploy a Kafka producer.

This example deploys a Kafka producer that connects to the Kafka cluster my-cluster. A topic named my-topic is created.

    Deploying a Kafka producer to OpenShift

    oc run kafka-producer -ti \
    --image=registry.redhat.io/amq-streams/kafka-41-rhel9:3.1.0 \
    --rm=true \
    --restart=Never \
    -- bin/kafka-console-producer.sh \
    --bootstrap-server my-cluster-kafka-bootstrap:9092 \
    --topic my-topic

    Note

    If the connection fails, check that the Kafka cluster is running and the correct cluster name is specified as the bootstrap-server.
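A few quick checks for the failure case described in the note; these need the live cluster, so the oc commands are shown as comments (the namespace streams-kafka is assumed from the web console steps below).

```shell
# Quick checks when the producer cannot reach the cluster.
# (Need a live cluster; the namespace streams-kafka is an assumption.)
#   oc get kafka my-cluster -n streams-kafka   # is the Kafka resource ready?
#   oc get pods -n streams-kafka               # are the broker pods Running?
#   oc get svc -n streams-kafka                # is the bootstrap service present?
BOOTSTRAP=my-cluster-kafka-bootstrap:9092     # value passed to --bootstrap-server
```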

  2. From the command prompt, enter a number of messages.
  3. Navigate in the OpenShift web console to the Home > Projects page and select the streams-kafka project you created.
  4. From the list of pods, click kafka-producer to view the producer pod details.
  5. Select the Logs page to check that the messages you entered are present.
  6. Use the oc command-line interface to deploy a Kafka consumer.

    Deploying a Kafka consumer to OpenShift

    oc run kafka-consumer -ti \
    --image=registry.redhat.io/amq-streams/kafka-41-rhel9:3.1.0 \
    --rm=true \
    --restart=Never \
    -- bin/kafka-console-consumer.sh \
    --bootstrap-server my-cluster-kafka-bootstrap:9092 \
    --topic my-topic \
    --from-beginning

    The consumer consumes the messages produced to my-topic.

  7. From the command prompt, confirm that you see the incoming messages in the consumer console.
  8. Navigate in the OpenShift web console to the Home > Projects page and select the streams-kafka project you created.
  9. From the list of pods, click kafka-consumer to view the consumer pod details.
  10. Select the Logs page to check that the messages you consumed are present.
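As an alternative to typing messages interactively in step 2, you can pipe a prepared file into the console producer. A sketch follows; the oc run command itself needs the live cluster, so it is shown as a comment.

```shell
# Prepare a batch of messages, one per line.
printf '%s\n' "message 1" "message 2" "message 3" > messages.txt
# Pipe them into the console producer (needs the live cluster):
#   cat messages.txt | oc run kafka-producer -i --rm=true --restart=Never \
#     --image=registry.redhat.io/amq-streams/kafka-41-rhel9:3.1.0 \
#     -- bin/kafka-console-producer.sh \
#     --bootstrap-server my-cluster-kafka-bootstrap:9092 --topic my-topic
```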

Sending and receiving messages from Kafka clients running locally

Use a command-line interface to run a Kafka producer and consumer on a local machine.

  1. Download and extract the Streams for Apache Kafka <version> binaries from the Streams for Apache Kafka software downloads page.

    Unzip the amq-streams-<version>-bin.zip file to any destination.

  2. Open a command-line interface, and start the Kafka console producer with the topic my-topic and the authentication properties for TLS.

    Add the properties that are required for accessing the Kafka broker with an OpenShift route.

    • Use the hostname and port 443 for the OpenShift route you are using.
    • Use the password and reference to the truststore you created for the broker certificate.

      Starting a local Kafka producer

      kafka-console-producer.sh \
      --bootstrap-server my-cluster-kafka-listener1-bootstrap-streams-kafka.apps.ci-ln-50kcyvt-72292.origin-ci-int-gce.dev.rhcloud.com:443 \
      --producer-property security.protocol=SSL \
      --producer-property ssl.truststore.password=password \
      --producer-property ssl.truststore.location=client.truststore.jks \
      --topic my-topic
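Instead of repeating the security settings as individual --producer-property flags, you can collect them into a properties file and pass it with --producer.config (or --consumer.config for the consumer). A sketch using the same truststore values as above; <route-host> stands in for the full route hostname.

```shell
# Collect the TLS settings used above into a reusable properties file.
cat > client-ssl.properties <<'EOF'
security.protocol=SSL
ssl.truststore.location=client.truststore.jks
ssl.truststore.password=password
EOF
# Pass the file to the console clients (needs the live cluster):
#   kafka-console-producer.sh \
#     --bootstrap-server <route-host>:443 \
#     --producer.config client-ssl.properties --topic my-topic
```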

  3. Type your message into the command-line interface where the producer is running.
  4. Press Enter to send the message.
  5. Open a new command-line interface tab or window, and start the Kafka console consumer to receive the messages.

    Use the same connection details as the producer.

    Starting a local Kafka consumer

    kafka-console-consumer.sh \
    --bootstrap-server my-cluster-kafka-listener1-bootstrap-streams-kafka.apps.ci-ln-50kcyvt-72292.origin-ci-int-gce.dev.rhcloud.com:443 \
    --consumer-property security.protocol=SSL \
    --consumer-property ssl.truststore.password=password \
    --consumer-property ssl.truststore.location=client.truststore.jks \
    --topic my-topic --from-beginning

  6. Confirm that you see the incoming messages in the consumer console.
  7. Press Ctrl+C to exit the Kafka console producer and consumer.
© 2026 Red Hat