
Chapter 33. Kafka Sink


Send data to Kafka topics.

The Kamelet can use the following message headers, if they are set:

  • key / ce-key: used as the message key
  • partition-key / ce-partitionkey: used as the message partition key

Both headers are optional.
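
For illustration, a Camel route can populate these headers before calling the Kamelet. The following is a minimal sketch in the Camel YAML DSL; the broker address, topic, credentials, and key value are placeholders rather than values defined in this chapter:

- from:
    uri: "timer:tick"
    steps:
      - setHeader:
          name: key
          constant: "my-record-key"   # used as the Kafka message key
      - setBody:
          constant: "hello from Camel"
      - to: "kamelet:kafka-sink?bootstrapServers=my-cluster-kafka-bootstrap:9092&topic=my-topic&user=my-user&password=my-password"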

33.1. Configuration Options

The following table summarizes the configuration options available for the kafka-sink Kamelet:

| Property | Name | Description | Type | Default | Example |
| --- | --- | --- | --- | --- | --- |
| bootstrapServers * | Brokers | Comma-separated list of Kafka broker URLs | string | | |
| password * | Password | Password to authenticate to Kafka | string | | |
| topic * | Topic Names | Comma-separated list of Kafka topic names | string | | |
| user * | Username | Username to authenticate to Kafka | string | | |
| saslMechanism | SASL Mechanism | The Simple Authentication and Security Layer (SASL) mechanism used | string | "PLAIN" | |
| securityProtocol | Security Protocol | Protocol used to communicate with brokers. SASL_PLAINTEXT, PLAINTEXT, SASL_SSL and SSL are supported | string | "SASL_SSL" | |

Note

Fields marked with an asterisk (*) are mandatory.
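
The optional properties can be set alongside the mandatory ones in the properties block of a binding. The following fragment is only an illustrative sketch; the broker address, topic, and credentials are placeholders, and the SCRAM mechanism is an assumption about your broker configuration:

properties:
  bootstrapServers: "my-cluster-kafka-bootstrap:9092"
  topic: "my-topic"
  user: "my-user"
  password: "my-password"
  saslMechanism: "SCRAM-SHA-512"   # overrides the default "PLAIN"
  securityProtocol: "SASL_SSL"     # same as the default, shown explicitly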

33.2. Dependencies

At runtime, the kafka-sink Kamelet relies upon the presence of the following dependencies:

  • camel:kafka
  • camel:kamelet

33.3. Usage

This section describes how you can use the kafka-sink Kamelet.

33.3.1. Knative Sink

You can use the kafka-sink Kamelet as a Knative sink by binding it to a Knative object.

kafka-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-sink
    properties:
      bootstrapServers: "The Brokers"
      password: "The Password"
      topic: "The Topic Names"
      user: "The Username"

33.3.1.1. Prerequisite

Make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster that you are connected to.

33.3.1.2. Procedure for using the cluster CLI

  1. Save the kafka-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f kafka-sink-binding.yaml
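
  3. Optionally, check that the binding was created. The exact output depends on your cluster:

    oc get kameletbinding kafka-sink-binding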

33.3.1.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind channel:mychannel kafka-sink -p "sink.bootstrapServers=The Brokers" -p "sink.password=The Password" -p "sink.topic=The Topic Names" -p "sink.user=The Username"

This command creates the KameletBinding in the current namespace on the cluster.
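
The binding runs as a Camel K integration with the same name, so you can typically follow its logs with the Kamel CLI, for example:

kamel logs kafka-sink-binding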

33.3.2. Kafka Sink

You can use the kafka-sink Kamelet as a Kafka sink by binding it to a Kafka topic.

kafka-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-sink
    properties:
      bootstrapServers: "The Brokers"
      password: "The Password"
      topic: "The Topic Names"
      user: "The Username"

33.3.2.1. Prerequisites

Ensure that you have installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster that you are connected to.
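
If you still need to create the topic, you can apply a Strimzi KafkaTopic resource similar to the following minimal sketch. It assumes an AMQ Streams (Strimzi) managed Kafka cluster named my-cluster in the same namespace; adjust the cluster label, partitions, and replicas to your environment:

apiVersion: kafka.strimzi.io/v1beta1
kind: KafkaTopic
metadata:
  name: my-topic
  labels:
    strimzi.io/cluster: my-cluster   # assumed cluster name
spec:
  partitions: 1
  replicas: 1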

33.3.2.2. Procedure for using the cluster CLI

  1. Save the kafka-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f kafka-sink-binding.yaml

33.3.2.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic kafka-sink -p "sink.bootstrapServers=The Brokers" -p "sink.password=The Password" -p "sink.topic=The Topic Names" -p "sink.user=The Username"

This command creates the KameletBinding in the current namespace on the cluster.

33.4. Kamelet source file

https://github.com/openshift-integration/kamelet-catalog/blob/kamelet-catalog-1.6/kafka-sink.kamelet.yaml
