Chapter 33. Kafka Sink
Send data to Kafka topics.
The Kamelet can use the following headers, if set:

- `key` / `ce-key`: the message key
- `partition-key` / `ce-partitionkey`: the message partition key

Both headers are optional.
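For illustration, a route written in the Camel YAML DSL might set the message key before sending to the Kamelet. This is a minimal sketch, not part of this chapter's procedures; the timer source, the constant key value, and the connection parameters shown in the endpoint URI are assumptions:

```yaml
# Hypothetical route: set the "key" header so kafka-sink uses it as the record key
- from:
    uri: "timer:tick"
    steps:
      - setHeader:
          name: key
          constant: "order-42"
      - to: "kamelet:kafka-sink?bootstrapServers=my-broker:9092&topic=my-topic&user=me&password=secret"
```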
33.1. Configuration Options
The following table summarizes the configuration options available for the kafka-sink Kamelet:
| Property | Name | Description | Type | Default | Example |
|---|---|---|---|---|---|
| bootstrapServers * | Brokers | Comma-separated list of Kafka broker URLs | string | | |
| password * | Password | Password to authenticate to Kafka | string | | |
| topic * | Topic Names | Comma-separated list of Kafka topic names | string | | |
| user * | Username | Username to authenticate to Kafka | string | | |
| saslMechanism | SASL Mechanism | The Simple Authentication and Security Layer (SASL) mechanism used. | string | | |
| securityProtocol | Security Protocol | Protocol used to communicate with brokers. SASL_PLAINTEXT, PLAINTEXT, SASL_SSL, and SSL are supported. | string | | |
Fields marked with an asterisk (*) are mandatory.
33.2. Dependencies
At runtime, the `kafka-sink` Kamelet relies upon the presence of the following dependencies:
- camel:kafka
- camel:kamelet
33.3. Usage
This section describes how you can use the kafka-sink.
33.3.1. Knative Sink
You can use the kafka-sink Kamelet as a Knative sink by binding it to a Knative object.
kafka-sink-binding.yaml
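The contents of the binding file are not reproduced in this chapter. As a sketch, a `kafka-sink-binding.yaml` for a Knative channel might look like the following; the channel name `mychannel` and the placeholder property values mirror the `kamel bind` example later in this section, and are assumptions to be replaced with your own configuration:

```yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-sink-binding
spec:
  # Knative channel that feeds events into the binding (name is an example)
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  # kafka-sink Kamelet that delivers the events to Kafka
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-sink
    properties:
      bootstrapServers: "The Brokers"
      password: "The Password"
      topic: "The Topic Names"
      user: "The Username"
```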
33.3.1.1. Prerequisite
Make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster you're connected to.
33.3.1.2. Procedure for using the cluster CLI
- Save the kafka-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
- Run the sink by using the following command:

  oc apply -f kafka-sink-binding.yaml
33.3.1.3. Procedure for using the Kamel CLI
Configure and run the sink by using the following command:
kamel bind channel:mychannel kafka-sink -p "sink.bootstrapServers=The Brokers" -p "sink.password=The Password" -p "sink.topic=The Topic Names" -p "sink.user=The Username"

This command creates the KameletBinding in the current namespace on the cluster.
33.3.2. Kafka Sink
You can use the kafka-sink Kamelet as a Kafka sink by binding it to a Kafka topic.
kafka-sink-binding.yaml
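As above, the binding file itself is not shown here. A sketch of a `kafka-sink-binding.yaml` that reads from a Strimzi-managed Kafka topic might look like the following; the topic name `my-topic` and the placeholder property values mirror the `kamel bind` example later in this section, and are assumptions to be replaced with your own configuration:

```yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-sink-binding
spec:
  # KafkaTopic (AMQ Streams / Strimzi) that feeds messages into the binding
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  # kafka-sink Kamelet that delivers the messages to the target Kafka cluster
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-sink
    properties:
      bootstrapServers: "The Brokers"
      password: "The Password"
      topic: "The Topic Names"
      user: "The Username"
```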
33.3.2.1. Prerequisites
Ensure that you've installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster you're connected to.
33.3.2.2. Procedure for using the cluster CLI
- Save the kafka-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
- Run the sink by using the following command:

  oc apply -f kafka-sink-binding.yaml
33.3.2.3. Procedure for using the Kamel CLI
Configure and run the sink by using the following command:
kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic kafka-sink -p "sink.bootstrapServers=The Brokers" -p "sink.password=The Password" -p "sink.topic=The Topic Names" -p "sink.user=The Username"

This command creates the KameletBinding in the current namespace on the cluster.