Red Hat build of Camel K is no longer supported.
As of June 30, 2025, Red Hat build of Camel K has reached End of Life. The suggested replacement is Red Hat build of Apache Camel. For details about migrating, see the Camel K to Camel Quarkus migration guide.
Chapter 44. Kafka Source
Receive data from Kafka topics.
44.1. Configuration Options
The following table summarizes the configuration options available for the kafka-source Kamelet:
Property | Name | Description | Type | Default | Example |
---|---|---|---|---|---|
topic * | Topic Names | Comma separated list of Kafka topic names | string | | |
bootstrapServers * | Brokers | Comma separated list of Kafka Broker URLs | string | | |
securityProtocol | Security Protocol | Protocol used to communicate with brokers. SASL_PLAINTEXT, PLAINTEXT, SASL_SSL and SSL are supported | string | | |
saslMechanism | SASL Mechanism | The Simple Authentication and Security Layer (SASL) mechanism used | string | | |
user * | Username | Username to authenticate to Kafka | string | | |
password * | Password | Password to authenticate to Kafka | string | | |
autoCommitEnable | Auto Commit Enable | If true, periodically commit to ZooKeeper the offset of messages already fetched by the consumer | boolean | | |
allowManualCommit | Allow Manual Commit | Whether to allow manual commits | boolean | | |
autoOffsetReset | Auto Offset Reset | What to do when there is no initial offset. The value can be one of latest, earliest, or none | string | | |
pollOnError | Poll On Error Behavior | What to do if Kafka throws an exception while polling for new messages. The value can be one of DISCARD, ERROR_HANDLER, RECONNECT, RETRY, or STOP | string | | |
deserializeHeaders | Automatically Deserialize Headers | When enabled, the Kamelet source deserializes all message headers to their String representation | boolean | | |
consumerGroup | Consumer Group | A string that uniquely identifies the group of consumers to which this source belongs | string | | "my-group-id" |
Fields marked with an asterisk (*) are mandatory.
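These properties are set in the properties section of a binding that references the kafka-source Kamelet. The following fragment is a minimal sketch only; all values shown are placeholders, and the optional properties can be omitted:

properties:
  bootstrapServers: "my-cluster-kafka-bootstrap:9092"  # placeholder broker list
  topic: "my-topic"                                    # placeholder topic name
  user: "my-user"                                      # placeholder username
  password: "my-password"                              # placeholder password
  securityProtocol: "SASL_SSL"                         # optional
  saslMechanism: "PLAIN"                               # optional
  consumerGroup: "my-group-id"                         # optional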
44.2. Dependencies
At runtime, the kafka-source Kamelet relies upon the presence of the following dependencies:
- camel:kafka
- camel:kamelet
- camel:core
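In the Kamelet definition itself, this list typically appears as a dependencies block under spec, roughly as in the following sketch (abbreviated; see the source file linked at the end of this chapter for the full definition):

spec:
  # ... definition, template, and other fields omitted ...
  dependencies:
    - "camel:kafka"
    - "camel:kamelet"
    - "camel:core"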
44.3. Usage
This section describes how you can use the kafka-source.
44.3.1. Knative Source
You can use the kafka-source Kamelet as a Knative source by binding it to a Knative object.
kafka-source-binding.yaml
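A minimal sketch of what kafka-source-binding.yaml can look like for this case, assuming a Knative channel named mychannel as the sink; the property values are placeholders that you replace with your own configuration:

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-source-binding
spec:
  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-source
    properties:
      bootstrapServers: "The Brokers"     # placeholder
      topic: "The Topic Names"            # placeholder
      user: "The Username"                # placeholder
      password: "The Password"            # placeholder
  sink:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel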
44.3.1.1. Prerequisite
Make sure you have "Red Hat Integration - Camel K" installed into the OpenShift cluster you’re connected to.
44.3.1.2. Procedure for using the cluster CLI
- Save the kafka-source-binding.yaml file to your local drive, and then edit it as needed for your configuration.
- Run the source by using the following command:
oc apply -f kafka-source-binding.yaml
44.3.1.3. Procedure for using the Kamel CLI
Configure and run the source by using the following command:
kamel bind kafka-source -p "source.bootstrapServers=The Brokers" -p "source.password=The Password" -p "source.topic=The Topic Names" -p "source.user=The Username" channel:mychannel
This command creates the KameletBinding in the current namespace on the cluster.
44.3.2. Kafka Source
You can use the kafka-source Kamelet as a Kafka source by binding it to a Kafka topic.
kafka-source-binding.yaml
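A minimal sketch of what kafka-source-binding.yaml can look like for this case, assuming the my-topic KafkaTopic from the prerequisites as the sink; the property values are placeholders that you replace with your own configuration:

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-source-binding
spec:
  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-source
    properties:
      bootstrapServers: "The Brokers"     # placeholder
      topic: "The Topic Names"            # placeholder
      user: "The Username"                # placeholder
      password: "The Password"            # placeholder
  sink:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic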
44.3.2.1. Prerequisites
Ensure that you’ve installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure you have "Red Hat Integration - Camel K" installed into the OpenShift cluster you’re connected to.
44.3.2.2. Procedure for using the cluster CLI
- Save the kafka-source-binding.yaml file to your local drive, and then edit it as needed for your configuration.
- Run the source by using the following command:
oc apply -f kafka-source-binding.yaml
44.3.2.3. Procedure for using the Kamel CLI
Configure and run the source by using the following command:
kamel bind kafka-source -p "source.bootstrapServers=The Brokers" -p "source.password=The Password" -p "source.topic=The Topic Names" -p "source.user=The Username" kafka.strimzi.io/v1beta1:KafkaTopic:my-topic
This command creates the KameletBinding in the current namespace on the cluster.
44.4. Kamelet source file
https://github.com/openshift-integration/kamelet-catalog/kafka-source.kamelet.yaml