Chapter 45. Kafka Source


Receive data from Kafka topics.

45.1. Configuration Options

The following table summarizes the configuration options available for the kafka-source Kamelet:

| Property | Name | Description | Type | Default | Example |
| --- | --- | --- | --- | --- | --- |
| topic * | Topic Names | Comma-separated list of Kafka topic names | string | | |
| bootstrapServers * | Brokers | Comma-separated list of Kafka broker URLs | string | | |
| securityProtocol | Security Protocol | Protocol used to communicate with brokers. SASL_PLAINTEXT, PLAINTEXT, SASL_SSL, and SSL are supported | string | "SASL_SSL" | |
| saslMechanism | SASL Mechanism | The Simple Authentication and Security Layer (SASL) mechanism used | string | "PLAIN" | |
| user * | Username | Username to authenticate to Kafka | string | | |
| password * | Password | Password to authenticate to Kafka | string | | |
| autoCommitEnable | Auto Commit Enable | If true, periodically commit the offset of messages already fetched by the consumer | boolean | true | |
| allowManualCommit | Allow Manual Commit | Whether to allow doing manual commits | boolean | false | |
| autoOffsetReset | Auto Offset Reset | What to do when there is no initial offset. The value can be one of: latest, earliest, none | string | "latest" | |
| pollOnError | Poll On Error Behavior | What to do if Kafka throws an exception while polling for new messages. The value can be one of: DISCARD, ERROR_HANDLER, RECONNECT, RETRY, STOP | string | "ERROR_HANDLER" | |
| deserializeHeaders | Automatically Deserialize Headers | When enabled, the Kamelet source deserializes all message headers to their String representation | boolean | true | |
| consumerGroup | Consumer Group | A string that uniquely identifies the group of consumers to which this source belongs | string | | "my-group-id" |

Note

Fields marked with an asterisk (*) are mandatory.
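
For example, the optional security and consumer tuning properties can be set next to the mandatory ones in the properties block of a KameletBinding source. The following is a minimal sketch; all values shown are placeholders:

  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-source
    properties:
      bootstrapServers: "my-cluster-kafka-bootstrap:9092"
      topic: "my-topic"
      user: "my-user"
      password: "my-password"
      securityProtocol: "SASL_SSL"
      saslMechanism: "PLAIN"
      autoOffsetReset: "earliest"
      consumerGroup: "my-group-id"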

45.2. Dependencies

At runtime, the kafka-source Kamelet relies upon the presence of the following dependencies:

  • camel:kafka
  • camel:kamelet
  • camel:core
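
Camel K resolves these dependencies automatically when the Kamelet is referenced from an integration. As a minimal sketch, the Kamelet can also be consumed directly from a plain Camel YAML route that logs each incoming message; the file name and all parameter values below are placeholders:

kafka-source-route.yaml

- from:
    uri: "kamelet:kafka-source"
    parameters:
      topic: "my-topic"
      bootstrapServers: "my-cluster-kafka-bootstrap:9092"
      user: "my-user"
      password: "my-password"
    steps:
      - to: "log:info"

You can run such a route with the following command:

    kamel run kafka-source-route.yaml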

45.3. Usage

This section describes how you can use the kafka-source Kamelet.

45.3.1. Knative Source

You can use the kafka-source Kamelet as a Knative source by binding it to a Knative object.

kafka-source-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-source-binding
spec:
  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-source
    properties:
      bootstrapServers: "The Brokers"
      password: "The Password"
      topic: "The Topic Names"
      user: "The Username"
  sink:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel

45.3.1.1. Prerequisite

Make sure that "Red Hat Integration - Camel K" is installed in the OpenShift cluster you’re connected to.

45.3.1.2. Procedure for using the cluster CLI

  1. Save the kafka-source-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the source by using the following command:

    oc apply -f kafka-source-binding.yaml
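
  3. Optionally, verify that the binding was created and that the Integration resource generated by the Camel K operator reaches the Running phase (the KameletBinding name comes from the example file):

    oc get kameletbinding kafka-source-binding
    oc get integration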

45.3.1.3. Procedure for using the Kamel CLI

Configure and run the source by using the following command:

kamel bind kafka-source -p "source.bootstrapServers=The Brokers" -p "source.password=The Password" -p "source.topic=The Topic Names" -p "source.user=The Username" channel:mychannel

This command creates the KameletBinding in the current namespace on the cluster.
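
To confirm that the source is consuming messages, you can list the integrations created from the binding and follow the logs of the one reported by the Kamel CLI. This is a sketch; replace <integration-name> with the name returned by the first command:

    kamel get
    kamel logs <integration-name>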

45.3.2. Kafka Source

You can use the kafka-source Kamelet as a Kafka source by binding it to a Kafka topic.

kafka-source-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: kafka-source-binding
spec:
  source:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: kafka-source
    properties:
      bootstrapServers: "The Brokers"
      password: "The Password"
      topic: "The Topic Names"
      user: "The Username"
  sink:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic

45.3.2.1. Prerequisites

Ensure that you’ve installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure that "Red Hat Integration - Camel K" is installed in the OpenShift cluster you’re connected to.
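
If the topic does not exist yet, you can create it as a KafkaTopic resource managed by the AMQ Streams operator. The following is a minimal sketch; my-cluster is a placeholder for the name of your Kafka cluster, and the partition and replica counts are examples only:

my-topic.yaml

apiVersion: kafka.strimzi.io/v1beta1
kind: KafkaTopic
metadata:
  name: my-topic
  labels:
    strimzi.io/cluster: my-cluster
spec:
  partitions: 1
  replicas: 1

Apply it with the following command:

    oc apply -f my-topic.yaml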

45.3.2.2. Procedure for using the cluster CLI

  1. Save the kafka-source-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the source by using the following command:

    oc apply -f kafka-source-binding.yaml

45.3.2.3. Procedure for using the Kamel CLI

Configure and run the source by using the following command:

kamel bind kafka-source -p "source.bootstrapServers=The Brokers" -p "source.password=The Password" -p "source.topic=The Topic Names" -p "source.user=The Username" kafka.strimzi.io/v1beta1:KafkaTopic:my-topic

This command creates the KameletBinding in the current namespace on the cluster.

45.4. Kamelet source file

https://github.com/openshift-integration/kamelet-catalog/kafka-source.kamelet.yaml
