Chapter 7. AWS Redshift Sink


Send data to an AWS Redshift Database.

This Kamelet expects a JSON payload as the message body. The mapping between the JSON fields and the named query parameters is done by key, so if you have the following query:

'INSERT INTO accounts (username,city) VALUES (:#username,:#city)'

the Kamelet needs to receive an input such as:

'{ "username":"oscerd", "city":"Rome"}'

7.1. Configuration Options

The following table summarizes the configuration options available for the aws-redshift-sink Kamelet:

Property       | Name          | Description                                                        | Type   | Default | Example
---------------|---------------|--------------------------------------------------------------------|--------|---------|--------
databaseName * | Database Name | The database name we are pointing to                               | string |         |
password *     | Password      | The password to use for accessing a secured AWS Redshift database  | string |         |
query *        | Query         | The query to execute against the AWS Redshift database             | string |         | "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
serverName *   | Server Name   | The server name for the data source                                | string |         | "localhost"
username *     | Username      | The username to use for accessing a secured AWS Redshift database  | string |         |
serverPort     | Server Port   | The server port for the data source                                | string | 5439    |

 
Note

Fields marked with an asterisk (*) are mandatory.
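
The serverName, serverPort, and databaseName values are combined into the JDBC connection URL that the Kamelet uses to reach the cluster. As a rough sketch, assuming the standard Amazon Redshift JDBC URL format, the example values localhost and 5439 together with a database named dev (a placeholder) would produce a URL such as:

jdbc:redshift://localhost:5439/dev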

7.2. Dependencies

At runtime, the aws-redshift-sink Kamelet relies upon the presence of the following dependencies:

  • camel:jackson
  • camel:kamelet
  • camel:sql
  • mvn:com.amazon.redshift:redshift-jdbc42:2.1.0.5
  • mvn:org.apache.commons:commons-dbcp2:2.7.0

7.3. Usage

This section describes how you can use the aws-redshift-sink Kamelet.

7.3.1. Knative Sink

You can use the aws-redshift-sink Kamelet as a Knative sink by binding it to a Knative object.

aws-redshift-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-redshift-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

7.3.1.1. Prerequisite

Make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster you're connected to.

7.3.1.2. Procedure for using the cluster CLI

  1. Save the aws-redshift-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f aws-redshift-sink-binding.yaml
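
Optionally, verify that the binding was created and that the resulting integration reaches the Running phase. The resource names below match the binding used in this example:

oc get kameletbinding aws-redshift-sink-binding
oc get integration aws-redshift-sink-binding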

7.3.1.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind channel:mychannel aws-redshift-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"

This command creates the KameletBinding in the current namespace on the cluster.
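
To send a test message through the channel, you can POST a CloudEvent carrying the JSON payload to the channel's address from inside the cluster. The curl pod and the CloudEvent attribute values below are only an illustration; first look up the channel URL on your cluster:

oc get channel mychannel -o jsonpath='{.status.address.url}'

oc run curl-test -ti --rm=true --restart=Never --image=curlimages/curl -- \
  -X POST <channel-url> \
  -H "Content-Type: application/json" \
  -H "Ce-Id: test-1" -H "Ce-Specversion: 1.0" \
  -H "Ce-Type: example.event" -H "Ce-Source: manual-test" \
  -d '{ "username":"oscerd", "city":"Rome"}'

Replace <channel-url> with the URL returned by the first command.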

7.3.2. Kafka Sink

You can use the aws-redshift-sink Kamelet as a Kafka sink by binding it to a Kafka topic.

aws-redshift-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-redshift-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

7.3.2.1. Prerequisites

Ensure that you’ve installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Make also sure you have "Red Hat Integration - Camel K" installed into the OpenShift cluster you’re connected to.

7.3.2.2. Procedure for using the cluster CLI

  1. Save the aws-redshift-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f aws-redshift-sink-binding.yaml

7.3.2.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic aws-redshift-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"

This command creates the KameletBinding in the current namespace on the cluster.
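
To produce a test record to the my-topic topic, you can run a temporary Kafka producer pod and type the JSON payload at its console. The image reference and the bootstrap address (which assumes a Kafka cluster named my-cluster) are examples that depend on your AMQ Streams installation:

oc run kafka-producer -ti --rm=true --restart=Never --image=<kafka-image> -- \
  bin/kafka-console-producer.sh \
  --bootstrap-server my-cluster-kafka-bootstrap:9092 \
  --topic my-topic

When the producer prompt appears, enter a line such as { "username":"oscerd", "city":"Rome"} and the binding writes the corresponding row to Redshift.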

7.4. Kamelet source file

https://github.com/openshift-integration/kamelet-catalog/aws-redshift-sink.kamelet.yaml
