Chapter 7. AWS Redshift Sink


Send data to an AWS Redshift Database.

This Kamelet expects a JSON-formatted message body. The mapping between the JSON fields and the query parameters is done by key, so if you have the following query:

'INSERT INTO accounts (username,city) VALUES (:#username,:#city)'

the Kamelet needs to receive as input something like:

'{ "username":"oscerd", "city":"Rome"}'

With this payload, the value oscerd is bound to the :#username parameter and Rome to the :#city parameter when the query is executed.

7.1. Configuration Options

The following table summarizes the configuration options available for the aws-redshift-sink Kamelet:

Property | Name | Description | Type | Default | Example
databaseName * | Database Name | The name of the database to connect to | string |  |
password * | Password | The password to use for accessing a secured AWS Redshift Database | string |  |
query * | Query | The query to execute against the AWS Redshift Database | string |  | "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
serverName * | Server Name | Server name for the data source | string |  | "localhost"
username * | Username | The username to use for accessing a secured AWS Redshift Database | string |  |
serverPort | Server Port | Server port for the data source | string | 5439 |
 
Note

Fields marked with an asterisk (*) are mandatory.

7.2. Dependencies

At runtime, the aws-redshift-sink Kamelet relies upon the presence of the following dependencies:

  • camel:jackson
  • camel:kamelet
  • camel:sql
  • mvn:com.amazon.redshift:redshift-jdbc42:2.1.0.5
  • mvn:org.apache.commons:commons-dbcp2:2.7.0

7.3. Usage

This section describes how you can use the aws-redshift-sink Kamelet.

7.3.1. Knative Sink

You can use the aws-redshift-sink Kamelet as a Knative sink by binding it to a Knative object.

aws-redshift-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-redshift-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

7.3.1.1. Prerequisite

Make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster you're connected to.

7.3.1.2. Procedure for using the cluster CLI

  1. Save the aws-redshift-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f aws-redshift-sink-binding.yaml

7.3.1.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind channel:mychannel aws-redshift-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"

This command creates the KameletBinding in the current namespace on the cluster.
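
(Optional) To check that the binding was created and to follow the integration logs, you can use commands such as the ones below. This is a hedged example: it assumes the binding is named aws-redshift-sink-binding, as in the examples above, and that the kamel CLI is pointed at the same namespace.

    oc get kameletbinding aws-redshift-sink-binding
    kamel log aws-redshift-sink-binding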

7.3.2. Kafka Sink

You can use the aws-redshift-sink Kamelet as a Kafka sink by binding it to a Kafka topic.

aws-redshift-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-redshift-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

7.3.2.1. Prerequisites

Ensure that you’ve installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster you're connected to.

7.3.2.2. Procedure for using the cluster CLI

  1. Save the aws-redshift-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f aws-redshift-sink-binding.yaml

7.3.2.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic aws-redshift-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"

This command creates the KameletBinding in the current namespace on the cluster.
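
(Optional) To verify the flow end to end, you can produce a test record that matches the expected JSON structure to the my-topic topic. The following is a sketch, not a definitive procedure: it assumes an AMQ Streams cluster named my-cluster with a plain listener on port 9092, and that the Kafka tools are available under /opt/kafka in the broker pod; adjust the pod name, path, and bootstrap address for your environment.

    echo '{ "username":"oscerd", "city":"Rome"}' | oc exec -i my-cluster-kafka-0 -- \
      /opt/kafka/bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic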

7.4. Kamelet source file

https://github.com/openshift-integration/kamelet-catalog/aws-redshift-sink.kamelet.yaml
