Chapter 10. AWS Redshift Sink


Send data to an AWS Redshift Database.

10.1. Authentication methods

In this Kamelet you can avoid using explicit static credentials by setting the useDefaultCredentialsProvider option to true.

The order of evaluation for Default Credentials Provider is the following:

  • Java system properties - aws.accessKeyId and aws.secretKey.
  • Environment variables - AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
  • Web Identity Token from AWS STS.
  • The shared credentials and config files.
  • Amazon ECS container credentials - loaded from the Amazon ECS if the environment variable AWS_CONTAINER_CREDENTIALS_RELATIVE_URI is set.
  • Amazon EC2 Instance profile credentials.

You can also use the Profile Credentials Provider by setting the useProfileCredentialsProvider option to true and profileCredentialsName to the profile name.

Only one of the access key/secret key pair or the default credentials provider can be used at a time.

For more information, see the AWS credentials documentation.
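As an illustration, a Pipe sink section that relies on the default credentials chain instead of static keys might look like the following sketch. The property name is taken from the description above; verify it against the Kamelet definition, and note that the other property values are placeholders:

```yaml
sink:
  ref:
    kind: Kamelet
    apiVersion: camel.apache.org/v1
    name: aws-redshift-sink
  properties:
    # Assumed option name from the text above -- enables the
    # Default Credentials Provider chain instead of static keys.
    useDefaultCredentialsProvider: true
    databaseName: "The Database Name"
    serverName: "localhost"
    query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
```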

10.2. Expected data format for the sink

The Kamelet expects a JSON-formatted body. The fields of the JSON body are mapped, by name, to the named parameters (:#name) in the query. For example, here is a query:

'INSERT INTO accounts (username,city) VALUES (:#username,:#city)'

Here is example input for that query:

'{ "username":"oscerd", "city":"Rome"}'
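The following standalone Python sketch illustrates how the :#name placeholders line up with the JSON fields. It is an illustration of the mapping only; the actual Camel SQL component uses proper prepared-statement binding rather than string substitution:

```python
import json
import re

body = '{ "username": "oscerd", "city": "Rome" }'
query = "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"

# Parse the JSON body into a dict of parameter values.
params = json.loads(body)

# Substitute each :#name placeholder with the matching JSON field,
# quoted as a SQL string literal (illustration only).
def bind(query: str, params: dict) -> str:
    return re.sub(r":#(\w+)", lambda m: "'%s'" % params[m.group(1)], query)

print(bind(query, params))
# → INSERT INTO accounts (username,city) VALUES ('oscerd','Rome')
```

A JSON field with no matching placeholder is simply ignored; a placeholder with no matching field is an error at runtime.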

10.3. Configuration Options

The following table summarizes the configuration options available for the aws-redshift-sink Kamelet:

Property       | Name          | Description                                             | Type   | Default | Example
---------------|---------------|---------------------------------------------------------|--------|---------|--------
databaseName * | Database Name | The name of the AWS Redshift database.                  | string |         |
password *     | Password      | The password to access a secured AWS Redshift database. | string |         |
query *        | Query         | The query to execute against the AWS Redshift database. | string |         | INSERT INTO accounts (username,city) VALUES (:#username,:#city)
serverName *   | Server Name   | The server name for the data source.                    | string |         | localhost
username *     | Username      | The username to access a secured AWS Redshift database. | string |         |
serverPort     | Server Port   | The server port for the AWS Redshift data source.       | string | 5439    |

Fields marked with an asterisk (*) are mandatory.

10.4. Dependencies

10.4.1. Quarkus dependencies

<dependencies>
  <dependency>
    <groupId>org.apache.camel.kamelets</groupId>
    <artifactId>camel-kamelets-utils</artifactId>
    <version>4.8.5</version>
  </dependency>
  <dependency>
    <groupId>org.apache.camel.quarkus</groupId>
    <artifactId>camel-quarkus-aws-redshift</artifactId>
  </dependency>
  <dependency>
    <groupId>org.apache.camel.quarkus</groupId>
    <artifactId>camel-quarkus-jackson</artifactId>
  </dependency>
  <dependency>
    <groupId>org.apache.camel.quarkus</groupId>
    <artifactId>camel-quarkus-kamelet</artifactId>
  </dependency>
  <dependency>
    <groupId>org.apache.camel.quarkus</groupId>
    <artifactId>camel-quarkus-sql</artifactId>
  </dependency>
</dependencies>

10.5. Usage

10.5.1. Camel JBang usage

10.5.1.1. Prerequisites for JBang

  • Install JBang.
  • You have executed the following command:

    jbang app install camel@apache/camel

10.5.1.2. Running a route with JBang

Suppose you have a file named route.yaml with this content:

- route:
    from:
      uri: "kamelet:timer-source"
      parameters:
        period: 10000
        message: 'test'
      steps:
        - to:
            uri: "kamelet:log-sink"

You can run it directly with the following command:

camel run route.yaml
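To exercise the aws-redshift-sink Kamelet itself, the same JBang workflow works with a route whose sink step points at it. In this sketch the timer source emits the JSON body from the earlier example, and the connection property values are placeholders to replace with your own:

```yaml
- route:
    from:
      uri: "kamelet:timer-source"
      parameters:
        period: 10000
        # JSON body whose fields map to the :#name query parameters
        message: '{ "username":"oscerd", "city":"Rome"}'
      steps:
        - to:
            uri: "kamelet:aws-redshift-sink"
            parameters:
              serverName: "localhost"
              databaseName: "The Database Name"
              username: "The Username"
              password: "The Password"
              query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
```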

10.5.2. Knative Sink

You can use the aws-redshift-sink Kamelet as a Knative sink by binding it to a Knative object.

aws-redshift-sink-binding.yaml

apiVersion: camel.apache.org/v1
kind: Pipe
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1
      name: aws-redshift-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

10.5.3. Kafka Sink

You can use the aws-redshift-sink Kamelet as a Kafka sink by binding it to a Kafka topic.

aws-redshift-sink-binding.yaml

apiVersion: camel.apache.org/v1
kind: Pipe
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1
      name: aws-redshift-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

10.6. Kamelets source file

https://github.com/jboss-fuse/camel-kamelets/blob/camel-kamelets-4.10.3-branch/kamelets/aws-redshift-sink.kamelet.yaml
