Chapter 10. AWS Redshift Sink
Send data to an AWS Redshift Database.
10.1. Authentication methods
In this Kamelet you can avoid using explicit static credentials by setting the useDefaultCredentialsProvider option to true.
The Default Credentials Provider evaluates credentials in the following order:

1. Java system properties: aws.accessKeyId and aws.secretKey.
2. Environment variables: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
3. Web Identity Token from AWS STS.
4. The shared credentials and config files.
5. Amazon ECS container credentials, loaded from Amazon ECS if the AWS_CONTAINER_CREDENTIALS_RELATIVE_URI environment variable is set.
6. Amazon EC2 instance profile credentials.
You can also use the Profile Credentials Provider, by setting the useProfileCredentialsProvider option to true and profileCredentialsName to the profile name.
Note that only one of the access key/secret key pair or the Default Credentials Provider can be used at a time.
For more information, see the AWS credentials documentation.
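As a sketch, the profile-based options described above can be set in the properties block of a Pipe sink; the profile name my-profile is illustrative, and the remaining connection properties (serverName, databaseName, and so on) are omitted here:

```yaml
sink:
  ref:
    kind: Kamelet
    apiVersion: camel.apache.org/v1
    name: aws-redshift-sink
  properties:
    # Use a named profile from the shared AWS credentials file
    # instead of passing static keys in the binding.
    useProfileCredentialsProvider: true
    profileCredentialsName: "my-profile"
```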
10.2. Expected Data format for sink
The Kamelet expects a JSON-formatted body. Each key in the JSON body is mapped to the query parameter of the same name. For example, consider the following query:
'INSERT INTO accounts (username,city) VALUES (:#username,:#city)'
Here is an example body for that query:
'{ "username":"oscerd", "city":"Rome"}'
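A route step that sets this JSON body and passes it to the sink could be sketched as follows; the setBody step is illustrative, and the connection properties required by the Kamelet are omitted for brevity:

```yaml
# The "username" and "city" fields in the body fill the
# :#username and :#city named parameters in the query.
- setBody:
    constant: '{ "username":"oscerd", "city":"Rome" }'
- to:
    uri: "kamelet:aws-redshift-sink"
    parameters:
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
```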
10.3. Configuration Options
The following table summarizes the configuration options available for the aws-redshift-sink Kamelet:
| Property | Name | Description | Type | Default | Example |
|---|---|---|---|---|---|
| databaseName * | Database Name | The name of the AWS Redshift database. | string | | |
| password * | Password | The password to access a secured AWS Redshift database. | string | | |
| query * | Query | The query to execute against the AWS Redshift database. | string | | INSERT INTO accounts (username,city) VALUES (:#username,:#city) |
| serverName * | Server Name | The server name for the data source. | string | | localhost |
| username * | Username | The username to access a secured AWS Redshift database. | string | | |
| serverPort | Server Port | The server port for the AWS Redshift data source. | string | 5439 | |

* = Fields marked with an asterisk are mandatory.
10.4. Dependencies
10.4.1. Quarkus dependencies
<dependencies>
<dependency>
<groupId>org.apache.camel.kamelets</groupId>
<artifactId>camel-kamelets-utils</artifactId>
<version>4.8.5</version>
</dependency>
<dependency>
<groupId>org.apache.camel.quarkus</groupId>
<artifactId>camel-quarkus-aws-redshift</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel.quarkus</groupId>
<artifactId>camel-quarkus-jackson</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel.quarkus</groupId>
<artifactId>camel-quarkus-kamelet</artifactId>
</dependency>
<dependency>
<groupId>org.apache.camel.quarkus</groupId>
<artifactId>camel-quarkus-sql</artifactId>
</dependency>
</dependencies>
10.5. Usage
10.5.1. Camel JBang usage
10.5.1.1. Prerequisites for JBang
- You have installed JBang.
- You have executed the following command:
jbang app install camel@apache/camel
10.5.1.2. Running a route with JBang
Suppose you have a file named route.yaml with the following content:
- route:
    from:
      uri: "kamelet:timer-source"
      parameters:
        period: 10000
        message: 'test'
      steps:
        - to:
            uri: "kamelet:log-sink"
You can now run it directly with the following command:
camel run route.yaml
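The same route can be adapted to target the Redshift sink instead of the log sink; the connection values below are placeholders, not working credentials:

```yaml
- route:
    from:
      uri: "kamelet:timer-source"
      parameters:
        period: 10000
        # The timer emits a JSON body matching the named query parameters.
        message: '{ "username":"oscerd", "city":"Rome" }'
      steps:
        - to:
            uri: "kamelet:aws-redshift-sink"
            parameters:
              serverName: "localhost"
              databaseName: "mydb"
              username: "myuser"
              password: "mypassword"
              query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
```

You run this file the same way, with camel run route.yaml.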
10.5.2. Knative Sink
You can use the aws-redshift-sink Kamelet as a Knative sink by binding it to a Knative object.
aws-redshift-sink-binding.yaml
apiVersion: camel.apache.org/v1
kind: Pipe
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1
      name: aws-redshift-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"
10.5.3. Kafka Sink
You can use the aws-redshift-sink Kamelet as a Kafka sink by binding it to a Kafka topic.
aws-redshift-sink-binding.yaml
apiVersion: camel.apache.org/v1
kind: Pipe
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1
      name: aws-redshift-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"