Chapter 10. AWS Redshift Sink
Send data to an AWS Redshift Database.
10.1. Authentication methods
In this Kamelet you can avoid using explicit static credentials by setting the useDefaultCredentialsProvider option to true.
The order of evaluation for the Default Credentials Provider is as follows:

- Java system properties: aws.accessKeyId and aws.secretKey.
- Environment variables: AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
- Web Identity Token from AWS STS.
- The shared credentials and config files.
- Amazon ECS container credentials, loaded from Amazon ECS if the environment variable AWS_CONTAINER_CREDENTIALS_RELATIVE_URI is set.
- Amazon EC2 instance profile credentials.
You can also use the Profile Credentials Provider by setting the useProfileCredentialsProvider option to true and profileCredentialsName to the profile name.
Only one of the static access key/secret key pair or the Default Credentials Provider can be used at a time.
For more information, see the AWS credentials documentation.
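As a sketch of the option described above, a binding could enable the default credentials chain instead of static keys with a properties fragment like the following (the surrounding binding structure and values are illustrative, not taken from this chapter):

```yaml
# Illustrative fragment only: enables the Default Credentials Provider
# (option name from the section above) instead of static credentials.
sink:
  ref:
    kind: Kamelet
    apiVersion: camel.apache.org/v1alpha1
    name: aws-redshift-sink
  properties:
    useDefaultCredentialsProvider: true
```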
10.2. Expected Data format for sink
The Kamelet expects a JSON-formatted body. Use key:value pairs to map the JSON fields and parameters. For example, here is a query:
'INSERT INTO accounts (username,city) VALUES (:#username,:#city)'
Here is example input for the example query:
'{ "username":"oscerd", "city":"Rome"}'
10.3. Configuration Options
The following table summarizes the configuration options available for the aws-redshift-sink Kamelet:
| Property | Name | Description | Type | Default | Example |
|---|---|---|---|---|---|
| databaseName * | Database Name | The name of the AWS Redshift database. | string | | |
| password * | Password | The password to access a secured AWS Redshift database. | string | | |
| query * | Query | The query to execute against the AWS Redshift database. | string | | INSERT INTO accounts (username,city) VALUES (:#username,:#city) |
| serverName * | Server Name | The server name for the data source. | string | | localhost |
| username * | Username | The username to access a secured AWS Redshift database. | string | | |
| serverPort | Server Port | The server port for the AWS Redshift data source. | string | 5439 | |
Fields marked with an asterisk (*) are mandatory.
10.4. Dependencies
10.4.1. Quarkus dependencies
10.5. Usage
10.5.1. Camel JBang usage
10.5.1.1. Prerequisites for JBang
- You have installed JBang.
- You have executed the following command:

jbang app install camel@apache/camel
10.5.1.2. Running a route with JBang
Suppose you have a file named route.yaml with this content:
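The file's contents are not reproduced above; as a sketch, a route of the following shape would exercise the Kamelet (the timer source and all connection values are placeholders chosen for illustration):

```yaml
# route.yaml - illustrative only; replace serverName, databaseName,
# username, and password with your own connection values.
- from:
    uri: "timer:tick"
    parameters:
      period: 10000
    steps:
      # Build a JSON body whose fields match the :#parameters in the query.
      - setBody:
          constant: '{ "username":"oscerd", "city":"Rome"}'
      - to:
          uri: "kamelet:aws-redshift-sink"
          parameters:
            serverName: "my-cluster.example.redshift.amazonaws.com"
            databaseName: "dev"
            username: "awsuser"
            password: "secret"
            query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
```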
You can now run it directly with the following command:

camel run route.yaml
10.5.2. Knative Sink
You can use the aws-redshift-sink Kamelet as a Knative sink by binding it to a Knative object.
aws-redshift-sink-binding.yaml
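The binding file's contents are not shown above; a sketch in the style of the upstream Kamelet catalog examples, assuming a Knative channel named mychannel and placeholder connection values:

```yaml
# aws-redshift-sink-binding.yaml - illustrative sketch.
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    # Events flow from this Knative channel (name is a placeholder).
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-redshift-sink
    properties:
      serverName: "my-cluster.example.redshift.amazonaws.com"
      databaseName: "dev"
      username: "awsuser"
      password: "secret"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
```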
10.5.3. Kafka Sink
You can use the aws-redshift-sink Kamelet as a Kafka sink by binding it to a Kafka topic.
aws-redshift-sink-binding.yaml
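The binding file's contents are not shown above; a sketch in the style of the upstream Kamelet catalog examples, assuming a Strimzi-managed topic named my-topic and placeholder connection values:

```yaml
# aws-redshift-sink-binding.yaml - illustrative sketch.
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-redshift-sink-binding
spec:
  source:
    # Records are consumed from this Kafka topic (name is a placeholder).
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-redshift-sink
    properties:
      serverName: "my-cluster.example.redshift.amazonaws.com"
      databaseName: "dev"
      username: "awsuser"
      password: "secret"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
```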