Chapter 14. AWS S3 Streaming upload Sink
Upload data to AWS S3 in streaming upload mode.
14.1. Configuration Options
The following table summarizes the configuration options available for the aws-s3-streaming-upload-sink Kamelet:
Property | Name | Description | Type | Default | Example
---|---|---|---|---|---
accessKey * | Access Key | The access key obtained from AWS. | string | |
bucketNameOrArn * | Bucket Name | The S3 bucket name or ARN. | string | |
keyName * | Key Name | The key name for an element in the bucket, set through an endpoint parameter. In streaming upload mode, with the default configuration, this is the base name for the progressive creation of files. | string | |
region * | AWS Region | The AWS region to connect to. | string | | "eu-west-1"
secretKey * | Secret Key | The secret key obtained from AWS. | string | |
autoCreateBucket | Autocreate Bucket | Specifies whether to automatically create the S3 bucket if it does not exist. | boolean | |
batchMessageNumber | Batch Message Number | The number of messages composing a batch in streaming upload mode. | int | |
batchSize | Batch Size | The batch size (in bytes) in streaming upload mode. | int | |
namingStrategy | Naming Strategy | The naming strategy to use in streaming upload mode. The value can be one of: progressive, random. | string | |
restartingPolicy | Restarting Policy | The restarting policy to use in streaming upload mode. The value can be one of: override, lastPart. | string | |
streamingUploadMode | Streaming Upload Mode | Specifies whether to enable streaming upload mode. | boolean | |
Fields marked with an asterisk (*) are mandatory.
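For example, the optional streaming-upload properties can be set alongside the mandatory ones in the properties block of a binding (the full binding format is shown in Section 14.3). The following is a minimal sketch; the key name and the numeric values are illustrative assumptions, not recommended settings:

    properties:
      accessKey: "The Access Key"
      secretKey: "The Secret Key"
      region: "eu-west-1"
      bucketNameOrArn: "The Bucket Name"
      keyName: "data"               # illustrative base name for progressively created files
      streamingUploadMode: true
      namingStrategy: "progressive" # or "random"
      restartingPolicy: "lastPart"  # or "override"
      batchMessageNumber: 25        # illustrative: messages per batch
      batchSize: 2000000            # illustrative: batch size in bytes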
14.2. Dependencies
At runtime, the aws-s3-streaming-upload-sink Kamelet relies upon the presence of the following dependencies:
- camel:aws2-s3
- camel:kamelet
14.3. Usage
This section describes how you can use the aws-s3-streaming-upload-sink.
14.3.1. Knative Sink
You can use the aws-s3-streaming-upload-sink Kamelet as a Knative sink by binding it to a Knative object.
aws-s3-streaming-upload-sink-binding.yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-s3-streaming-upload-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-s3-streaming-upload-sink
    properties:
      accessKey: "The Access Key"
      bucketNameOrArn: "The Bucket Name"
      keyName: "The Key Name"
      region: "eu-west-1"
      secretKey: "The Secret Key"
14.3.1.1. Prerequisite
Make sure that you have "Red Hat Integration - Camel K" installed in the OpenShift cluster that you are connected to.
14.3.1.2. Procedure for using the cluster CLI
1. Save the aws-s3-streaming-upload-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
2. Run the sink by using the following command:
oc apply -f aws-s3-streaming-upload-sink-binding.yaml
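To confirm that the binding was created and that Camel K is running it, you can query the resources in the current namespace. This is a sketch that assumes the binding name used in the example above:

oc get kameletbinding aws-s3-streaming-upload-sink-binding
oc get integrations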
14.3.1.3. Procedure for using the Kamel CLI
Configure and run the sink by using the following command:
kamel bind channel:mychannel aws-s3-streaming-upload-sink -p "sink.accessKey=The Access Key" -p "sink.bucketNameOrArn=The Bucket Name" -p "sink.keyName=The Key Name" -p "sink.region=eu-west-1" -p "sink.secretKey=The Secret Key"
This command creates the KameletBinding in the current namespace on the cluster.
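If you need to tune the streaming behaviour, the same command also accepts the optional properties from the configuration table. The following sketch uses illustrative values for the optional properties:

kamel bind channel:mychannel aws-s3-streaming-upload-sink -p "sink.accessKey=The Access Key" -p "sink.bucketNameOrArn=The Bucket Name" -p "sink.keyName=The Key Name" -p "sink.region=eu-west-1" -p "sink.secretKey=The Secret Key" -p "sink.namingStrategy=progressive" -p "sink.batchMessageNumber=25"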
14.3.2. Kafka Sink
You can use the aws-s3-streaming-upload-sink Kamelet as a Kafka sink by binding it to a Kafka topic.
aws-s3-streaming-upload-sink-binding.yaml
apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: aws-s3-streaming-upload-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: aws-s3-streaming-upload-sink
    properties:
      accessKey: "The Access Key"
      bucketNameOrArn: "The Bucket Name"
      keyName: "The Key Name"
      region: "eu-west-1"
      secretKey: "The Secret Key"
14.3.2.1. Prerequisites
Ensure that you have installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure that you have "Red Hat Integration - Camel K" installed in the OpenShift cluster that you are connected to.
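Before creating the binding, you can check that the topic exists. This sketch assumes that AMQ Streams manages the topic through a KafkaTopic resource in the current namespace:

oc get kafkatopic my-topic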
14.3.2.2. Procedure for using the cluster CLI
1. Save the aws-s3-streaming-upload-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
2. Run the sink by using the following command:
oc apply -f aws-s3-streaming-upload-sink-binding.yaml
14.3.2.3. Procedure for using the Kamel CLI
Configure and run the sink by using the following command:
kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic aws-s3-streaming-upload-sink -p "sink.accessKey=The Access Key" -p "sink.bucketNameOrArn=The Bucket Name" -p "sink.keyName=The Key Name" -p "sink.region=eu-west-1" -p "sink.secretKey=The Secret Key"
This command creates the KameletBinding in the current namespace on the cluster.
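Once messages start flowing from the topic, you can verify that files are being created in the bucket. This is a sketch that assumes the AWS CLI is installed and configured with credentials for the same account; replace the placeholder with your bucket name:

aws s3 ls s3://<your-bucket-name>/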
14.4. Kamelet source file
https://github.com/openshift-integration/kamelet-catalog/aws-s3-streaming-upload-sink.kamelet.yaml