Chapter 40. MySQL Sink
Send data to a MySQL Database.
This Kamelet expects a JSON payload as the message body. The mapping between the JSON fields and the query parameters is done by key, so for the following query:
'INSERT INTO accounts (username,city) VALUES (:#username,:#city)'
the Kamelet needs to receive as input something like:
'{ "username":"oscerd", "city":"Rome"}'
40.1. Configuration Options
The following table summarizes the configuration options available for the mysql-sink Kamelet:
| Property | Name | Description | Type | Default | Example |
|---|---|---|---|---|---|
| databaseName * | Database Name | The name of the MySQL database to connect to | string | | |
| password * | Password | The password to use for accessing a secured MySQL database | string | | |
| query * | Query | The query to execute against the MySQL database | string | | "INSERT INTO accounts (username,city) VALUES (:#username,:#city)" |
| serverName * | Server Name | The server name for the data source | string | | "localhost" |
| username * | Username | The username to use for accessing a secured MySQL database | string | | |
| serverPort | Server Port | The server port for the data source | string | 3306 | |
Fields marked with an asterisk (*) are mandatory.
40.2. Dependencies
At runtime, the mysql-sink Kamelet relies upon the presence of the following dependencies:
- camel:jackson
- camel:kamelet
- camel:sql
- mvn:org.apache.commons:commons-dbcp2:2.7.0
- mvn:mysql:mysql-connector-java
40.3. Usage
This section describes how you can use the mysql-sink.
40.3.1. Knative Sink
You can use the mysql-sink Kamelet as a Knative sink by binding it to a Knative object.
mysql-sink-binding.yaml
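A minimal sketch of what mysql-sink-binding.yaml can look like, assuming a Knative channel named mychannel as the source; the sink property values are placeholders that you replace with values for your environment:

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: mysql-sink-binding
spec:
  source:
    # Knative channel that delivers the JSON records (placeholder name)
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    # The mysql-sink Kamelet with its mandatory properties
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: mysql-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"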
40.3.1.1. Prerequisite
Make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster that you are connected to.
40.3.1.2. Procedure for using the cluster CLI
- Save the mysql-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
- Run the sink by using the following command:
  oc apply -f mysql-sink-binding.yaml
40.3.1.3. Procedure for using the Kamel CLI
Configure and run the sink by using the following command:
kamel bind channel:mychannel mysql-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"
This command creates the KameletBinding in the current namespace on the cluster.
40.3.2. Kafka Sink
You can use the mysql-sink Kamelet as a Kafka sink by binding it to a Kafka topic.
mysql-sink-binding.yaml
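A minimal sketch of what mysql-sink-binding.yaml can look like, assuming the my-topic Kafka topic as the source; the sink property values are placeholders that you replace with values for your environment:

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: mysql-sink-binding
spec:
  source:
    # Kafka topic that delivers the JSON records (placeholder name)
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    # The mysql-sink Kamelet with its mandatory properties
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: mysql-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"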
40.3.2.1. Prerequisites
Ensure that you have installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure you have "Red Hat Integration - Camel K" installed in the OpenShift cluster that you are connected to.
40.3.2.2. Procedure for using the cluster CLI
- Save the mysql-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
- Run the sink by using the following command:
  oc apply -f mysql-sink-binding.yaml
40.3.2.3. Procedure for using the Kamel CLI
Configure and run the sink by using the following command:
kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic mysql-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"
This command creates the KameletBinding in the current namespace on the cluster.