
Chapter 54. PostgreSQL Sink


Send data to a PostgreSQL Database.

This Kamelet expects a JSON-formatted payload as the message body. The mapping between the JSON fields and the query parameters is done by key, so if you have the following query:

'INSERT INTO accounts (username,city) VALUES (:#username,:#city)'

The Kamelet needs to receive an input such as the following:

'{ "username":"oscerd", "city":"Rome"}'

54.1. Configuration Options

The following table summarizes the configuration options available for the postgresql-sink Kamelet:

| Property | Name | Description | Type | Default | Example |
| --- | --- | --- | --- | --- | --- |
| databaseName * | Database Name | The name of the database to connect to | string | | |
| password * | Password | The password to use for accessing a secured PostgreSQL Database | string | | |
| query * | Query | The query to execute against the PostgreSQL Database | string | | "INSERT INTO accounts (username,city) VALUES (:#username,:#city)" |
| serverName * | Server Name | Server name for the data source | string | | "localhost" |
| username * | Username | The username to use for accessing a secured PostgreSQL Database | string | | |
| serverPort | Server Port | Server port for the data source | string | 5432 | |

Note

Fields marked with an asterisk (*) are mandatory.

54.2. Dependencies

At runtime, the postgresql-sink Kamelet relies upon the presence of the following dependencies:

  • camel:jackson
  • camel:kamelet
  • camel:sql
  • mvn:org.postgresql:postgresql
  • mvn:org.apache.commons:commons-dbcp2:2.7.0.redhat-00001

54.3. Usage

This section describes how you can use the postgresql-sink.

54.3.1. Knative Sink

You can use the postgresql-sink Kamelet as a Knative sink by binding it to a Knative object.

postgresql-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: postgresql-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: postgresql-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

54.3.1.1. Prerequisite

Make sure that "Red Hat Integration - Camel K" is installed in the OpenShift cluster that you are connected to.

54.3.1.2. Procedure for using the cluster CLI

  1. Save the postgresql-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f postgresql-sink-binding.yaml
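
Optionally, you can confirm that the binding resource was created. This is a hedged check that assumes the Camel K operator has installed the KameletBinding CRD in your cluster:

# Optional: confirm the binding exists and inspect its status
oc get kameletbinding postgresql-sink-binding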

54.3.1.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind channel:mychannel postgresql-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"

This command creates the KameletBinding in the current namespace on the cluster.
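If you want to follow the runtime output while messages are processed, you can stream the integration logs with the kamel CLI. The subcommand below is an assumption that may vary by version, so check kamel --help if it is not available:

# Stream the logs of the integration created from the binding (name assumed to match the binding)
kamel log postgresql-sink-binding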

54.3.2. Kafka Sink

You can use the postgresql-sink Kamelet as a Kafka sink by binding it to a Kafka topic.

postgresql-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: postgresql-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: postgresql-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

54.3.2.1. Prerequisites

Ensure that you've installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure that "Red Hat Integration - Camel K" is installed in the OpenShift cluster that you are connected to.

54.3.2.2. Procedure for using the cluster CLI

  1. Save the postgresql-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f postgresql-sink-binding.yaml

54.3.2.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic postgresql-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"

This command creates the KameletBinding in the current namespace on the cluster.
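
To verify the flow end to end, you can produce a test record to the my-topic topic that matches the query parameters. The sketch below assumes an AMQ Streams cluster named my-cluster; adjust the pod name and listener address to your environment:

# Start a console producer inside a broker pod (pod name and listener are assumptions)
oc exec -it my-cluster-kafka-0 -- \
  bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic my-topic
# Then type a JSON record matching the named parameters, for example:
# { "username":"oscerd", "city":"Rome" }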

54.4. Kamelet source file

https://github.com/openshift-integration/kamelet-catalog/postgresql-sink.kamelet.yaml
