Chapter 41. PostgreSQL Sink


Send data to a PostgreSQL Database.

This Kamelet expects a JSON payload as the message body. The mapping between the JSON fields and the query parameters is done by key, so for the following query:

'INSERT INTO accounts (username,city) VALUES (:#username,:#city)'

The Kamelet needs to receive input such as:

'{ "username":"oscerd", "city":"Rome"}'

41.1. Configuration Options

The following table summarizes the configuration options available for the postgresql-sink Kamelet:

| Property | Name | Description | Type | Default | Example |
| -------- | ---- | ----------- | ---- | ------- | ------- |
| databaseName * | Database Name | The name of the database to connect to | string | | |
| password * | Password | The password for accessing a secured PostgreSQL database | string | | |
| query * | Query | The query to execute against the PostgreSQL database | string | | "INSERT INTO accounts (username,city) VALUES (:#username,:#city)" |
| serverName * | Server Name | The server name for the data source | string | | "localhost" |
| username * | Username | The username for accessing a secured PostgreSQL database | string | | |
| serverPort | Server Port | The server port for the data source | string | 5432 | |

Note

Fields marked with an asterisk (*) are mandatory.
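The optional serverPort property is the only option not shown in the binding examples below; if your database does not listen on the default port, add it to the sink properties. A sketch of the properties block only, with placeholder values:

    properties:
      serverName: "localhost"
      serverPort: "5433"
      databaseName: "The Database Name"
      username: "The Username"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"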

41.2. Dependencies

At runtime, the postgresql-sink Kamelet relies upon the presence of the following dependencies:

  • camel:jackson
  • camel:kamelet
  • camel:sql
  • mvn:org.postgresql:postgresql
  • mvn:org.apache.commons:commons-dbcp2:2.7.0
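Camel K resolves these dependencies automatically when the Kamelet is used in a binding, so you do not normally declare them yourself. For reference, a sketch of how they could be declared explicitly on a Camel K Integration resource (fragment only; the integration name is hypothetical):

apiVersion: camel.apache.org/v1
kind: Integration
metadata:
  name: my-postgresql-integration  # hypothetical name
spec:
  dependencies:
    - camel:jackson
    - camel:kamelet
    - camel:sql
    - mvn:org.postgresql:postgresql
    - mvn:org.apache.commons:commons-dbcp2:2.7.0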

41.3. Usage

This section describes how you can use the postgresql-sink.

41.3.1. Knative Sink

You can use the postgresql-sink Kamelet as a Knative sink by binding it to a Knative object.

postgresql-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: postgresql-sink-binding
spec:
  source:
    ref:
      kind: Channel
      apiVersion: messaging.knative.dev/v1
      name: mychannel
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: postgresql-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

41.3.1.1. Prerequisite

Make sure you have "Red Hat Integration - Camel K" installed into the OpenShift cluster you’re connected to.
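If the mychannel channel referenced by the binding does not exist yet, create it before applying the binding. A minimal sketch, assuming Knative Eventing is installed on the cluster:

apiVersion: messaging.knative.dev/v1
kind: Channel
metadata:
  name: mychannel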

41.3.1.2. Procedure for using the cluster CLI

  1. Save the postgresql-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f postgresql-sink-binding.yaml

41.3.1.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind channel:mychannel postgresql-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"

This command creates the KameletBinding in the current namespace on the cluster.

41.3.2. Kafka Sink

You can use the postgresql-sink Kamelet as a Kafka sink by binding it to a Kafka topic.

postgresql-sink-binding.yaml

apiVersion: camel.apache.org/v1alpha1
kind: KameletBinding
metadata:
  name: postgresql-sink-binding
spec:
  source:
    ref:
      kind: KafkaTopic
      apiVersion: kafka.strimzi.io/v1beta1
      name: my-topic
  sink:
    ref:
      kind: Kamelet
      apiVersion: camel.apache.org/v1alpha1
      name: postgresql-sink
    properties:
      databaseName: "The Database Name"
      password: "The Password"
      query: "INSERT INTO accounts (username,city) VALUES (:#username,:#city)"
      serverName: "localhost"
      username: "The Username"

41.3.2.1. Prerequisites

Ensure that you’ve installed the AMQ Streams operator in your OpenShift cluster and created a topic named my-topic in the current namespace. Also make sure you have "Red Hat Integration - Camel K" installed into the OpenShift cluster you’re connected to.
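A minimal sketch of such a topic, assuming a Strimzi-managed Kafka cluster named my-cluster (adjust the strimzi.io/cluster label to match your cluster):

apiVersion: kafka.strimzi.io/v1beta1
kind: KafkaTopic
metadata:
  name: my-topic
  labels:
    strimzi.io/cluster: my-cluster
spec:
  partitions: 1
  replicas: 1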

41.3.2.2. Procedure for using the cluster CLI

  1. Save the postgresql-sink-binding.yaml file to your local drive, and then edit it as needed for your configuration.
  2. Run the sink by using the following command:

    oc apply -f postgresql-sink-binding.yaml

41.3.2.3. Procedure for using the Kamel CLI

Configure and run the sink by using the following command:

kamel bind kafka.strimzi.io/v1beta1:KafkaTopic:my-topic postgresql-sink -p "sink.databaseName=The Database Name" -p "sink.password=The Password" -p "sink.query=INSERT INTO accounts (username,city) VALUES (:#username,:#city)" -p "sink.serverName=localhost" -p "sink.username=The Username"

This command creates the KameletBinding in the current namespace on the cluster.

41.4. Kamelet source file

https://github.com/openshift-integration/kamelet-catalog/blob/kamelet-catalog-1.6/postgresql-sink.kamelet.yaml
