PostgreSQL

PostgreSQL, or Postgres, is an open source relational database management system that uses and extends the SQL language. It is free to use, highly extensible, and aims to conform to the SQL standard.

This document walks you through setting up Postgres as a source Data Store in DeltaStream.

Note In DeltaStream CDC pipelines, you can use Postgres only as a source.

Setting up PostgreSQL

Prerequisites

  1. Create a user in the PostgreSQL instance (see PostgreSQL documentation).
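The user creation step can be sketched as follows. This is a minimal example in which the role name and password are placeholders; because DeltaStream captures changes via logical replication, the user typically needs the REPLICATION attribute plus read access to the tables you plan to ingest:

```sql
-- Create a login role for DeltaStream to connect as.
-- (Role name and password are placeholders -- choose your own.)
-- REPLICATION is required for logical-replication-based CDC.
CREATE ROLE deltastream_user WITH LOGIN REPLICATION PASSWORD 'choose-a-strong-password';

-- Allow the role to read the tables it will capture changes from.
GRANT USAGE ON SCHEMA public TO deltastream_user;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO deltastream_user;
```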

Adding PostgreSQL as a DeltaStream Data Store

  1. Open DeltaStream. In the lefthand navigation, click Resources. The Resources page displays, with the Data Stores tab active.

  2. Click + Add Data Store, and from the list that displays click PostgreSQL. The Add Data Store window displays, with Postgres-specific fields you must complete.

    Postgres Store Details
  3. Enter the following information:

    • Store Name – A name to identify your DeltaStream data store (See Data Store).

    • Store Type – POSTGRESQL.

    • URI – The URI of the PostgreSQL database, with /<database_name> appended. For example, given a Postgres host of my.postgresql.uri listening on port 5432, to connect DeltaStream to the demo database the URI would be: postgresql://my.postgresql.uri:5432/demo

    • Username – The username of the PostgreSQL database user that DeltaStream should connect as.

    • Password – The password associated with the username.

  4. Click Add to create and save the data store.

Note For instructions on creating the store using DSQL, see CREATE STORE.
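For reference, a DSQL statement along these lines creates an equivalent store. The store name, URI, and credentials are placeholders, and the exact property names may vary by DeltaStream version, so treat this as a sketch and verify it against the CREATE STORE reference:

```sql
-- Sketch only: verify property names against the CREATE STORE reference.
CREATE STORE psql_store WITH (
  'type' = POSTGRESQL,
  'uris' = 'postgresql://my.postgresql.uri:5432/demo',
  'postgresql.username' = 'deltastream_user',
  'postgresql.password' = 'choose-a-strong-password');
```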

Inspect the PostgreSQL Data Store

  1. In the lefthand navigation, click Resources. This displays a list of the existing data stores.

  2. Click your PostgreSQL data store (in this case, Postgres_Test_Store). The Postgres data store page opens with the Schemas tab active, listing the existing schemas in your PostgreSQL database.

  3. (Optional) Create a new schema. To do this:

    • Click + Add Schema. When the Add Schema window opens, enter a name for the new schema and then click Add. Your new schema displays in the entities list.

      Adding a Postgres Data Store Schema
  4. To view the tables in a schema, click a schema name.

  5. To view a sample of rows from a table, click the table within a schema and then click Print.

    Postgres Schema Details

Process PostgreSQL CDC Data and Sink to Kafka

To follow the next few steps, you must already have a PostgreSQL data store named psql_store and a Kafka data store named kafka_store. You first define a DeltaStream stream over your PostgreSQL source data, then write a query that processes this data and sinks it to a Kafka topic.

Note For more details, see PostgreSQL.

Defining a DeltaStream Stream on a PostgreSQL Table

In this step, you create a stream called pageviews_cdc that is backed by data in a PostgreSQL table. This stream represents change data capture (CDC) events from the PostgreSQL table.

Note DeltaStream uses Debezium to capture changes in a source relation table. To learn more about how CDC works with DeltaStream, see Change Data Capture (CDC).
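Each CDC event follows the Debezium change event envelope. As an illustration, an insert into the pageviews table might produce an event shaped roughly like this (the field values here are hypothetical):

```json
{
  "op": "c",
  "ts_ms": 1712345678901,
  "before": null,
  "after": { "viewtime": 1712345678000, "userid": "User_1", "pageid": "Page_23" },
  "source": { "db": "demo", "table": "pageviews", "lsn": 37094416 }
}
```

Here op is 'c' for create (insert), 'u' for update, and 'd' for delete; before and after hold the row images, with before null for inserts and after null for deletes.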

First, print the data for your source, which is the pageviews PostgreSQL table. To print sample rows from the table in DeltaStream, inspect your data store and navigate to the table you wish to print. (For more details, see PostgreSQL).

Below is an example of how to create a stream on your pageviews data. The fields match the Debezium standard; any insert, delete, or update to the pageviews table becomes an event for your pageviews_cdc stream.

CREATE STREAM pageviews_cdc(
  op VARCHAR,
  ts_ms BIGINT,
  `before` STRUCT<viewtime BIGINT, userid VARCHAR, pageid VARCHAR>, 
  `after`  STRUCT<viewtime BIGINT, userid VARCHAR, pageid VARCHAR>, 
  `source` STRUCT<db VARCHAR, `table` VARCHAR, `lsn` BIGINT>)
WITH (
  'store'='psql_store', 
  'value.format'='json',
  'postgresql.db.name'='demo',
  'postgresql.schema.name'='public',
  'postgresql.table.name'='pageviews');

Write a CSAS (CREATE STREAM AS SELECT) Query to Sink Data into Kafka

  1. In the lefthand navigation, click Workspace ( ).

  2. In the SQL pane of your workspace, write the CREATE STREAM AS SELECT (CSAS) query to ingest from pageviews_cdc and output to a new stream named pageviews_cdc_sink. To represent a feed of upsert events, this query filters for records whose op field is 'c' (create) or 'u' (update).

CREATE STREAM pageviews_cdc_sink WITH (
  'store' = 'kafka_store',
  'topic' = 'pageviews_cdc_sink',
  'topic.partitions' = 1,
  'topic.replicas' = 3) AS
SELECT
  *
FROM pageviews_cdc WITH ('postgresql.slot.name'='ds_cdc_demo')
WHERE op = 'c' OR op = 'u';
  3. Click Run.

  4. In the lefthand navigation, click Queries to view existing queries, including the CSAS query from step 2, above. Important It can take a small amount of time for the query to transition into the Running state. Refresh your screen occasionally until you see the query transition into the Running state.

  5. Verify that the query is working properly. To do this, write an interactive SELECT query.
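For example, an interactive query along these lines samples events arriving in the sink stream (the stream name matches the CSAS query above):

```sql
-- Interactively sample records from the sink stream to confirm
-- CDC events are flowing through the pipeline.
SELECT * FROM pageviews_cdc_sink;
```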

Verifying a Query in Postgres
