Enabling Private Link Connectivity to Confluent Enterprise Cluster and Schema Registry

Below is the procedure for creating private links for the data you’re streaming via DeltaStream. It includes the configuration needed within your Kafka cluster (in this case, Confluent Enterprise).

Note: Currently DeltaStream supports private links only in AWS.

Before You Begin

  • Review Introducing DeltaStream Private Links.

  • You must have signed up with DeltaStream and created at least one organization. Private links function within the context of a specific DeltaStream organization.

  • Download the DeltaStream CLI if you don’t already use it. Currently you cannot create private links via the DeltaStream UI.

  • Request DeltaStream Operations to enable private link connectivity for your account.

  • You must have the AWS account number of the DeltaStream platform that sends private link connectivity requests to your Kafka data stores. Contact DeltaStream support to obtain this number.

  • Optionally, work with DeltaStream Operations if you wish to run all your queries (that is, your stream processing) within a dedicated AWS data plane. By default, all DeltaStream customer queries run in a shared multi-tenant data plane, with network policies isolating traffic between customers. A dedicated data plane separates your workload from other DeltaStream customers by using fully isolated compute and VPC networking resources.

Note: There are separate but related instructions for creating private links for a Confluent Cloud dedicated cluster, AWS Managed Kafka (MSK), and Postgres RDS.

Creating an Environment in Confluent

  1. From your Confluent Console, navigate to Environments and click Create Environment. The Create Environment window opens. Click Advanced.

  2. The Create Cluster screen displays. Enter a cluster name. Then:

    1. For Cluster Type, click Enterprise.

    2. For Provider and region, click AWS. Then click the Region down arrow and select the region you need.

  3. Scroll down the page for more choices:

    1. In the Uptime SLA section, click 99.9% (if you’re testing; you may prefer 99.99% for production instances).

    2. In the Networking section, click Private.

    3. Leave the network configuration as is for now.

    4. Turn on the Resource metadata access slider. This setting lets you verify your connectivity later by checking the topics coming into your cluster.

  4. Click Launch Cluster. The Cluster details screen displays, indicating you have not yet completed your setup.

  5. In the righthand column, click Create a PrivateLink configuration.

  6. The Add Network Configuration screen displays. Enter the provider and region once again, and enter a network name. Then click Continue. The Enterprise cluster details page displays again.

  7. Click to activate the Network management tab.

  8. Click the network name link. The network details page displays.

  9. Note down the PrivateLink Service ID; this tutorial refers to it as $$YOUR_ENDPOINT_SERVICE. You use this value a few steps later, when you create the private link in the DeltaStream CLI.

  10. Click + Create access point. The Create access point screen displays, overlaid on the network details page.

Note: The Create Access Point screen asks for a VPC Interface Endpoint ID in step 4 of that screen. To get this ID, follow the first two steps of the DeltaStream CLI procedure below, copy the VPC Endpoint ID from the command output, and then return to the Create Access Point screen to paste it in.

Creating the Private Link in the DeltaStream CLI

This procedure involves building a SQL statement. When you complete and run the statement, DeltaStream processes the link request automatically. Note that the private link is not established until it is accepted or approved by administrators from your organization who are responsible for maintaining Kafka stores.

  1. From the DeltaStream CLI, issue the following SQL statement to create a private link for both the enterprise cluster and access to the schema registry. Paste the PrivateLink Service ID you noted earlier in place of $$YOUR_ENDPOINT_SERVICE.

CREATE AWS PRIVATE LINK confluentent
  WITH (
    'access_region' = "AWS us-east-1",
    'private_link.target_type' = CONFLUENT_KAFKA,
    'private_link.service_name' = '$$YOUR_ENDPOINT_SERVICE',
    'private_link.hosts' = (
      '*.us-east-1.aws.private.confluent.cloud:9092' USING PORT 9092 IN '*',
      '*.us-east-1.aws.private.confluent.cloud:443' USING PORT 443 IN '*'
    )
  );
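Here, the first host entry (port 9092) covers the Kafka broker endpoints, while the second (port 443) covers the HTTPS endpoint used by the schema registry; this is how a single private link serves both the enterprise cluster and the schema registry.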

  2. Verify the status of the private link by typing LIST AWS PRIVATE LINKS. When the link status changes to ready, copy and store the VPC Endpoint ID that DeltaStream created; you paste it into the Confluent Create access point screen in a later step.
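For reference, the command as entered at the CLI prompt:

LIST AWS PRIVATE LINKS;

Sample output: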
ID                      | b9bb786e-2dac-4275-8194-4f72b424414c
Name                    | confluentent
Target Type             | confluent_kafka
Service Name            | com.amazonaws.vpce.us-east-1.vpce-svc-0b01079b35b08bb30
Status                  | ready
Messages                | Current
Vpc Endpoint Id         | vpce-087b9c93bee5343b0
Discovery Iam Role Arn  | arn:aws:iam::792739327446:role/pl-xg5xq3rnvrbhlamuj5zlijcbjq
Created At              | 2025-01-21 21:20:24.037 +0000 UTC
Updated At              | 2025-01-21 21:23:59.322 +0000 UTC
Deleted At              | <null>
  3. Go back to the Confluent console and return to the Create access point screen. Paste the VPC Endpoint ID you copied into the box in Step 4 of this screen.

  4. Enter a name for this access point.

  5. Click Create access point. The status of the access point displays as provisioning.

  6. Return to the DeltaStream CLI and again type LIST AWS PRIVATE LINKS. The link should display as ready within 1-3 minutes.

ID                      | b9bb786e-2dac-4275-8194-4f72b424414c
Name                    | confluentent
Target Type             | confluent_kafka
Service Name            | com.amazonaws.vpce.us-east-1.vpce-svc-0b01079b35b08bb30
Status                  | ready
Messages                | Current
Vpc Endpoint Id         | vpce-087b9c93bee5343b0
Discovery Iam Role Arn  | arn:aws:iam::792739327446:role/pl-xg5xq3rnvrbhlamuj5zlijcbjq
Created At              | 2025-01-21 21:20:24.037 +0000 UTC
Updated At              | 2025-01-21 21:23:59.322 +0000 UTC
Deleted At              | <null>

The private link is now ready for you to test the connection.

Testing the Private Link and Schema Registry Connection

  1. Return to the Confluent Cloud environment. Navigate to the enterprise cluster details page, and verify the newly-created cluster is running.

  2. Navigate to the API Keys section of the cluster. Then click Create Key.

  3. In the Select account for API Key screen, click My account.

  4. Click Next, and then download and store the newly-created access key file.

  5. Open the downloaded key-secret file. It should resemble the following:

=== Confluent Cloud API key ===
API key:
KEY....
API secret:
SECRET....
Resource:
lkc-p3nm1m
Bootstrap server:
lkc-myenterprise-east-1.aws.private.confluent.cloud:9092
  6. Return to the enterprise cluster details page. In the righthand column, toward the bottom, copy the Schema registry private endpoint ID.

  7. Create a schema registry credential. To do this, at the bottom right of the cluster details screen, click + Add Key.

    • This is separate from the Confluent cluster API key you downloaded earlier.

  8. Download the API key and the Secret.

Creating a schema registry endpoint and a new data store to connect to the Confluent enterprise cluster

You do this from the DeltaStream UI. (A rough CLI equivalent is sketched after the steps below.)

  1. Create a schema registry from the DeltaStream web console. To do this, open DeltaStream and navigate to the Resources page.

  2. Click to activate the Schema Registries tab. Then click + Add Schema Registry.

  3. When the Add Schema Registry window opens, enter the desired information.

  4. In the Add One Or More URIs To Connect box, paste in the schema registry private endpoint ID you copied earlier.

  5. Paste in the schema registry API Key and Secret.

  6. Click Add.

7. Return to the Resources page and verify the schema registry is in the ready status.

8. Click to activate the Stores tab and create a new data store. To do this, click + Add Store. The Add Store window opens. Enter the required information:

  • Type in a name for the store.

  • In the Add One Or More URLs To Connect box, paste the Bootstrap Servers.

  • In the Schema Registry box, specify the schema registry you created above.

9. Enter the API Key and Secret.

10. Click Add. The new store transitions to the Ready state in 1-2 minutes.
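If you prefer the CLI to the web UI, you can define the same two resources with CREATE SCHEMA_REGISTRY and CREATE STORE. The sketch below is illustrative only: the object names (confluent_sr, confluent_enterprise_store) are placeholders, and the property keys are assumptions rather than a verified list, so confirm them against the CREATE SCHEMA_REGISTRY and CREATE STORE references before running.

-- Illustrative sketch only; names, endpoints, and property keys are placeholders.
CREATE SCHEMA_REGISTRY confluent_sr WITH (
  'type' = CONFLUENT_CLOUD,
  'access_region' = "AWS us-east-1",
  'uris' = 'https://<schema-registry-private-endpoint>',
  'confluent_cloud.key' = '<SR_API_KEY>',
  'confluent_cloud.secret' = '<SR_API_SECRET>'
);

CREATE STORE confluent_enterprise_store WITH (
  'type' = KAFKA,
  'access_region' = "AWS us-east-1",
  'uris' = '<bootstrap-server-from-key-file>:9092',
  'kafka.sasl.hash_function' = PLAIN,
  'kafka.sasl.username' = '<CLUSTER_API_KEY>',
  'kafka.sasl.password' = '<CLUSTER_API_SECRET>',
  'schema_registry.name' = 'confluent_sr'
);

Either way, the store should reach the Ready state once it can resolve the brokers through the private link.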

Verifying the connection to the enterprise cluster

To do this, start by adding topics to your new store.

  1. When the Resources page redisplays, click the name of the store you just created. The Store details page displays.

  2. Click Add Topic to create a new topic. The Add Topic window opens. In here:

    1. Enter a name for the topic.

    2. In the Number of Partitions box, type 1.

    3. In the Number of Replicas box, type 3.

  3. Click Add.

The new topic displays.

  4. Return to the Confluent Cloud dashboard to review the enterprise cluster metadata. To do this:

    • Navigate to the cluster details page, and then click Topics.

The newly-created topic should display on your Confluent Cloud console.
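Optionally, you can also cross-check from the DeltaStream CLI. The sketch below assumes the new store is your session's current store; options for scoping the command to a specific store are covered in the LIST ENTITIES reference.

LIST ENTITIES;

The new topic should appear in the returned list of entities.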

This completes the verification process for your enterprise cluster.

Now verify the schema registry. To do this, create a changelog in Avro format from any existing relation or stream within DeltaStream; a sketch follows.
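For example, here is a minimal sketch of such a changelog, assuming an existing stream named pageviews and a store named confluent_enterprise_store (both names are placeholders). The property keys ('store', 'value.format') are best-guess assumptions, so verify them against the CREATE CHANGELOG AS SELECT reference:

-- Illustrative sketch only; relation, store, and property names are placeholders.
CREATE CHANGELOG pageview_counts_avro
  WITH (
    'store' = 'confluent_enterprise_store',
    'value.format' = 'avro'
  ) AS
  SELECT userid, COUNT(*) AS view_count
  FROM pageviews
  GROUP BY userid;

If the query launches and the changelog's Avro schema appears in your Confluent schema registry, the private schema registry connection is working.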
