CREATE STORE
Syntax
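The statement takes a Store name and a WITH clause of Store Parameters. A sketch of the general form, reconstructed from the arguments described below:

```sql
CREATE STORE store_name
WITH (store_parameter = value [, ...]);
```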
Description
DeltaStream processes streaming data stored in streaming Stores such as Apache Kafka and Amazon Kinesis. To access such data, the first step is to connect to the data store. This is done with the CREATE STORE
statement, which defines a new Store with the connection details needed to reach a remote data source. Currently DeltaStream supports Kafka (Confluent Cloud, Amazon MSK, Redpanda, etc.) and Amazon Kinesis. Support for other streaming stores, such as Google Pub/Sub and Apache Pulsar, is coming soon.
Stores can only be created by a Role with the CREATE_STORE privilege.
Arguments
store_name
Name of the Store to define. For case-sensitive names, the name must be wrapped in double quotes; otherwise, the lowercased name is used.
WITH (store_parameter = value [, … ])
This clause specifies Store Parameters.
Store Parameters
Parameter Name | Description
---|---
`type` | Specifies the Store type. Required: Yes. Valid values: KAFKA, KINESIS, SNOWFLAKE, DATABRICKS, POSTGRESQL.
 | Specifies the region of the Store. To improve latency and reduce data transfer costs, this should be the same cloud and region that the physical Store runs in. Required: Yes, unless specified in the properties file.
 | Comma-separated list of endpoints for connecting to the Store. Required: Yes, unless specified in the properties file.
 | Specifies whether the Store should be accessed over TLS. Required: No. Type: Boolean.
 | Specifies whether the server CNAME should be validated against the certificate. Required: No.
 | Path to a CA certificate file in PEM format. Required: No. Default value: public CA chains. Type: String. Valid values: path to an SSL certificate in PEM format.
 | Comma-separated list of cipher suites to use when establishing a TLS connection. Required: No. Default value: [] (all supported cipher suites are enabled). Type: List. Valid values: full cipher suite names describing algorithm content.
 | Comma-separated list of TLS protocol versions to use when establishing a TLS connection. Required: No.
 | Name of a Schema Registry to associate with the Store. A Schema Registry must first be created using the CREATE SCHEMA_REGISTRY DDL statement. Only one Schema Registry can be associated with a Store. Required: No. Default value: None. Type: String. Valid values: see LIST SCHEMA_REGISTRIES.
 | The file path to a YAML file containing any store parameter. Required: No. Default value: None. Type: String. Valid values: file path in the current user's filesystem.
Kafka Specific Parameters
Parameters to be used if type is KAFKA:
Parameter Name | Description
---|---
 | SASL hash function to use when authenticating with Apache Kafka brokers. Required: No.
 | Username to use when authenticating with Apache Kafka brokers. Required: Yes, if the SASL hash function requires credentials. Default value: None. Type: String.
 | Password to use when authenticating with Apache Kafka brokers. Required: Yes, if the SASL hash function requires credentials. Default value: None. Type: String.
 | AWS region to use when authenticating with MSK. Required: Yes, if authenticating with MSK using IAM.
 | AWS IAM role ARN to use when authenticating with MSK. Required: Yes, if authenticating with MSK using IAM.
 | Path to a client certificate file in PEM format. Required: Yes, if TLS client authentication is used.
 | Path to the client key file in PEM format. Required: Yes, if TLS client authentication is used.
Kinesis Specific Parameters
Parameters to be used if type is KINESIS:
Parameter Name | Description
---|---
 | AWS IAM role ARN to use when authenticating with an Amazon Kinesis service. Required: Yes, unless authenticating with the Amazon Kinesis service using static AWS credentials. Default value: None. Type: String.
 | AWS IAM access key to use when authenticating with an Amazon Kinesis service. Required: Yes, if authenticating with the Amazon Kinesis service using static AWS credentials. Default value: None. Type: String.
 | AWS IAM secret access key to use when authenticating with an Amazon Kinesis service. Required: Yes, if authenticating with the Amazon Kinesis service using static AWS credentials. Default value: None. Type: String.
Snowflake Specific Parameters
Parameters to be used if type is SNOWFLAKE:
Parameter Name | Description
---|---
 | Snowflake account identifier assigned to the Snowflake account. Required: Yes. Default value: None. Type: String.
 | Snowflake cloud region name where the account resources operate. Required: Yes. Default value: None. Type: String.
 | Access control role to use for Store operations after connecting to Snowflake. Required: Yes. Default value: None. Type: String.
 | User login name for the Snowflake account. Required: Yes. Default value: None. Type: String.
 | Warehouse name to use for queries and other Store operations that require compute resources. Required: Yes. Default value: None. Type: String.
 | Path to the Snowflake account's private key in PEM format.
 | Passphrase for decrypting the Snowflake account's private key. Required: No. Default value: None. Type: String.
Databricks Specific Parameters
Parameters to be used if type is DATABRICKS:
Parameter Name | Description
---|---
 | Databricks personal access token used when authenticating with a Databricks workspace. Required: Yes. Default value: None. Type: String.
 | The identifier for a Databricks SQL Warehouse belonging to a Databricks workspace. This Warehouse is used to create and query Tables in Databricks. Required: Yes. Default value: None. Type: String.
 | The port for a Databricks SQL Warehouse belonging to a Databricks workspace. Required: No. Default value: 443. Type: Integer.
 | AWS access key ID used for writing data to S3. Required: Yes. Default value: None. Type: String.
 | AWS secret access key used for writing data to S3. Required: Yes. Default value: None. Type: String.
 | The AWS S3 bucket that CREATE TABLE AS SELECT queries will write data to. Required: Yes. Default value: None. Type: String.
 | The cloud region that the S3 bucket resides in.
PostgreSQL Specific Parameters
Parameter Name | Description
---|---
 | Username to connect to the database instance specified by the Store's connection URI. Default value: None. Type: String.
 | Password to connect to the database instance specified by the Store's connection URI. Default value: None. Type: String.
Examples
Create a Kafka Store with credentials
The following creates a new Kafka Store named my_kafka_store:
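A sketch of such a statement; the parameter names below (`access_region`, `uris`, `kafka.sasl.*`) follow the Store Parameters tables above but are illustrative, since the original listing is not reproduced here:

```sql
CREATE STORE my_kafka_store WITH (
  'type' = KAFKA,
  'access_region' = "AWS us-east-1",                -- parameter name assumed
  'uris' = 'broker-1.example.com:9092,broker-2.example.com:9092',
  'kafka.sasl.hash_function' = PLAIN,               -- parameter name assumed
  'kafka.sasl.username' = 'my_username',
  'kafka.sasl.password' = 'my_password'
);
```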
Create an MSK Store with IAM credentials
The following creates a new Kafka Store named my_kafka_store:
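A sketch of an MSK Store authenticating with IAM; the IAM-related parameter names and the hash-function value are illustrative, since the original listing is not reproduced here:

```sql
CREATE STORE my_kafka_store WITH (
  'type' = KAFKA,
  'access_region' = "AWS us-east-1",                        -- parameter name assumed
  'uris' = 'b-1.mymsk.example.amazonaws.com:9098',
  'kafka.sasl.hash_function' = AWS_MSK_IAM,                 -- value assumed
  'kafka.msk.aws_region' = 'us-east-1',                     -- parameter name assumed
  'kafka.msk.iam_role_arn' = 'arn:aws:iam::123456789012:role/example-role'
);
```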
Create a Kafka Store with credentials from a file
The following creates a new Kafka Store named MyKafkaStore:
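A sketch of a Store whose connection details come from a properties file; the `properties.file` parameter name and file path are illustrative, and the file path must carry the prefix described in the Store Parameters table:

```sql
CREATE STORE "MyKafkaStore" WITH (
  'type' = KAFKA,
  'access_region' = "AWS us-east-1",                 -- parameter name assumed
  'properties.file' = '/path/to/store_properties.yaml'  -- name assumed; YAML supplies uris, credentials, etc.
);
```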
Create a Kinesis Store with IAM credentials
The following statement creates a new Kinesis Store named my_kinesis_store:
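A sketch of a Kinesis Store authenticating with an IAM role; the parameter names are illustrative, since the original listing is not reproduced here:

```sql
CREATE STORE my_kinesis_store WITH (
  'type' = KINESIS,
  'access_region' = "AWS us-east-1",                        -- parameter name assumed
  'uris' = 'https://kinesis.us-east-1.amazonaws.com',
  'kinesis.iam_role_arn' = 'arn:aws:iam::123456789012:role/example-role'  -- name assumed
);
```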
Create a Kinesis Store with static credentials
The following statement creates a new Kinesis Store named my_kinesis_store:
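A sketch of the static-credentials variant; the access-key parameter names are illustrative, since the original listing is not reproduced here:

```sql
CREATE STORE my_kinesis_store WITH (
  'type' = KINESIS,
  'access_region' = "AWS us-east-1",                 -- parameter name assumed
  'uris' = 'https://kinesis.us-east-1.amazonaws.com',
  'kinesis.access_key_id' = 'my_access_key_id',      -- parameter name assumed
  'kinesis.secret_access_key' = 'my_secret_access_key'
);
```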
Create a Kafka Store with a Schema Registry
The following statement creates a new Kafka Store with a Schema Registry named sr. Note that the store name is case-sensitive and thus has quotes around it:
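A sketch of the statement; the store name and the `schema_registry.name` parameter name are illustrative, since the original listing is not reproduced here:

```sql
CREATE STORE "MyKafkaStore" WITH (      -- hypothetical case-sensitive name
  'type' = KAFKA,
  'access_region' = "AWS us-east-1",    -- parameter name assumed
  'uris' = 'broker-1.example.com:9092',
  'schema_registry.name' = 'sr'         -- parameter name assumed; registry created via CREATE SCHEMA_REGISTRY
);
```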
Create a Confluent Kafka Store with credentials
The following creates a new Confluent Cloud Kafka Store with the case-sensitive name ConfluentCloudKafkaStore:
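A sketch of a Confluent Cloud Store authenticating with an API key and secret over SASL; the parameter names and bootstrap endpoint are illustrative:

```sql
CREATE STORE "ConfluentCloudKafkaStore" WITH (
  'type' = KAFKA,
  'access_region' = "AWS us-east-1",                          -- parameter name assumed
  'uris' = 'pkc-12345.us-east-1.aws.confluent.cloud:9092',    -- hypothetical bootstrap endpoint
  'kafka.sasl.hash_function' = PLAIN,                         -- parameter name assumed
  'kafka.sasl.username' = 'confluent_api_key',
  'kafka.sasl.password' = 'confluent_api_secret'
);
```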
Create a Snowflake Store
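A sketch of a Snowflake Store, mapping the Snowflake-specific parameters described above to a statement; all `snowflake.*` parameter names are illustrative, since the original listing is not reproduced here:

```sql
CREATE STORE my_snowflake_store WITH (
  'type' = SNOWFLAKE,
  'uris' = 'https://my-account.snowflakecomputing.com',  -- hypothetical account URL
  'snowflake.account_id' = 'myorg-myaccount',            -- parameter names below assumed
  'snowflake.cloud.region' = 'us-east-1',
  'snowflake.role_name' = 'MY_ROLE',
  'snowflake.username' = 'MY_USER',
  'snowflake.warehouse_name' = 'MY_WAREHOUSE',
  'snowflake.client.key_file' = '/path/to/rsa_key.pem'
);
```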
Create a Snowflake Store with client key passphrase
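The same sketch with an encrypted private key, adding a passphrase parameter for decrypting it; the parameter names remain illustrative:

```sql
CREATE STORE my_snowflake_store WITH (
  'type' = SNOWFLAKE,
  'uris' = 'https://my-account.snowflakecomputing.com',  -- hypothetical account URL
  'snowflake.account_id' = 'myorg-myaccount',            -- parameter names below assumed
  'snowflake.cloud.region' = 'us-east-1',
  'snowflake.role_name' = 'MY_ROLE',
  'snowflake.username' = 'MY_USER',
  'snowflake.warehouse_name' = 'MY_WAREHOUSE',
  'snowflake.client.key_file' = '/path/to/rsa_key.p8',
  'snowflake.client.key_passphrase' = 'my_passphrase'    -- decrypts the private key
);
```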
Create a Databricks Store
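A sketch of a Databricks Store, covering the workspace token, SQL Warehouse, and the S3 bucket that CREATE TABLE AS SELECT queries write to; all parameter names are illustrative, since the original listing is not reproduced here:

```sql
CREATE STORE my_databricks_store WITH (
  'type' = DATABRICKS,
  'uris' = 'https://dbc-12345678-90ab.cloud.databricks.com',  -- hypothetical workspace URL
  'databricks.app_token' = 'my_personal_access_token',        -- parameter names below assumed
  'databricks.warehouse_id' = 'my_warehouse_id',
  'aws.access_key_id' = 'my_access_key_id',
  'aws.secret_access_key' = 'my_secret_access_key',
  'databricks.cloud.s3.bucket' = 'my-s3-bucket',
  'databricks.cloud.region' = 'AWS us-west-2'
);
```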
Create a PostgreSQL Store
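A sketch of a PostgreSQL Store; the POSTGRESQL type value, the connection URI, and the credential parameter names are illustrative, since the original listing is not reproduced here:

```sql
CREATE STORE my_postgresql_store WITH (
  'type' = POSTGRESQL,                                   -- value assumed
  'uris' = 'postgresql://mydb.example.com:5432/demo',    -- hypothetical connection URI
  'postgres.username' = 'my_user',                       -- parameter names assumed
  'postgres.password' = 'my_password'
);
```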