# CREATE STORE

## Syntax <a href="#synopsis" id="synopsis"></a>

```sql
CREATE STORE
    store_name
WITH (store_parameter = value [, ...]);
```

## Description <a href="#description" id="description"></a>

DeltaStream processes streaming data stored in streaming stores such as Apache Kafka and Amazon Kinesis. The first step toward accessing this data is to connect to such a store, which you do with the `CREATE STORE` statement.

{% hint style="info" %}
**Notes**  &#x20;

Currently DeltaStream supports Kafka (Confluent Cloud, Amazon MSK, Redpanda, and more) and Amazon Kinesis. Support for additional streaming stores such as Google Pub/Sub and Apache Pulsar is coming soon.\
\
DeltaStream also provides non-streaming store support for ClickHouse, Databricks, Iceberg AWS Glue Catalog, Iceberg REST Catalog, S3, and PostgreSQL.\
\
Each non-streaming store has its own set of limitations, described in the sections below.
{% endhint %}

Only a [role](https://docs.deltastream.io/overview/core-concepts/access-control#_role) with [`CREATE_STORE`](https://docs.deltastream.io/overview/core-concepts/access-control#privilege) privilege can create a store.

### Arguments

#### store\_name

Name of the store to define. If the name is case sensitive, you must wrap it in double quotes; otherwise the system uses the lowercase version of the name.

#### WITH (store\_parameter = value \[, …​ ])

This clause specifies [#store\_parameters](#store_parameters "mention").

### Store Parameters <a href="#store_parameters" id="store_parameters"></a>

<table><thead><tr><th width="387">Parameter Name</th><th>Description</th></tr></thead><tbody><tr><td><code>type</code></td><td><p>Specifies the store type.</p><p><br><strong>Required:</strong> Yes</p><p><strong>Type:</strong> <code>STORE_TYPE</code></p><p><strong>Valid values:</strong> <code>KAFKA</code>, <code>KINESIS</code>, <code>SNOWFLAKE</code>, <code>DATABRICKS</code>, <code>CLICKHOUSE</code>, <code>POSTGRESQL</code>, <code>S3</code>, <code>ICEBERG_GLUE</code>, <code>ICEBERG_REST</code></p></td></tr><tr><td><code>uris</code></td><td><p>List of comma-separated <code>host:port</code> URIs to connect to the store.<br></p><p><strong>Required:</strong> Yes, unless specified in <code>properties.file</code>.<br><strong>Type:</strong> String</p></td></tr><tr><td><code>tls.disabled</code></td><td><p>Specifies whether to disable TLS when accessing the store.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> <code>FALSE</code></p><p><strong>Type:</strong> Boolean</p><p><strong>Valid values:</strong> <code>TRUE</code> or <code>FALSE</code></p></td></tr><tr><td><code>tls.verify_server_hostname</code></td><td><p>Specifies if the server CNAME should be validated against the certificate.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> <code>TRUE</code><br><strong>Type:</strong> Boolean<br><strong>Valid values:</strong> <code>TRUE</code> or <code>FALSE</code></p></td></tr><tr><td><code>tls.ca_cert_file</code></td><td><p>Path to a CA certificate file in PEM format. Preface the path with <code>@</code> for the file to be uploaded to the server.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> Public CA chains.<br><strong>Type:</strong> String<br><strong>Valid values:</strong> Path to an SSL certificate in PEM format</p></td></tr><tr><td><code>tls.cipher_suites</code></td><td><p>Comma-separated list of cipher suites to use when establishing a TLS connection.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> [] (all supported cipher suites are enabled)<br><strong>Type:</strong> List<br><strong>Valid values:</strong> Full cipher suite names describing algorithm content</p></td></tr><tr><td><code>tls.protocols</code></td><td><p>Comma-separated list of TLS protocol versions to use when establishing a TLS connection.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> <code>TLSv1.2,TLSv1.1,TLSv1</code><br><strong>Type:</strong> List<br><strong>Valid values:</strong> TLS protocols with version</p></td></tr><tr><td><code>schema_registry.name</code></td><td><p>Name of a schema registry to associate with the store. You must first create a schema registry using the <a data-mention href="create-schema_registry">create-schema_registry</a> DDL statement. Only one schema registry can be associated with a store.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Valid values:</strong> See <a data-mention href="../command/list-schema_registries">list-schema_registries</a></p></td></tr><tr><td><code>properties.file</code></td><td><p>The file path to a .yaml file containing any store parameter. Preface the path with <code>@</code> for the file to be uploaded to the server.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Valid values:</strong> File path in current user's filesystem</p></td></tr></tbody></table>

### **ClickHouse-Specific Parameters**

Parameters to be used if `type` is `CLICKHOUSE`:

| Parameter Name        | Description                                                                                                                                                                                                                |
| --------------------- | -------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `clickhouse.username` | <p>Username to connect to the database instance specified with the store's <code>uris</code> parameter.<br><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</p> |
| `clickhouse.password` | <p>Password to connect to the database instance as the user given by the store's <code>clickhouse.username</code> parameter.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>   |

### **Databricks-Specific Parameters**

Parameters to be used if `type` is `DATABRICKS`:

<table><thead><tr><th width="390">Parameter Name</th><th>Description</th></tr></thead><tbody><tr><td><code>databricks.app_token</code></td><td>Databricks personal access token used when authenticating with a Databricks workspace.<br><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</td></tr><tr><td><code>databricks.warehouse_id</code></td><td><p>The identifier for a Databricks SQL warehouse belonging to a Databricks workspace. This warehouse is used to create and query tables in Databricks.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p></td></tr><tr><td><code>databricks.warehouse_port</code></td><td><p>The port for a Databricks SQL warehouse belonging to a Databricks workspace.<br><br><strong>Required:</strong> No</p><p><strong>Default value:</strong> 443<br><strong>Type:</strong> Integer</p></td></tr><tr><td><code>aws.access_key_id</code></td><td><p>AWS access key ID used for writing data to S3.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p></td></tr><tr><td><code>aws.secret_access_key</code></td><td><p>AWS secret access key used for writing data to S3.</p><p><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</p></td></tr><tr><td><code>databricks.cloud.s3.bucket</code></td><td>The AWS S3 bucket that <a data-mention href="../query/create-table-as">create-table-as</a> queries will write data to.<br><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</td></tr><tr><td><code>databricks.cloud.region</code></td><td><p>The cloud region that the <code>databricks.cloud.s3.bucket</code> belongs to.<br><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</p><p><strong>Valid values:</strong></p><ul><li><code>AWS us-east-1</code></li><li><code>AWS us-east-2</code></li><li><code>AWS us-west-1</code></li><li><code>AWS us-west-2</code></li></ul></td></tr></tbody></table>

### **Iceberg REST Catalog-Specific Parameters**

Parameters to be used if `type` is `ICEBERG_REST`:

| Parameter Name               | Description                                                                                                                                                                         |
| ---------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `iceberg.rest.client_id`     | <p>Client ID used when authenticating with the Iceberg REST catalog.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>     |
| `iceberg.rest.client_secret` | <p>Client secret used when authenticating with the Iceberg REST catalog.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p> |
| `iceberg.rest.client_scope`  | <p>Client scope used when authenticating with the Iceberg REST catalog.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>  |
| `iceberg.catalog.id`         | <p>Name of the Iceberg catalog.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>                                          |

### **Kafka-Specific Parameters**

Parameters to be used if `type` is `KAFKA`:

<table><thead><tr><th width="384">Parameter Name</th><th>Description</th></tr></thead><tbody><tr><td><code>kafka.sasl.hash_function</code></td><td><p>SASL hash function to use when authenticating with Apache Kafka brokers.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> <code>NONE</code><br><strong>Type:</strong> <code>HASH_FUNCTION</code><br><strong>Valid values:</strong> <code>NONE</code>, <code>PLAIN</code>, <code>SHA256</code>, <code>SHA512</code>, and <code>AWS_MSK_IAM</code></p></td></tr><tr><td><code>kafka.sasl.username</code></td><td><p>Username to use when authenticating with Apache Kafka brokers.<br></p><p><strong>Required:</strong> Yes, if <code>kafka.sasl.hash_function</code> is not <code>NONE</code> or <code>AWS_MSK_IAM</code></p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p></td></tr><tr><td><code>kafka.sasl.password</code></td><td><p>Password to use when authenticating with Apache Kafka brokers.<br></p><p><strong>Required:</strong> Yes, if <code>kafka.sasl.hash_function</code> is not <code>NONE</code> or <code>AWS_MSK_IAM</code></p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p></td></tr><tr><td><code>kafka.msk.aws_region</code></td><td><p>AWS region to use when authenticating with MSK.</p><p><strong>Required:</strong> Yes, if <code>kafka.sasl.hash_function</code> is <code>AWS_MSK_IAM</code><br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Example:</strong> <code>us-east-1</code></p></td></tr><tr><td><code>kafka.msk.iam_role_arn</code></td><td><p>AWS IAM role ARN to use when authenticating with MSK.</p><p><strong>Required:</strong> Yes, if <code>kafka.sasl.hash_function</code> is <code>AWS_MSK_IAM</code><br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Example:</strong> <code>arn:aws:iam::123456789012:role/example-IAM-role</code></p></td></tr><tr><td><code>tls.client.cert_file</code></td><td><p>Path to a client certificate file in PEM format. Preface the path with <code>@</code> for the file to be uploaded to the server.<br></p><p><strong>Required:</strong> Yes, if <code>kafka.sasl.hash_function</code> is <code>SHA256</code> or <code>SHA512</code><br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Valid value:</strong> Path to an SSL certificate in PEM format</p></td></tr><tr><td><code>tls.client.key_file</code></td><td><p>Path to the client key file in PEM format. Preface the path with <code>@</code> for the file to be uploaded to the server.<br></p><p><strong>Required:</strong> Yes, if <code>kafka.sasl.hash_function</code> is <code>SHA256</code> or <code>SHA512</code><br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Valid value:</strong> Path to an SSL certificate in PEM format</p></td></tr></tbody></table>
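
The SASL and client-certificate parameters above combine as in the following sketch, which is illustrative only (broker URIs, credentials, and certificate paths are placeholders): it shows a store that authenticates with `SHA512` and presents a client certificate and key, as the table requires for that hash function.

```sql
CREATE STORE kafka_mtls_store
WITH (
    'type' = KAFKA,
    'uris' = 'kafka.broker1.url:9093,kafka.broker2.url:9093',
    'kafka.sasl.hash_function' = SHA512,
    'kafka.sasl.username' = 'example_username',
    'kafka.sasl.password' = 'example_password',
    'tls.client.cert_file' = '@/certs/client.cert.pem',
    'tls.client.key_file' = '@/certs/client.key.pem'
);
```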

### **Kinesis-Specific Parameters**

Parameters to be used if `type` is `KINESIS`:

<table><thead><tr><th width="388">Parameter Name</th><th>Description</th></tr></thead><tbody><tr><td><code>kinesis.iam_role_arn</code></td><td><p>AWS IAM role ARN to use when authenticating with an Amazon Kinesis service.</p><p><strong>Required:</strong> Yes, unless authenticating with the Amazon Kinesis Service using static AWS credentials<br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Example:</strong> <code>arn:aws:iam::123456789012:role/example-IAM-role</code></p></td></tr><tr><td><code>kinesis.access_key_id</code></td><td>AWS IAM access key to use when authenticating with an Amazon Kinesis service.<br><br><strong>Required:</strong> Yes, if authenticating with the Amazon Kinesis Service using static AWS credentials<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</td></tr><tr><td><code>kinesis.secret_access_key</code></td><td>AWS IAM secret access key to use when authenticating with an Amazon Kinesis service.<br><br><strong>Required:</strong> Yes, if authenticating with the Amazon Kinesis Service using static AWS credentials<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</td></tr></tbody></table>

### PostgreSQL-Specific Parameters

Parameters to be used if `type` is `POSTGRESQL`:

| Parameter Name         | Description                                                                                                                                                                                                                                                                                                                                                                               |
| ---------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| `postgres.username`    | <p>Username to connect to the database instance specified with the store's <code>uris</code> parameter.<br><br><strong>Required:</strong> Yes, if using user/pass authentication<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>                                                                                                                             |
| `postgres.password`    | <p>Password to connect to the database instance as the user given by the store's <code>postgres.username</code> parameter.<br><br><strong>Required:</strong> Yes, if using user/pass authentication</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>                                                                                                       |
| `tls.client.cert_file` | <p>Path to a client certificate file in PEM format. Preface the path with <code>@</code> for the file to be uploaded to the server.<br></p><p><strong>Required:</strong> Yes, if using TLS client certificate-based authentication<br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Valid value:</strong> Path to an SSL certificate in PEM format</p> |
| `tls.client.key_file`  | <p>Path to the client key file in PEM format. Preface the path with <code>@</code> for the file to be uploaded to the server.<br></p><p><strong>Required:</strong> Yes, if using TLS client certificate-based authentication<br><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Valid value:</strong> Path to an SSL certificate in PEM format</p>       |

{% hint style="warning" %}
To secure the PostgreSQL connection, DeltaStream requires you to set `tls.verify_server_hostname` to `TRUE` and `tls.disabled` to `FALSE`.
{% endhint %}
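
These two settings are the defaults, but spelling them out makes the requirement explicit. The following sketch uses placeholder URIs and credentials, and assumes boolean parameters take unquoted literals in the same way that `type` values do:

```sql
CREATE STORE pg_secure_store
WITH (
    'type' = POSTGRESQL,
    'uris' = 'postgresql://pg.example.com:5432/demo',
    'postgres.username' = 'user',
    'postgres.password' = 'password',
    'tls.disabled' = FALSE,
    'tls.verify_server_hostname' = TRUE
);
```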

#### Using Neon-Hosted PostgreSQL

If your PostgreSQL instance is hosted on [Neon](https://neon.com/), you must make the following adjustments to the Store URI at creation time for it to work properly:

1. **Explicitly include the port number in the URI.** Neon uses the default PostgreSQL port 5432, but it must be specified explicitly. Check [here](https://neon.com/docs/connect/query-with-psql-editor#what-port-does-neon-use) for details.
2. **Add the `endpoint` as a connection option** **in the Store URI.**

**Example**

```
// Neon connection URI (as provided by Neon):
postgresql://<neon-endpoint>.c-3.us-east-1.aws.neon.tech/<db-name>

// Store URI used when creating the Store
postgresql://<neon-endpoint>.c-3.us-east-1.aws.neon.tech:5432/<db-name>?options=endpoint%3D<neon-endpoint>
```
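
Since the rewrite is mechanical, it can help to script it. The helper below is a hypothetical sketch, not part of DeltaStream: it takes the first label of the Neon hostname as the endpoint ID, makes port 5432 explicit, and URL-encodes the `endpoint` connection option.

```python
from urllib.parse import quote, urlsplit

def neon_store_uri(neon_uri: str) -> str:
    """Rewrite a Neon connection URI into a DeltaStream Store URI:
    make port 5432 explicit and pass the endpoint ID as a connection option."""
    parts = urlsplit(neon_uri)
    endpoint = parts.hostname.split(".")[0]  # Neon endpoint ID is the first host label
    option = quote(f"endpoint={endpoint}")   # 'endpoint=...' -> 'endpoint%3D...'
    return f"{parts.scheme}://{parts.hostname}:5432{parts.path}?options={option}"
```

For example, `postgresql://ep-abc-123.c-3.us-east-1.aws.neon.tech/neondb` becomes `postgresql://ep-abc-123.c-3.us-east-1.aws.neon.tech:5432/neondb?options=endpoint%3Dep-abc-123`.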

### S3-Specific Parameters

For an S3 store, the `uris` parameter must specify the S3 URI of a bucket or folder whose contents form the data source for the Store. Other parameters to be used if `type` is `S3` are:

| Parameter Name          | Description                                                                                                                                                                                                                    |
| ----------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| `aws.iam_role_arn`      | <p>AWS IAM role ARN to use when authenticating with S3.<br><br><strong>Required:</strong> No</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>                                                                          |
| `aws.iam_external_id`   | <p>IAM external ID to use when assuming the role given by <code>aws.iam_role_arn</code>.<br><br><strong>Required:</strong> Yes, if <code>aws.iam_role_arn</code> is specified.</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p> |
| `aws.access_key_id`     | <p>AWS access key ID to use when authenticating with S3.<br><br><strong>Required:</strong> Yes, if authenticating using static AWS credentials.</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>                        |
| `aws.secret_access_key` | <p>AWS secret access key to use when authenticating with S3.<br><br><strong>Required:</strong> Yes, if authenticating using static AWS credentials.</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p>                    |

### Setting a Trust Relationship when Using Your IAM Role ARN and External ID

Before you can access a data store using `aws.iam_role_arn` and `aws.iam_external_id`, you must update the trust relationship of the role provided in `aws.iam_role_arn` to include a policy for the DeltaStream Platform Role. The DeltaStream Platform Role attempts to assume the provided role.\
\
The trust relationship JSON resembles the following:

```json
{
    "Sid": "DeltaStreamTrustPolicy",
    "Effect": "Allow",
    "Principal": {
        "AWS": [
            "arn:aws:iam::DELTASTREAM_DATAPLANE_AWS_ACCOUNT:role/DELTASTREAM_CROSS_ACCOUNT_IAM_ROLE_ARN"
        ]
    },
    "Action": "sts:AssumeRole",
    "Condition": {
        "StringEquals": {
            "sts:ExternalId": "EXTERNAL_ID"
        }
    }
}
```

In the trust policy you must substitute the following parameters:

* `DELTASTREAM_DATAPLANE_AWS_ACCOUNT`: Reach out to DeltaStream for the dataplane AWS account.
* `DELTASTREAM_CROSS_ACCOUNT_IAM_ROLE_ARN`: Reach out to DeltaStream for the dataplane-specific IAM role ARN.
* `EXTERNAL_ID`: This must match the provided `aws.iam_external_id`.

For further details on how to configure a trust relationship, please refer to the AWS guide: <https://docs.aws.amazon.com/directoryservice/latest/admin-guide/edit_trust.html>
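
To double-check the substitutions, the hypothetical helper below (illustrative only, not a DeltaStream or AWS API) assembles the statement from the three values, mirroring the JSON structure above:

```python
import json

def trust_policy_statement(dataplane_account: str,
                           cross_account_role: str,
                           external_id: str) -> str:
    """Build the DeltaStream trust-policy statement with the three
    substitutions applied, returned as pretty-printed JSON."""
    statement = {
        "Sid": "DeltaStreamTrustPolicy",
        "Effect": "Allow",
        "Principal": {
            # The DeltaStream Platform Role that will assume your role
            "AWS": [f"arn:aws:iam::{dataplane_account}:role/{cross_account_role}"]
        },
        "Action": "sts:AssumeRole",
        "Condition": {
            # Must match the aws.iam_external_id store parameter
            "StringEquals": {"sts:ExternalId": external_id}
        },
    }
    return json.dumps(statement, indent=4)
```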

### Snowflake-Specific Parameters

Parameters to be used if `type` is `SNOWFLAKE`:

<table><thead><tr><th width="393">Parameter Name</th><th>Description</th></tr></thead><tbody><tr><td><code>snowflake.account_id</code></td><td>Snowflake account identifier assigned to the Snowflake account.<br><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</td></tr><tr><td><code>snowflake.cloud.region</code></td><td><p>Name of the Snowflake cloud region in which the account resources operate.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String<br><strong>Valid values:</strong></p><ul><li><code>AWS us-east-1</code></li><li><code>AWS us-east-2</code></li><li><code>AWS us-west-1</code></li><li><code>AWS us-west-2</code></li></ul></td></tr><tr><td><code>snowflake.role_name</code></td><td><p>Access control role to use for the store operations after connecting to Snowflake.<br><br><strong>Required:</strong> Yes</p><p><strong>Default value:</strong> None<br><strong>Type:</strong> String</p></td></tr><tr><td><code>snowflake.username</code></td><td>User login name for the Snowflake account.<br><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</td></tr><tr><td><code>snowflake.warehouse_name</code></td><td>Warehouse name to use for queries and other store operations that require compute resources.<br><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</td></tr><tr><td><code>snowflake.client.key_file</code></td><td>Path to the Snowflake account's private key in PEM format. Preface the path with <code>@</code> for the file to be uploaded to the server.<br><br><strong>Required:</strong> Yes<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</td></tr><tr><td><code>snowflake.client.key_passphrase</code></td><td><p>Passphrase for decrypting the Snowflake account's private key.<br></p><p><strong>Required:</strong> No<br><strong>Default value:</strong> None<br><strong>Type:</strong> String</p></td></tr></tbody></table>

## Examples

#### Create a Kafka store with credentials

The following creates a new Kafka store with name `my_kafka_store`:

```sql
CREATE STORE 
    my_kafka_store 
WITH ( 
    'type' = KAFKA, 
    'uris' = 'kafka.broker1.url:9092,kafka.broker2.url:9092', 
    'tls.ca_cert_file' = '@/certs/us-east-1/self-signed-kafka-ca.crt'
);
```

#### Create an MSK store with IAM credentials

The following creates a new Amazon MSK store named `msk_iam_store` that authenticates using IAM:

```sql
CREATE STORE 
    msk_iam_store 
WITH ( 
    'type' = KAFKA, 
    'uris' = 'b-1.abc.abc.c10.kafka.us-east-1.amazonaws.com:9098,b-2.abc.abc.c10.kafka.us-east-1.amazonaws.com:9098,b-3.abc.abc.c10.kafka.us-east-1.amazonaws.com:9098', 
    'kafka.sasl.hash_function' = AWS_MSK_IAM,
    'kafka.msk.aws_region'='us-east-1',
    'kafka.msk.iam_role_arn'='arn:aws:iam::123456789012:role/example-IAM-role'
);
```

#### Create a Kafka store with credentials from a file

The following creates a new Kafka store with name `MyKafkaStore++`:

```sql
CREATE STORE 
    "MyKafkaStore++" 
WITH ( 
    'type' = KAFKA,
    'properties.file' = '@/User/user1/deltastream/kafka_store/properties.yaml'
);
```

```sh
$ cat /User/user1/deltastream/kafka_store/properties.yaml
uris: "http://uri1,http://uri2"
kafka.sasl.hash_function: PLAIN
kafka.sasl.username: "ABCDEFGH12345678"
kafka.sasl.password: "kafkasaslpassword"
```

#### Create a Kinesis store with IAM credentials

The following statement creates a new Kinesis store with name `my_kinesis_store`:

```sql
CREATE STORE 
    my_kinesis_store 
WITH ( 
    'type' = KINESIS, 
    'uris' = 'https://url.to.kinesis.aws:4566', 
    'kinesis.iam_role_arn' = 'arn:aws:iam::123456789012:role/example-IAM-role'
);
```

#### Create a Kinesis store with static credentials

The following statement creates a new Kinesis store with name `my_kinesis_store`:

```sql
CREATE STORE 
    my_kinesis_store 
WITH ( 
    'type' = KINESIS, 
    'uris' = 'https://url.to.kinesis.aws:4566', 
    'kinesis.access_key_id' = 'testkey', 
    'kinesis.secret_access_key' = 'testsecret'
);
```

#### Create a Kafka store with a schema registry

The following statement creates a new Kafka store with a schema registry named `sr`. Note that the store name is case-sensitive and thus has quotes around it:

```sql
CREATE STORE
    "kafkaStoreWithSR"
WITH (
    'type' = KAFKA, 
    'uris' = 'kafka.broker1.url:9092,kafka.broker2.url:9092', 
    'schema_registry.name' = sr
);
```

#### Create a Confluent Kafka store with credentials

The following creates a new Confluent Cloud Kafka store with the case-sensitive name `ConfluentCloudKafkaStore`:

```sql
CREATE STORE "ConfluentCloudKafkaStore" 
WITH ( 
    'type' = KAFKA,
    'uris' = 'abc-12345.us-east-1.aws.confluent.cloud:9092',
    'kafka.sasl.hash_function' = PLAIN,
    'kafka.sasl.username' = 'credentials_username',
    'kafka.sasl.password' = 'credentials_password'
);
```

#### Create a Snowflake store

```sql
CREATE STORE sf 
WITH ( 
    'type' = SNOWFLAKE,
    'uris' = 'https://my-account.snowflakecomputing.com',
    'snowflake.account_id' = 'my-account',
    'snowflake.role_name' = 'ACCOUNTADMIN',
    'snowflake.username' = 'STREAMING_USER',
    'snowflake.warehouse_name' = 'COMPUTE_WH',
    'snowflake.client.key_file' = '@/path/to/pk/my_account_rsa.p8'
);
```

#### Create a Snowflake store with client key passphrase

```sql
CREATE STORE sf 
WITH ( 
    'type' = SNOWFLAKE,
    'uris' = 'https://my-account.snowflakecomputing.com',
    'snowflake.account_id' = 'my-account',
    'snowflake.role_name' = 'ACCOUNTADMIN',
    'snowflake.username' = 'STREAMING_USER',
    'snowflake.warehouse_name' = 'COMPUTE_WH',
    'snowflake.client.key_file' = '@/path/to/pk/my_account_rsa.p8',
    'properties.file' = '@/path/to/deltastream/snowflake_store/properties.yaml'
);
```

```sh
$ cat /path/to/deltastream/snowflake_store/properties.yaml
snowflake.client.key_passphrase: "my$account$$key$$$phrase"
```

#### Create a Databricks store

```sql
CREATE STORE databricks_store WITH (
  'type' = DATABRICKS,
  'uris' = 'https://dbc-12345678-1234.cloud.databricks.com', 
  'databricks.app_token' = 'dapiabcdefghijklmnopqrstuvw123456789', 
  'databricks.warehouse_id' = 'abcdefgh1234567', 
  'aws.access_key_id' = 'AWS_ACCESS_KEY', 
  'aws.secret_access_key' = 'AWS_SECRET_ACCESS_KEY', 
  'databricks.cloud.s3.bucket' = 'mybucket', 
  'databricks.cloud.region' = 'AWS us-west-2'
);
```

#### Create a PostgreSQL store

```sql
CREATE STORE ps_store WITH (
  'type' = POSTGRESQL,
  'uris' = 'postgresql://mystore.com:5432/demo', 
  'postgres.username' = 'user',
  'postgres.password' = 'password'
);
```

#### Create a ClickHouse store

```sql
CREATE STORE ch WITH (
  'type' = CLICKHOUSE, 
  'uris' = 'jdbc:clickhouse://uri:8443',
  'clickhouse.username' = 'my_user',
  'clickhouse.password' = 'my_pass'
);
```

#### Create an Iceberg AWS Glue Catalog Store with IAM credentials

```sql
CREATE STORE iceberg_glue_store WITH (
  'type' = ICEBERG_GLUE,
  'uris' = 'https://mybucket.s3.amazonaws.com/', 
  'aws.iam_role_arn'='arn:aws:iam::123456789012:role/example-IAM-role',
  'aws.iam_external_id'='EXTERNAL ID',
  'aws.region'='us-east-1',
  'iceberg.warehouse.default_path'='s3://bucket/path',
  'iceberg.catalog.id'='mycatalog'
);
```

#### Create an Iceberg REST Catalog Store using Snowflake OpenCatalog (Apache Polaris)

```sql
CREATE STORE iceberg_rest_store WITH (
  'type' = ICEBERG_REST,
  'uris' = 'https://abcd-xyz.snowflakecomputing.com/polaris/api/catalog', 
  'iceberg.catalog.id' = 'my-catalog',
  'iceberg.rest.client_id' = 'CLIENT_ID', 
  'iceberg.rest.client_secret' = 'CLIENT_SECRET',
  'iceberg.rest.client_scope' = 'SCOPE'
);
```

#### Create an S3 store

```sql
CREATE STORE s3_store WITH (
  'type' = S3,
  'uris' = 'https://mybucket.s3.amazonaws.com/', 
  'aws.access_key_id'='AWS_ACCESS_KEY', 
  'aws.secret_access_key'='AWS_SECRET_ACCESS_KEY'
);
```

#### Create an S3 Store with IAM credentials

```sql
CREATE STORE s3_store WITH (
  'type' = S3,
  'uris' = 'https://mybucket.s3.amazonaws.com/', 
  'aws.iam_role_arn'='arn:aws:iam::123456789012:role/example-IAM-role',
  'aws.iam_external_id'='EXTERNAL ID'
);
```

**Notes**

Both `aws.iam_role_arn` and `aws.iam_external_id` are required when using IAM role-based authentication.
