UPDATE STORE

Syntax

UPDATE STORE
    store_name
WITH (store_parameter = value [, ...]);

Description

Updates a store with new store parameters.

Arguments

store_name

Name of the store to update. If the name is case sensitive, you must wrap it in double quotes; otherwise the system uses the lowercase version of the name.
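
For example, updating a store created with a case-sensitive name; the store name and password value here are hypothetical:

-- Hypothetical case-sensitive store name and credential value.
UPDATE STORE
    "MyKafkaStore"
WITH ('kafka.sasl.password' = 'new_password');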

WITH (store_parameter = value [, …​ ])

This clause specifies store parameters; see below for more information.

Store Parameters

Parameter Name
Description

type

Specifies the store type.

Required: No

Type: STORE_TYPE

Valid values: KAFKA, KINESIS, CLICKHOUSE, DATABRICKS, or S3 (see the type-specific parameter sections below).

access_region

Specifies the region of the store. To improve latency and reduce data transfer costs, the region should match the cloud and region in which the physical store is running. Required: No Type: String Valid values: See LIST REGIONS

uris

List of comma-separated host:port URIs to connect to the store.

Required: No Type: String

tls.disabled

Specifies whether TLS is disabled when accessing the store. Required: No Default value: FALSE

Type: Boolean

Valid values: TRUE or FALSE

tls.verify_server_hostname

Specifies if the server CNAME should be validated against the certificate. Required: No Default value: TRUE Type: Boolean Valid values: TRUE or FALSE

tls.ca_cert_file

Path to a CA certificate file in PEM format. Required: No Default value: Public CA chains. Type: String

tls.cipher_suites

Comma-separated list of cipher suites to use when establishing a TLS connection. Required: No Default value: [] Type: List

tls.protocols

Comma-separated list of TLS protocol versions to use when establishing a TLS connection. Required: No Default value: TLSv1.2,TLSv1.1,TLSv1 Type: List Valid values: TLS protocols with version

schema_registry.name

Name of a schema registry to associate with the store. A schema registry must first be created using the CREATE SCHEMA_REGISTRY DDL statement. Only one schema registry can be associated with a store. Required: No Default value: None Type: String Valid values: See LIST SCHEMA_REGISTRIES

properties.file

The file path to a .yaml file containing other store parameters. Required: No Default value: None Type: String Valid values: File path in current user's filesystem
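
For illustration, the following statement updates a store's broker URIs and TLS settings in a single call; the store name, hosts, and certificate path are hypothetical:

-- Hypothetical store name, broker hosts, and certificate path.
UPDATE STORE
    my_store
WITH (
    'uris' = 'broker1.example.com:9092,broker2.example.com:9092',
    'tls.verify_server_hostname' = TRUE,
    'tls.ca_cert_file' = '/path/to/ca_cert.pem'
);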

ClickHouse-Specific Parameters

Parameters to be used if type is CLICKHOUSE:

Parameter Name
Description

clickhouse.username

Username to connect to the database instance specified with the store's uris parameter. Required: Yes Default value: None Type: String

clickhouse.password

Password for the user specified in the clickhouse.username parameter. Required: Yes

Default value: None Type: String
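
As a sketch, rotating the credentials on a ClickHouse store; the store name and credential values are hypothetical:

-- Hypothetical store name and credential values.
UPDATE STORE
    my_clickhouse_store
WITH (
    'clickhouse.username' = 'new_user',
    'clickhouse.password' = 'new_password'
);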

Databricks-Specific Parameters

Parameters to be used if type is DATABRICKS:

Parameter Name
Description

databricks.app_token

Databricks personal access token used when authenticating with a Databricks workspace. Required: No Default value: None Type: String

aws.access_key_id

AWS access key ID used for writing data to S3. Required: Yes

Default value: None Type: String

aws.secret_access_key

AWS secret access key used for writing data to S3.

Required: Yes Default value: None Type: String
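
For example, rotating the Databricks token along with the S3 credentials used for writes; all names and values here are hypothetical. Because the table above marks both AWS keys as required, both appear in the statement:

-- Hypothetical store name, token, and AWS credential values.
UPDATE STORE
    my_databricks_store
WITH (
    'databricks.app_token' = 'dapi0123456789abcdef',
    'aws.access_key_id' = 'AKIAEXAMPLEKEYID',
    'aws.secret_access_key' = 'exampleSecretAccessKey'
);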

Iceberg AWS Glue Catalog-Specific Parameters

Parameter Name
Description

aws.iam_role_arn

AWS IAM role ARN to use when authenticating with S3. Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified. Default value: None Type: String

aws.iam_external_id

IAM External ID. Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified. Default value: None Type: String

aws.access_key_id

AWS access key ID to use when authenticating with S3. Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified. Default value: None Type: String

aws.secret_access_key

AWS secret access key to use when authenticating with S3. Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified. Default value: None Type: String

aws.region

AWS region in which the Glue catalog resides. Required: No

Default value: None Type: String

iceberg.warehouse.default_path

Iceberg default warehouse path. Required: No

Default value: None Type: String

iceberg.catalog.id

Iceberg catalog ID. Required: No

Default value: None Type: String
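
As a sketch, switching a Glue-backed Iceberg store to IAM role authentication; the role ARN mirrors the example format used elsewhere on this page, and the external ID and region values are hypothetical:

-- Hypothetical store name, role ARN, external ID, and region.
UPDATE STORE
    my_iceberg_store
WITH (
    'aws.iam_role_arn' = 'arn:aws:iam::123456789012:role/example-IAM-role',
    'aws.iam_external_id' = 'example-external-id',
    'aws.region' = 'us-east-1'
);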

Kafka-Specific Parameters

Parameters to be used if type is KAFKA:

Parameter Name
Description

kafka.sasl.hash_function

SASL hash function to use when authenticating with Apache Kafka brokers. Required: No Default value: NONE Type: HASH_FUNCTION Valid values: NONE, PLAIN, SHA256, SHA512, and AWS_MSK_IAM

kafka.sasl.username

Username to use when authenticating with Apache Kafka brokers. Required: No Default value: None Type: String

kafka.sasl.password

Password to use when authenticating with Apache Kafka brokers. Required: No Default value: None Type: String

kafka.msk.aws_region

AWS region to use when authenticating with MSK.

Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM Default value: None Type: String Example: us-east-1

kafka.msk.iam_role_arn

AWS IAM role ARN to use when authenticating with MSK.

Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM Default value: None Type: String Example: arn:aws:iam::123456789012:role/example-IAM-role

tls.client.cert_file

Path to a client certificate file in PEM format. Required: No Default value: None Type: String

tls.client.key_file

Path to the client key file in PEM format. Required: No Default value: None Type: String
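
For example, rotating SASL credentials on a Kafka store; the store name and credential values are hypothetical. An MSK store using AWS_MSK_IAM would instead update kafka.msk.aws_region and kafka.msk.iam_role_arn:

-- Hypothetical store name and SASL credential values.
UPDATE STORE
    my_kafka_store
WITH (
    'kafka.sasl.hash_function' = SHA512,
    'kafka.sasl.username' = 'new_user',
    'kafka.sasl.password' = 'new_password'
);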

Kinesis-Specific Parameters

Parameters to be used if type is KINESIS:

Parameter Name
Description

kinesis.iam_role_arn

AWS IAM role ARN to use when authenticating with an Amazon Kinesis service.

Required: Yes, unless authenticating with the Amazon Kinesis Service using static AWS credentials. Default value: None Type: String Example: arn:aws:iam::123456789012:role/example-IAM-role

kinesis.access_key_id

AWS IAM access key to use when authenticating with an Amazon Kinesis service. Required: No Default value: None Type: String

kinesis.secret_access_key

AWS IAM secret access key to use when authenticating with an Amazon Kinesis service. Required: No Default value: None Type: String
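
For example, pointing a Kinesis store at a different IAM role; the store name is hypothetical and the ARN mirrors the example format above:

-- Hypothetical store name and role ARN.
UPDATE STORE
    my_kinesis_store
WITH (
    'kinesis.iam_role_arn' = 'arn:aws:iam::123456789012:role/example-IAM-role'
);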

S3-Specific Parameters

Parameters to be used if type is S3:

Parameter Name
Description

aws.iam_role_arn

AWS IAM role ARN to use when authenticating with S3. Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.

Default value: None Type: String

aws.iam_external_id

IAM External ID. Required: Yes, if aws.iam_role_arn is specified.

Default value: None Type: String

aws.access_key_id

AWS access key ID to use when authenticating with S3. Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.

Default value: None Type: String

aws.secret_access_key

AWS secret access key to use when authenticating with S3. Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.

Default value: None Type: String
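
As a sketch, rotating static credentials on an S3 store; because updating either key requires specifying both, the two parameters appear together. The store name and credential values are hypothetical:

-- Hypothetical store name and AWS credential values.
UPDATE STORE
    my_s3_store
WITH (
    'aws.access_key_id' = 'AKIAEXAMPLEKEYID',
    'aws.secret_access_key' = 'exampleSecretAccessKey'
);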

Snowflake-Specific Parameters

Snowflake stores don't support this command.

Examples

Attach a schema registry to a store

The following example updates the store named "demostore" to attach a schema registry named "ConfluentCloudSR".

demodb.public/demostore# UPDATE STORE
    demostore
WITH (
    'schema_registry.name' = "ConfluentCloudSR"
);
+------------+------------+------------+------------------------------------------+
|  Type      |  Name      |  Command   |  Summary                                 |
+============+============+============+==========================================+
| store      | demostore  | UPDATE     | store "demostore" was successfully       |
|            |            |            | updated                                  |
+------------+------------+------------+------------------------------------------+
