UPDATE STORE

Syntax

UPDATE STORE
    store_name
WITH (store_parameter = value [, ...]);

Description

Updates a store with new store parameters.

Arguments

store_name

Name of the store to update. If the name is case sensitive, you must wrap it in double quotes; otherwise the system uses the lowercase name (see the example below).

WITH (store_parameter = value [, ...])

This clause specifies store parameters; see below for more information.
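
For instance, assuming a hypothetical case-sensitive store named "MyStore" and a hypothetical broker URI, quoting the identifier preserves its case:

UPDATE STORE
    "MyStore"
WITH ('uris' = 'broker-1.example.com:9092');

Without the double quotes, the statement would resolve the name as mystore.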

Store Parameters

Parameter Name
Description

type

Specifies the store type.

Required: No

Type: STORE_TYPE

Valid values: KAFKA, KINESIS, or DATABRICKS (see the type-specific parameter sections below; Snowflake stores don't support this command).

access_region

Specifies the region of the store. To improve latency and reduce data transfer costs, the region should match the cloud and region in which the physical store is running.

Required: No Type: String Valid values: See LIST REGIONS

uris

List of comma-separated host:port URIs to connect to the store.

Required: No Type: String

tls.disabled

Specifies if the store should be accessed over TLS. Required: No Default value: TRUE Type: Boolean Valid values: TRUE or FALSE

tls.verify_server_hostname

Specifies if the server CNAME should be validated against the certificate. Required: No Default value: TRUE Type: Boolean Valid values: TRUE or FALSE

tls.ca_cert_file

Path to a CA certificate file in PEM format. Required: No Default value: Public CA chains. Type: String

tls.cipher_suites

Comma-separated list of cipher suites to use when establishing a TLS connection. Required: No Default value: [] Type: List

tls.protocols

Comma-separated list of TLS protocol versions to use while establishing a TLS connection. Required: No Default value: TLSv1.2,TLSv1.1,TLSv1 Type: List Valid values: TLS protocols with version

schema_registry.name

Name of a schema registry to associate with the store. A schema registry must first be created using the CREATE SCHEMA_REGISTRY DDL statement. Only one schema registry can be associated with a store. Required: No Default value: None Type: String Valid values: See LIST SCHEMA_REGISTRIES

properties.file

The file path to a .yaml file containing other store parameters. Required: No Default value: None Type: String Valid values: File path in current user's filesystem
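
As a sketch, several of the store parameters above can be updated in one statement. The store name and broker URIs below are hypothetical, and the parameter value syntax follows the same conventions as CREATE STORE:

UPDATE STORE
    demostore
WITH (
    'uris' = 'broker-1.example.com:9092,broker-2.example.com:9092',
    'tls.verify_server_hostname' = TRUE
);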

Kafka-Specific Parameters

Parameters to be used if type is KAFKA:

Parameter Name
Description

kafka.sasl.hash_function

SASL hash function to use when authenticating with Apache Kafka brokers. Required: No Default value: NONE Type: HASH_FUNCTION Valid values: NONE, PLAIN, SHA256, SHA512, and AWS_MSK_IAM

kafka.sasl.username

Username to use when authenticating with Apache Kafka brokers. Required: No Default value: None Type: String

kafka.sasl.password

Password to use when authenticating with Apache Kafka brokers. Required: No Default value: None Type: String

kafka.msk.aws_region

AWS region to use when authenticating with MSK.

Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM Default value: None Type: String Example: us-east-1

kafka.msk.iam_role_arn

AWS IAM role ARN to use when authenticating with MSK.

Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM Default value: None Type: String Example: arn:aws:iam::123456789012:role/example-IAM-role

tls.client.cert_file

Path to a client certificate file in PEM format. Required: No Default value: None Type: String

tls.client.key_file

Path to the client key file in PEM format. Required: No Default value: None Type: String
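
For example, rotating the SASL credentials of a hypothetical Kafka store named kafka_store (the username and password values are placeholders) might look like:

UPDATE STORE
    kafka_store
WITH (
    'kafka.sasl.hash_function' = SHA512,
    'kafka.sasl.username' = 'new_username',
    'kafka.sasl.password' = 'new_password'
);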

Kinesis-Specific Parameters

Parameters to be used if type is KINESIS:

Parameter Name
Description

kinesis.iam_role_arn

AWS IAM role ARN to use when authenticating with an Amazon Kinesis service.

Required: Yes, unless authenticating with the Amazon Kinesis Service using static AWS credentials. Default value: None Type: String Example: arn:aws:iam::123456789012:role/example-IAM-role

kinesis.access_key_id

AWS IAM access key to use when authenticating with an Amazon Kinesis service. Required: No Default value: None Type: String

kinesis.secret_access_key

AWS IAM secret access key to use when authenticating with an Amazon Kinesis service. Required: No Default value: None Type: String
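
As a sketch, switching a hypothetical Kinesis store named kinesis_store to IAM role authentication (the role ARN mirrors the example above) might look like:

UPDATE STORE
    kinesis_store
WITH (
    'kinesis.iam_role_arn' = 'arn:aws:iam::123456789012:role/example-IAM-role'
);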

Snowflake-Specific Parameters

Snowflake stores don't support this command.

Databricks-Specific Parameters

Parameters to be used if type is DATABRICKS:

Parameter Name
Description

databricks.app_token

Databricks personal access token used when authenticating with a Databricks workspace. Required: No Default value: None Type: String

aws.access_key_id

AWS access key ID used for writing data to S3.

Required: Yes Default value: None Type: String

aws.secret_access_key

AWS secret access key used for writing data to S3.

Required: Yes Default value: None Type: String
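
For instance, rotating the access token and S3 credentials of a hypothetical Databricks store named databricks_store (all values are placeholders) might look like:

UPDATE STORE
    databricks_store
WITH (
    'databricks.app_token' = '<personal-access-token>',
    'aws.access_key_id' = '<access-key-id>',
    'aws.secret_access_key' = '<secret-access-key>'
);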

Examples

Attach a schema registry to a store

The following example updates the store named "demostore" to attach a schema registry named "ConfluentCloudSR".

demodb.public/demostore# UPDATE STORE
    demostore
WITH (
    'schema_registry.name' = "ConfluentCloudSR"
);
+------------+------------+------------+------------------------------------------+
|  Type      |  Name      |  Command   |  Summary                                 |
+============+============+============+==========================================+
| store      | demostore  | UPDATE     | store "demostore" was successfully       |
|            |            |            | updated                                  |
+------------+------------+------------+------------------------------------------+
