UPDATE STORE

Syntax

UPDATE STORE
    store_name
WITH (store_parameter = value [, ...]);

Description

Updates a Store with new Store parameters.

Arguments

store_name

Name of the Store to update. For case-sensitive names, wrap the name in double quotes; otherwise, the lowercased name is used.

WITH (store_parameter = value [, ...])

This clause specifies Store parameters; see Store Parameters below for more information.

Store Parameters

type

Specifies the Store type.

Required: No

Type: STORE_TYPE

Valid values: KAFKA, KINESIS, or DATABRICKS.

access_region

Specifies the region of the Store. To improve latency and reduce data transfer costs, this should be the same cloud and region in which the physical Store is running.

uris

Comma-separated list of host:port URIs to connect to the Store.

Required: No

Type: String

tls.disabled

Specifies whether TLS is disabled when accessing the Store.

Required: No

Default value: TRUE

Type: Boolean

Valid values: TRUE or FALSE

tls.verify_server_hostname

Specifies if the server CNAME should be validated against the certificate.

Required: No

Default value: TRUE

Type: Boolean

Valid values: TRUE or FALSE

tls.ca_cert_file

Path to a CA certificate file in PEM format.

Required: No

Default value: Public CA chains

Type: String

tls.cipher_suites

Comma-separated list of cipher suites to use when establishing a TLS connection.

Required: No

Default value: []

Type: List

tls.protocols

Comma-separated list of TLS protocol versions to use when establishing a TLS connection.

Required: No

Default value: TLSv1.2,TLSv1.1,TLSv1

Type: List

Valid values: TLS protocols with version

schema_registry.name

Name of the Schema Registry to attach to the Store (see the Examples section below).

Required: No

properties.file

The file path to a YAML file containing other Store parameters.

Required: No

Default value: None

Type: String

Valid values: File path in the current user's filesystem
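As an illustration, several TLS parameters can be updated in a single statement. In the following sketch, the store name and certificate path are placeholders, and the exact literal syntax for Boolean values may differ from what is shown:

UPDATE STORE
    secure_store
WITH (
    'tls.disabled' = FALSE,
    'tls.verify_server_hostname' = TRUE,
    'tls.ca_cert_file' = "/etc/ssl/certs/ca.pem"
);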

Kafka Specific Parameters

Parameters to be used if type is KAFKA:

kafka.sasl.hash_function

SASL hash function to use when authenticating with Apache Kafka brokers.

Required: No

Default value: NONE

Type: HASH_FUNCTION

Valid values: NONE, PLAIN, SHA256, SHA512, and AWS_MSK_IAM

kafka.sasl.username

Username to use when authenticating with Apache Kafka brokers.

Required: No

Default value: None

Type: String

kafka.sasl.password

Password to use when authenticating with Apache Kafka brokers.

Required: No

Default value: None

Type: String

kafka.msk.aws_region

AWS region to use when authenticating with MSK.

Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM

Default value: None

Type: String

Example: us-east-1

kafka.msk.iam_role_arn

AWS IAM role ARN to use when authenticating with MSK.

Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM

Default value: None

Type: String

Example: arn:aws:iam::123456789012:role/example-IAM-role

tls.client.cert_file

Path to a client certificate file in PEM format.

Required: No

Default value: None

Type: String

tls.client.key_file

Path to the client key file in PEM format.

Required: No

Default value: None

Type: String
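For example, the following sketch updates a Kafka Store's SASL credentials. The store name, username, and password are placeholders, and the quoting of the hash function value may differ in practice:

UPDATE STORE
    kafka_store
WITH (
    'kafka.sasl.hash_function' = "SHA512",
    'kafka.sasl.username' = "alice",
    'kafka.sasl.password' = "new-secret"
);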

Kinesis Specific Parameters

Parameters to be used if type is KINESIS:

kinesis.iam_role_arn

AWS IAM role ARN to use when authenticating with an Amazon Kinesis service.

Required: Yes, unless authenticating with the Amazon Kinesis service using static AWS credentials

Default value: None

Type: String

Example: arn:aws:iam::123456789012:role/example-IAM-role

kinesis.access_key_id

AWS IAM access key to use when authenticating with an Amazon Kinesis service.

Required: No

Default value: None

Type: String

kinesis.secret_access_key

AWS IAM secret access key to use when authenticating with an Amazon Kinesis service.

Required: No

Default value: None

Type: String
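For example, to have a Kinesis Store authenticate via an IAM role rather than static credentials, a statement like the following sketch could be used. The store name is a placeholder, and the ARN is the illustrative value from above:

UPDATE STORE
    kinesis_store
WITH (
    'kinesis.iam_role_arn' = "arn:aws:iam::123456789012:role/example-IAM-role"
);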

Snowflake Specific Parameters

Snowflake Stores don't support this command.

Databricks Specific Parameters

Parameters to be used if type is DATABRICKS:

databricks.app_token

Databricks personal access token used when authenticating with a Databricks workspace.

Required: No

Default value: None

Type: String

aws.access_key_id

AWS access key ID used for writing data to S3.

Required: Yes

Default value: None

Type: String

aws.secret_access_key

AWS secret access key used for writing data to S3.

Required: Yes

Default value: None

Type: String
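As a sketch, rotating a Databricks Store's credentials might look like the following; the store name, token, and key values are all placeholders:

UPDATE STORE
    databricks_store
WITH (
    'databricks.app_token' = "example-app-token",
    'aws.access_key_id' = "example-access-key-id",
    'aws.secret_access_key' = "example-secret-access-key"
);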

Examples

Attach a Schema Registry to a Store

The following example updates the store named "demostore" to attach a Schema Registry named "ConfluentCloudSR".

demodb.public/demostore# UPDATE STORE
    demostore
WITH (
    'schema_registry.name' = "ConfluentCloudSR"
);
+------------+------------+------------+------------------------------------------+
|  Type      |  Name      |  Command   |  Summary                                 |
+============+============+============+==========================================+
| store      | demostore  | UPDATE     | store "demostore" was successfully       |
|            |            |            | updated                                  |
+------------+------------+------------+------------------------------------------+
