UPDATE STORE
Syntax
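A sketch of the statement's general form, inferred from the Arguments section below:

```sql
UPDATE STORE store_name
WITH (store_parameter = value [, ...]);
```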
Description
Updates a store with new store parameters.
Arguments
store_name
Name of the store to update. If the name is case-sensitive, you must wrap it in double quotes; otherwise the system uses the lowercase name.
WITH (store_parameter = value [, … ])
This clause specifies the store parameters to update; see Store Parameters below for more information.
Store Parameters
type
Specifies the store type.
Required: No
Type: STORE_TYPE
Valid values: KAFKA or KINESIS
access_region
Specifies the region of the store. To improve latency and reduce data transfer costs, this should match the cloud and region in which the physical store is running.
Required: No
Type: String
Valid values: See LIST REGIONS
uris
List of comma-separated host:port URIs to connect to the store.
Required: No
Type: String
tls.disabled
Specifies whether TLS is disabled when accessing the store.
Required: No
Default value: TRUE
Type: Boolean
Valid values: TRUE or FALSE
tls.verify_server_hostname
Specifies if the server hostname (CNAME) should be validated against the certificate.
Required: No
Default value: TRUE
Type: Boolean
Valid values: TRUE or FALSE
tls.ca_cert_file
Path to a CA certificate file in PEM format.
Required: No
Default value: Public CA chains
Type: String
tls.cipher_suites
Comma-separated list of cipher suites to use when establishing a TLS connection.
Required: No
Default value: []
Type: List
tls.protocols
Comma-separated list of TLS protocol versions to use when establishing a TLS connection.
Required: No
Default value: TLSv1.2,TLSv1.1,TLSv1
Type: List
Valid values: TLS protocol versions (for example, TLSv1.2)
schema_registry.name
Name of a schema registry to associate with the store. A schema registry must first be created using the CREATE SCHEMA_REGISTRY DDL statement. Only one schema registry can be associated with a store.
Required: No
Default value: None
Type: String
Valid values: See LIST SCHEMA_REGISTRIES
properties.file
The file path to a .yaml file containing other store parameters.
Required: No
Default value: None
Type: String
Valid values: File path in the current user's filesystem
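A minimal sketch of such a file, assuming the parameter names on this page map directly to YAML keys (the host names and values are illustrative, not from the original):

```yaml
# Hypothetical properties file; keys mirror the store parameters above.
uris: "broker1.example.com:9092,broker2.example.com:9092"
tls.disabled: false
tls.verify_server_hostname: true
```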
Kafka Specific Parameters
Parameters to be used if type is KAFKA:
kafka.sasl.hash_function
SASL hash function to use when authenticating with Apache Kafka brokers.
Required: No
Default value: NONE
Type: HASH_FUNCTION
Valid values: NONE, PLAIN, SHA256, SHA512, and AWS_MSK_IAM
kafka.sasl.username
Username to use when authenticating with Apache Kafka brokers.
Required: No
Default value: None
Type: String
kafka.sasl.password
Password to use when authenticating with Apache Kafka brokers.
Required: No
Default value: None
Type: String
kafka.msk.aws_region
AWS region to use when authenticating with MSK.
Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM
Default value: None
Type: String
Example: us-east-1
kafka.msk.iam_role_arn
AWS IAM role ARN to use when authenticating with MSK.
Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM
Default value: None
Type: String
Example: arn:aws:iam::123456789012:role/example-IAM-role
tls.client.cert_file
Path to a client certificate file in PEM format.
Required: No
Default value: None
Type: String
tls.client.key_file
Path to the client key file in PEM format.
Required: No
Default value: None
Type: String
Kinesis Specific Parameters
Parameters to be used if type is KINESIS:
kinesis.iam_role_arn
AWS IAM role ARN to use when authenticating with an Amazon Kinesis service.
Required: Yes, unless authenticating with the Amazon Kinesis Service using static AWS credentials.
Default value: None
Type: String
Example: arn:aws:iam::123456789012:role/example-IAM-role
kinesis.access_key_id
AWS IAM access key to use when authenticating with an Amazon Kinesis service.
Required: No
Default value: None
Type: String
kinesis.secret_access_key
AWS IAM secret access key to use when authenticating with an Amazon Kinesis service.
Required: No
Default value: None
Type: String
Snowflake-Specific Parameters
Snowflake stores don't support this command.
Databricks-Specific Parameters
Parameters to be used if type is DATABRICKS:
databricks.app_token
Databricks personal access token used when authenticating with a Databricks workspace.
Required: No
Default value: None
Type: String
aws.access_key_id
AWS access key ID used for writing data to S3.
Required: Yes
Default value: None
Type: String
aws.secret_access_key
AWS secret access key used for writing data to S3.
Required: Yes
Default value: None
Type: String
Examples
Attach a schema registry to a store
The following example updates the store named "demostore" to attach a schema registry named "ConfluentCloudSR".
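A sketch of that statement, assuming the WITH-clause form described above; the store name "demostore" and registry name "ConfluentCloudSR" come from the example, and the quoting of the parameter name is an assumption:

```sql
UPDATE STORE "demostore"
WITH ('schema_registry.name' = 'ConfluentCloudSR');
```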