UPDATE STORE

```sql
UPDATE STORE store_name
WITH (store_parameter = value [, ...]);
```
`store_name` — Name of the Store to update. For case-sensitive names, wrap the name in double quotes; otherwise, the lowercased name is used.
| Parameter Name | Description |
| --- | --- |
| `type` | Specifies the Store type.<br>Required: No<br>Type: `STORE_TYPE`<br>Valid values: `KAFKA`, `KINESIS`, or `DATABRICKS` |
| `access_region` | Specifies the region of the Store. To improve latency and reduce data transfer costs, the region should be in the same cloud and region that the physical Store is running in. |
| `uris` | Comma-separated list of `host:port` URIs to connect to the Store.<br>Required: No<br>Type: String |
| `tls.disabled` | Specifies if the Store should be accessed over TLS.<br>Required: No<br>Default value: `TRUE`<br>Type: Boolean<br>Valid values: `TRUE` or `FALSE` |
| `tls.verify_server_hostname` | Specifies if the server CNAME should be validated against the certificate.<br>Required: No<br>Default value: `TRUE`<br>Type: Boolean<br>Valid values: `TRUE` or `FALSE` |
| `tls.ca_cert_file` | Path to a CA certificate file in PEM format.<br>Required: No<br>Default value: Public CA chains<br>Type: String |
| `tls.cipher_suites` | Comma-separated list of cipher suites to use when establishing a TLS connection.<br>Required: No<br>Default value: `[]`<br>Type: List |
| `tls.protocols` | Comma-separated list of TLS protocol versions to use when establishing a TLS connection.<br>Required: No<br>Default value: `TLSv1.2,TLSv1.1,TLSv1`<br>Type: List<br>Valid values: TLS protocols with version |
| `schema_registry.name` | Name of a Schema Registry to associate with the Store. A Schema Registry must first be created using the CREATE SCHEMA_REGISTRY DDL statement. Only one Schema Registry can be associated with a Store.<br>Required: No<br>Default value: None<br>Type: String<br>Valid values: See LIST SCHEMA_REGISTRIES |
| `properties.file` | File path to a YAML file containing other Store parameters.<br>Required: No<br>Default value: None<br>Type: String<br>Valid values: File path in the current user's filesystem |
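Several of the parameters above can be changed in one statement. The following is a hypothetical sketch (the Store name and broker URIs are invented for illustration; it assumes string values are quoted and Boolean values are not, as in the example at the end of this page):

```sql
-- Hypothetical example: repoint an existing Store at new brokers
-- and skip server hostname verification. The Store name and URIs
-- are placeholders, not values from this reference.
UPDATE STORE my_store
WITH (
  'uris' = 'broker1.example.com:9092,broker2.example.com:9092',
  'tls.verify_server_hostname' = FALSE
);
```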
Parameters to be used if `type` is `KAFKA`:

| Parameter Name | Description |
| --- | --- |
| `kafka.sasl.hash_function` | SASL hash function to use when authenticating with Apache Kafka brokers.<br>Required: No<br>Default value: `NONE`<br>Type: `HASH_FUNCTION`<br>Valid values: `NONE`, `PLAIN`, `SHA256`, and `SHA512` |
| `kafka.sasl.username` | Username to use when authenticating with Apache Kafka brokers.<br>Required: No<br>Default value: None<br>Type: String |
| `kafka.sasl.password` | Password to use when authenticating with Apache Kafka brokers.<br>Required: No<br>Default value: None<br>Type: String |
| `tls.client.cert_file` | Path to a client certificate file in PEM format.<br>Required: No<br>Default value: None<br>Type: String |
| `tls.client.key_file` | Path to the client key file in PEM format.<br>Required: No<br>Default value: None<br>Type: String |
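For instance, the SASL parameters above could be used together to rotate the credentials on a Kafka Store. This is a hypothetical sketch; the Store name and credential values are placeholders, and it assumes the `HASH_FUNCTION` value is written unquoted:

```sql
-- Hypothetical example: rotate SASL credentials on a Kafka Store.
-- Store name, username, and password are placeholders.
UPDATE STORE kafka_store
WITH (
  'kafka.sasl.hash_function' = SHA512,
  'kafka.sasl.username' = 'new_user',
  'kafka.sasl.password' = 'new_password'
);
```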
Parameters to be used if `type` is `KINESIS`:

| Parameter Name | Description |
| --- | --- |
| `kinesis.access_key_id` | AWS IAM access key to use when authenticating with an Amazon Kinesis service.<br>Required: No<br>Default value: None<br>Type: String |
| `kinesis.secret_access_key` | AWS IAM secret access key to use when authenticating with an Amazon Kinesis service.<br>Required: No<br>Default value: None<br>Type: String |
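These two parameters would typically be updated together when rotating AWS IAM credentials. A hypothetical sketch, with a made-up Store name and placeholder key values:

```sql
-- Hypothetical example: update AWS IAM credentials on a Kinesis Store.
-- Store name and both key values are placeholders.
UPDATE STORE kinesis_store
WITH (
  'kinesis.access_key_id' = 'EXAMPLE_ACCESS_KEY_ID',
  'kinesis.secret_access_key' = 'EXAMPLE_SECRET_ACCESS_KEY'
);
```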
Snowflake Stores don't support this command.
Parameters to be used if `type` is `DATABRICKS`:

| Parameter Name | Description |
| --- | --- |
| `databricks.app_token` | Databricks personal access token used when authenticating with a Databricks workspace.<br>Required: No<br>Default value: None<br>Type: String |
| `aws.access_key_id` | AWS access key ID used for writing data to S3.<br>Required: Yes<br>Default value: None<br>Type: String |
| `aws.secret_access_key` | AWS secret access key used for writing data to S3.<br>Required: Yes<br>Default value: None<br>Type: String |
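Since both AWS keys are marked Required above, an update to a Databricks Store would supply both. A hypothetical sketch with placeholder values:

```sql
-- Hypothetical example: update the S3 credentials on a Databricks Store.
-- Store name and both key values are placeholders; both AWS keys are
-- supplied because the reference marks them as required.
UPDATE STORE databricks_store
WITH (
  'aws.access_key_id' = 'EXAMPLE_ACCESS_KEY_ID',
  'aws.secret_access_key' = 'EXAMPLE_SECRET_ACCESS_KEY'
);
```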
The following example updates the Store named `kafka_store` to attach a Schema Registry named `SR`:

```sql
UPDATE STORE kafka_store
WITH (
  'schema_registry.name' = 'SR'
);
```