UPDATE STORE
Last updated
Updates an existing store with new store parameters.
Name of the store to update. If the name is case-sensitive, wrap it in double quotes; otherwise the system uses the lowercase name.
This clause specifies store parameters; see below for more information.
type
Specifies the store type.
Required: No
Type: STORE_TYPE
Valid values: KAFKA or KINESIS
access_region
Specifies the region of the store. To improve latency and reduce data transfer costs, this should be the same cloud and region in which the physical store is running.
uris
Comma-separated list of host:port URIs to connect to the store.
Required: No
Type: String
tls.disabled
Specifies whether the store should be accessed over TLS.
Required: No
Default value: FALSE
Type: Boolean
Valid values: TRUE or FALSE
tls.verify_server_hostname
Specifies whether the server CNAME should be validated against the certificate.
Required: No
Default value: TRUE
Type: Boolean
Valid values: TRUE or FALSE
tls.ca_cert_file
Path to a CA certificate file in PEM format.
Required: No
Default value: Public CA chains
Type: String
tls.cipher_suites
Comma-separated list of cipher suites to use when establishing a TLS connection.
Required: No
Default value: []
Type: List
tls.protocols
Comma-separated list of TLS protocol versions to use while establishing a TLS connection.
Required: No
Default value: TLSv1.2,TLSv1.1,TLSv1
Type: List
Valid values: TLS protocol versions (e.g., TLSv1.2)
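Several TLS parameters are often changed together. The following is a hedged sketch, assuming parameter updates are passed as key/value pairs in a WITH clause; the store name mystore and all values are placeholders, and exact quoting rules may differ in your deployment:

```sql
-- Sketch: tighten TLS settings on a store named "mystore" (hypothetical name).
UPDATE STORE mystore WITH (
  'tls.disabled' = FALSE,
  'tls.verify_server_hostname' = TRUE,
  'tls.protocols' = 'TLSv1.2'
);
```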
schema_registry.name
Name of a schema registry to associate with the store. A schema registry must first be created using the corresponding DDL statement. Only one schema registry can be associated with a store.
Required: No
Default value: None
Type: String
properties.file
The file path to a .yaml file containing other store parameters.
Required: No
Default value: None
Type: String
Valid values: File path in the current user's filesystem
Parameters to be used if type is CLICKHOUSE:
clickhouse.username
Username to connect to the database instance specified with the store's uris parameter.
Required: Yes
Default value: None
Type: String
clickhouse.password
Password to connect to the database instance with the username given by the store's clickhouse.username parameter.
Required: Yes
Default value: None
Type: String
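As a hedged sketch of updating both ClickHouse credentials at once (store name and values are placeholders; exact WITH-clause syntax may vary):

```sql
-- Sketch: rotate ClickHouse credentials on a store (hypothetical names/values).
UPDATE STORE clickhouse_store WITH (
  'clickhouse.username' = 'analytics_user',
  'clickhouse.password' = 'new-secret'
);
```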
Parameters to be used if type is DATABRICKS:
databricks.app_token
Databricks personal access token used when authenticating with a Databricks workspace.
Required: No
Default value: None
Type: String
aws.access_key_id
AWS access key ID used for writing data to S3.
Required: Yes
Default value: None
Type: String
aws.secret_access_key
AWS secret access key used for writing data to S3.
Required: Yes
Default value: None
Type: String
aws.iam_role_arn
AWS IAM role ARN to use when authenticating with S3.
Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.
Default value: None
Type: String
aws.iam_external_id
IAM External ID.
Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.
Default value: None
Type: String
aws.access_key_id
AWS access key ID to use when authenticating with S3.
Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.
Default value: None
Type: String
aws.secret_access_key
AWS secret access key to use when authenticating with S3.
Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.
Default value: None
Type: String
aws.region
AWS region in which the Glue catalog resides.
Required: No
Default value: None
Type: String
iceberg.warehouse.default_path
Iceberg default warehouse path.
Required: No
Default value: None
Type: String
iceberg.catalog.id
Iceberg catalog ID.
Required: No
Default value: None
Type: String
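A Databricks store update might refresh the workspace token and the paired S3 credentials together. This is a hedged sketch with placeholder names and values, assuming WITH-clause syntax:

```sql
-- Sketch: refresh Databricks token and S3 write credentials (placeholders).
-- aws.access_key_id and aws.secret_access_key must be updated as a pair.
UPDATE STORE databricks_store WITH (
  'databricks.app_token' = 'dapi-xxxxxxxx',
  'aws.access_key_id' = 'AKIAEXAMPLE',
  'aws.secret_access_key' = 'example-secret'
);
```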
Parameters to be used if type is KAFKA:
kafka.sasl.hash_function
SASL hash function to use when authenticating with Apache Kafka brokers.
Required: No
Default value: NONE
Type: HASH_FUNCTION
Valid values: NONE, PLAIN, SHA256, SHA512, and AWS_MSK_IAM
kafka.sasl.username
Username to use when authenticating with Apache Kafka brokers.
Required: No
Default value: None
Type: String
kafka.sasl.password
Password to use when authenticating with Apache Kafka brokers.
Required: No
Default value: None
Type: String
kafka.msk.aws_region
AWS region to use when authenticating with MSK.
Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM
Default value: None
Type: String
Example: us-east-1
kafka.msk.iam_role_arn
AWS IAM role ARN to use when authenticating with MSK.
Required: Yes, if kafka.sasl.hash_function is AWS_MSK_IAM
Default value: None
Type: String
Example: arn:aws:iam::123456789012:role/example-IAM-role
tls.client.cert_file
Path to a client certificate file in PEM format.
Required: No
Default value: None
Type: String
tls.client.key_file
Path to the client key file in PEM format.
Required: No
Default value: None
Type: String
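Tying the Kafka parameters together, a credential rotation could look like the following hedged sketch (store name and values are placeholders; exact WITH-clause syntax may vary):

```sql
-- Sketch: switch a Kafka store to SASL/SCRAM SHA512 auth (placeholders).
UPDATE STORE kafka_store WITH (
  'kafka.sasl.hash_function' = SHA512,
  'kafka.sasl.username' = 'svc-user',
  'kafka.sasl.password' = 'svc-password'
);
```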
Parameters to be used if type is KINESIS:
kinesis.iam_role_arn
AWS IAM role ARN to use when authenticating with an Amazon Kinesis service.
Required: Yes, unless authenticating with the Amazon Kinesis Service using static AWS credentials.
Default value: None
Type: String
Example: arn:aws:iam::123456789012:role/example-IAM-role
kinesis.access_key_id
AWS IAM access key to use when authenticating with an Amazon Kinesis service.
Required: No
Default value: None
Type: String
kinesis.secret_access_key
AWS IAM secret access key to use when authenticating with an Amazon Kinesis service.
Required: No
Default value: None
Type: String
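For example, moving a Kinesis store from static credentials to role-based authentication might look like this hedged sketch (store name is a placeholder; the ARN reuses the example format above):

```sql
-- Sketch: switch a Kinesis store to an IAM role (hypothetical store name).
UPDATE STORE kinesis_store WITH (
  'kinesis.iam_role_arn' = 'arn:aws:iam::123456789012:role/example-IAM-role'
);
```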
Parameters to be used if type is S3:
aws.iam_role_arn
AWS IAM role ARN to use when authenticating with S3.
Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.
Default value: None
Type: String
aws.iam_external_id
IAM External ID.
Required: Yes, if aws.iam_role_arn is specified.
Default value: None
Type: String
aws.access_key_id
AWS access key ID to use when authenticating with S3.
Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.
Default value: None
Type: String
aws.secret_access_key
AWS secret access key to use when authenticating with S3.
Required: No. If updating aws.access_key_id or aws.secret_access_key, both must be specified.
Default value: None
Type: String
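Since aws.iam_external_id is required once aws.iam_role_arn is set, the two are updated together. A hedged sketch with placeholder names and values, assuming WITH-clause syntax:

```sql
-- Sketch: point an S3 store at an IAM role plus external ID (placeholders).
UPDATE STORE s3_store WITH (
  'aws.iam_role_arn' = 'arn:aws:iam::123456789012:role/example-IAM-role',
  'aws.iam_external_id' = 'example-external-id'
);
```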
Snowflake stores don't support this command.
The following example updates the store named "demostore" to attach a schema registry named "ConfluentCloudSR".
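A statement along the following lines would do this; the WITH-clause syntax is a sketch and may differ slightly in your deployment:

```sql
-- Attach the schema registry "ConfluentCloudSR" to the store "demostore".
UPDATE STORE "demostore" WITH (
  'schema_registry.name' = 'ConfluentCloudSR'
);
```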