CREATE STREAM
Syntax
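A sketch of the general shape of the statement, assembled from the arguments described below (the full grammar may allow more options than shown here):

```sql
CREATE STREAM stream_name (
  column_name data_type [NOT NULL] [, ...]
) WITH (stream_parameter = value [, ...]);
```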
Description
A Stream is a sequence of immutable, partitioned, and partially ordered events (we use events and records synonymously). A Stream is a relational representation of data in a streaming Store, such as the data in a Kafka topic or a Kinesis stream. The records in a Stream are independent of each other, meaning there is no correlation between two records in a Stream. A Stream declares the schema of the records, which includes the column name along with the column type and optional constraints. A Stream is a type of Relation. Each Relation belongs to a Schema in a Database, so the fully qualified name of the Relation is <database>.<schema>.<relation>.
Arguments
stream_name
Specifies the name of the new Stream. For case-sensitive names, the name must be wrapped in double quotes; otherwise, the lowercased name is used.
column_name
The name of a column to be created in the new Stream. For case-sensitive names, the name must be wrapped in double quotes; otherwise, the lowercased name is used.
data_type
The data type of the column. This can include array specifiers. For more information on the data types supported by DeltaStream, refer to the Data Types reference.
NOT NULL
Defines a constraint on the column, ensuring it cannot contain NULL values.
WITH (stream_parameter = value [, … ])
Optionally, this clause specifies Stream Parameters.
Stream Parameters
Parameter Name | Description |
---|---|
`topic` | Name of the Entity that has the data for this Stream. If the Entity doesn't exist, an Entity with this name is created in the corresponding Store. Required: No Default value: The Stream's name Type: String |
`store` | Name of the Store that hosts the Entity for this Stream. Required: No Default value: Current session's store name Type: String Valid values: See LIST STORES. |
`value.format` | Format of the message value in the Entity. See Data Formats (Serialization) for more information regarding serialization formats. Required: Yes Type: String Valid values: See Data Formats (Serialization). |
`timestamp` | Name of the column in the Stream to use as the timestamp. If not set, the timestamp of the message is used for time-based operations such as window aggregations and joins. If the type of this timestamp field is BIGINT, the values are expected to be epoch milliseconds. Required: No Default value: The message timestamp Type: String |
`timestamp.format` | The format to use for timestamp columns. Required: No Type: String |
Kafka Specific Parameters
Parameters to be used if the associated Store is of type KAFKA:
Parameter Name | Description |
---|---|
`topic.partitions` | The number of partitions to use when creating the Entity, if applicable. If the topic already exists, then this value must be equal to the number of partitions in the existing Kafka Entity. Required: Yes, unless topic already exists Default value: Leftmost source Relation topic's partition count Type: Integer Valid values: [1, ...] |
`topic.replicas` | The number of replicas to use when creating the topic, if applicable. If the topic already exists, then this value must be equal to the number of replicas in the existing Kafka Entity. Required: Yes, unless topic already exists Default value: Leftmost source Relation topic's replica count Type: Integer Valid values: [1, ...] |
`key.format` | Format of the message key in the Entity. This value can be the same as or different from the `value.format`. Required: No, unless `key.type` is provided Type: String Valid values: See Data Formats (Serialization). |
`key.type` | Declares the type of the message key, e.g. `STRUCT<pageid VARCHAR>`. Required: No, unless `key.format` is provided Type: String |
`delivery.guarantee` | The fault tolerance guarantees applied when producing to this Stream. Required: No Default value: at_least_once Type: String |
Kinesis Specific Parameters
Parameters to be used if the associated Store is of type KINESIS:
Parameter Name | Description |
---|---|
`topic.shards` | The number of shards to use when creating the Entity, if applicable. If the Entity already exists, then this value must be equal to the number of shards in the existing Kinesis Data Stream. Required: Yes, unless the Entity already exists Default value: Leftmost source Relation's shard count Type: Integer Valid values: [1, ...] |
Kinesis stores provide a delivery guarantee of at_least_once when producing events into a sink Entity.
Format Specific Parameters
Avro
Parameters to be used when writing records into a Stream if the associated `key.format` or `value.format` is `avro` and the default Avro schema generation needs to be changed using a base schema for the key and/or value.
When generating an Avro schema for a column using a base schema:

- If the base schema has a field with the same name and data type as the column's, then the field's definition from the base schema is used in the generated schema. This includes retaining the base schema's `doc` and `logicalType` for the field.
- If the base schema has a field with the same name as the column's but a different data type, then an Avro schema type definition is generated from the column's data type, with the field's `doc` taken from its corresponding field in the base schema.
Currently supported Schema Registries are Confluent Cloud and Confluent Platform.
Known Limitation: Confluent Schema Registry must use the default TopicNameStrategy for creating subject names.
Check CREATE SCHEMA_REGISTRY for more details.
Parameter Name | Description |
---|---|
| Name of the Store whose Schema Registry contains the Avro schema subject(s) to be used as the base schema for generating the Avro schema for the Stream's key and/or value. Required: No Default value: Current session's store name Type: Identifier Valid values: See LIST STORES. |
| Name of the subject in the Schema Registry to obtain the base schema for generating the Avro schema for the Stream's key. Required: No |
| Name of the subject in the Schema Registry to obtain the base schema for generating the Avro schema for the Stream's value columns. Required: No |
Examples
Create a new Stream with timestamp column and key/value formats
The following creates a new Stream with the name `pageviews_json`. This Stream reads from an existing topic named `pageviews` in the default store `demostore` and has a `value.format` of `JSON`. Additionally, in the `WITH` clause we specify that this Stream has a key of type `VARCHAR` and uses the `viewtime` column as its `timestamp`.
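A sketch of the DDL; the `pageviews` column list and the PRIMITIVE key format are illustrative assumptions:

```sql
CREATE STREAM pageviews_json (
  viewtime BIGINT,
  userid VARCHAR,
  pageid VARCHAR
) WITH (
  'topic' = 'pageviews',        -- existing topic in the default store demostore
  'value.format' = 'JSON',
  'key.format' = 'PRIMITIVE',   -- assumption: a primitive key serialization
  'key.type' = 'VARCHAR',       -- the key is a single VARCHAR value
  'timestamp' = 'viewtime'      -- use viewtime for time-based operations
);
```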
Create a new Stream in a specific Store
The following creates a new Stream named `pv_kinesis`. This Stream reads from an existing topic named `pageviews` in the store `kinesis_store`.
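A sketch, assuming JSON-formatted records and the same illustrative `pageviews` columns:

```sql
CREATE STREAM pv_kinesis (
  viewtime BIGINT,
  userid VARCHAR,
  pageid VARCHAR
) WITH (
  'topic' = 'pageviews',        -- existing entity in kinesis_store
  'store' = 'kinesis_store',
  'value.format' = 'JSON'       -- assumption: records are JSON
);
```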
Create a new Stream without an existing Entity
The following creates a new Stream named `visit_count`. Since its corresponding topic doesn't exist in the store `kinesis_store`, an additional topic parameter, `topic.shards`, is required to create the new Kinesis Data Stream `pv_count` in the store.
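A sketch; the column list and value format are illustrative, while `topic` names the Kinesis Data Stream to be created:

```sql
CREATE STREAM visit_count (
  userid VARCHAR,
  pageid VARCHAR,
  visits BIGINT
) WITH (
  'topic' = 'pv_count',         -- does not exist yet in kinesis_store, so it is created
  'store' = 'kinesis_store',
  'value.format' = 'JSON',      -- assumption: records are JSON
  'topic.shards' = 1            -- required because the entity is being created
);
```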
Create a new Stream for an existing Entity
The following creates a new `users` Stream for the existing `users` Entity in the current Store. This DDL implies that the name of the Stream should be used as the name of the Entity that hosts the records. This DDL also implies the original structure of the `users` Entity.
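A sketch, assuming a typical `users` schema (the column list is illustrative); note the absence of a `topic` parameter, so the Stream name doubles as the Entity name:

```sql
CREATE STREAM users (
  registertime BIGINT,
  userid VARCHAR,
  regionid VARCHAR,
  gender VARCHAR
) WITH ('value.format' = 'JSON');  -- assumption: records are JSON
```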
Create a new Stream with case-sensitive columns
The following creates a new Stream, `CaseSensitivePV`, in the Database `DataBase` and Schema `Schema2`. This Stream reads from a topic named `case_sensitive_pageviews` in the store `OtherStore` and has a `value.format` of `AVRO` and a `key.format` of `PROTOBUF`. Since `key.format` is included, `key.type` must also be provided; in this example it is `STRUCT<pageid VARCHAR>`. Note that many of the columns are in quotes, indicating they are case-sensitive. The case-insensitive column named `CaseInsensitiveCol` will be lowercased to `caseinsensitivecol` when the Relation is created. In the parameters, the `timestamp` for this Relation is also specified, so queries processing data with this Relation as the source will refer to the timestamp column `ViewTime` as the event's timestamp.
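A sketch of the DDL, using the fully qualified <database>.<schema>.<relation> form described above; the column names other than `ViewTime` and `CaseInsensitiveCol` are illustrative:

```sql
CREATE STREAM "DataBase"."Schema2"."CaseSensitivePV" (
  "ViewTime" BIGINT,
  "PageId" VARCHAR,               -- illustrative case-sensitive column
  CaseInsensitiveCol VARCHAR      -- unquoted, so stored lowercased as caseinsensitivecol
) WITH (
  'topic' = 'case_sensitive_pageviews',
  'store' = 'OtherStore',
  'value.format' = 'AVRO',
  'key.format' = 'PROTOBUF',
  'key.type' = 'STRUCT<pageid VARCHAR>',
  'timestamp' = 'ViewTime'
);
```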
Create a new Stream with `NOT NULL` column
The following creates a new Stream, `users`. Two columns in this Stream are defined with the `NOT NULL` constraint: `registertime` and `contactinfo`. This means that in any valid record from this Stream, these two columns are not allowed to contain null values.
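A sketch; the exact column types and the columns without constraints are illustrative:

```sql
CREATE STREAM users (
  registertime BIGINT NOT NULL,                        -- must always be present
  userid VARCHAR,
  interests ARRAY<VARCHAR>,
  contactinfo STRUCT<phone VARCHAR, city VARCHAR> NOT NULL  -- must always be present
) WITH ('topic' = 'users', 'value.format' = 'JSON');
```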
Create a new Stream with format specific properties for Avro
The following creates a new Stream, `usersInfo`, whose records' key and value are in `avro` format. It uses subjects from a Store called `sr_store` as the base Avro schemas to generate the Avro schemas for `usersInfo`'s key and value. The `users_data-key` subject is used to generate the key's Avro schema, and the `users_data-value` subject is used to generate the value's Avro schema for the records written into `usersInfo`.
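A sketch of what this could look like. The column list and key type are illustrative, and the three Avro base-schema parameter names below are hypothetical placeholders for the parameters described in the Avro table above:

```sql
CREATE STREAM "usersInfo" (
  registertime BIGINT,
  userid VARCHAR,
  gender VARCHAR
) WITH (
  'topic' = 'users_data',        -- with TopicNameStrategy, subjects are users_data-key / users_data-value
  'value.format' = 'AVRO',
  'key.format' = 'AVRO',
  'key.type' = 'STRUCT<userid VARCHAR>',             -- illustrative key type
  'avro.schema.store' = 'sr_store',                  -- hypothetical parameter name
  'avro.schema.key.subject' = 'users_data-key',      -- hypothetical parameter name
  'avro.schema.value.subject' = 'users_data-value'   -- hypothetical parameter name
);
```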