CREATE TABLE AS SELECT


Syntax

CREATE TABLE
    table_name
[WITH (table_parameter = value [, ... ])] 
AS select_statement;

Description

CREATE TABLE AS is a statement that:

  • Generates a DDL statement to create a new table.

  • Launches a new query to write the results of the SELECT statement into the newly created table.
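
The launched query runs continuously until it is stopped. As a rough sketch, it can be inspected and terminated with the query commands covered elsewhere in this reference (the exact TERMINATE QUERY argument form is documented on its own page):

LIST QUERIES;
-- <query_id> is a placeholder for the ID shown by LIST QUERIES.
TERMINATE QUERY <query_id>;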

Arguments

table_name

This specifies the name of the new table. Optionally, use <database_name>.<schema_name> as a prefix to the name to create the relation in that namespace. If the name is case-sensitive, you must wrap it in double quotes; otherwise the system uses the lowercase form of the name.
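
For example, a minimal sketch, assuming a database named db1, a schema named public, and a source stream pageviews already exist:

-- The quoted name preserves its case; an unquoted name would be stored in lowercase.
CREATE TABLE db1.public."PageviewCopy" AS
SELECT * FROM pageviews;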

WITH (<table_parameter> = <value> [, ... ])

Optionally, this clause specifies Table Parameters.

select_statement

This statement specifies the SELECT statement to run.

Table Parameters

store

The name of the data store that hosts the entity for this table. Required: No Default value: The user's default data store. Type: String Valid values: See LIST STORES.

Snowflake-Specific Parameters

snowflake.db.name

The name of the Snowflake database that would host the Snowflake table. Required: Yes Default value: None Type: String Valid values: Database names available from LIST ENTITIES.

snowflake.schema.name

The name of the Snowflake schema that would host the Snowflake table. Required: Yes Default value: None Type: String Valid values: Schema names available from LIST ENTITIES under the snowflake.db.name database name.

snowflake.table.name

The name of the Snowflake table to use when creating the table. If the table doesn't exist in the store, a table with the name table_name is created in the corresponding store. Required: No Default value: table_name Type: String

snowflake.buffer.millis

The amount of time, in milliseconds, to buffer events in the sink before writing the data into the corresponding Snowflake table. Note that the longer events are buffered, the larger the internal state of the query gets; depending on the volume and size of the events for the corresponding query, memory limitations may be reached. Required: No Default value: 1000 Type: Long Valid values: (0, ...]

Snowflake stores provide a delivery guarantee of at_least_once when producing events into a sink table. Snowflake stores provide an insert-only mode when writing to Snowflake tables.
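
As an illustrative sketch only, the buffer can be raised from its default in the WITH clause; the store and database names below are placeholders borrowed from the Snowflake examples further down:

CREATE TABLE "PV_BUFFERED" WITH (
  'store' = 'sfstore',
  'snowflake.db.name' = 'DELTA_STREAMING',
  'snowflake.schema.name' = 'PUBLIC',
  'snowflake.buffer.millis' = '5000'
) AS SELECT * FROM pageviews;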

Databricks-Specific Parameters

databricks.catalog.name

The name of the Databricks catalog that would host the Databricks table. Required: Yes Default value: None Type: String Valid values: Catalog names available from LIST ENTITIES.

databricks.schema.name

The name of the Databricks schema that would host the Databricks table. Required: Yes Default value: None Type: String Valid values: Schema names available from LIST ENTITIES under the databricks.catalog.name catalog name.

databricks.table.name

The name of the Databricks table to use when creating the table. A table with the name table_name is created in the corresponding store. If that table already exists, an error is reported back to the user. Required: No Default value: table_name Type: String

table.data.file.location

The S3 directory location for the Delta-formatted data to be written. The credentials for writing to S3 are given during store creation (see CREATE STORE). Note that the S3 bucket in the location specified by this parameter must match the databricks.cloud.s3.bucket property defined in the store. Required: Yes Default value: None Type: String

Databricks stores provide a delivery guarantee of exactly_once when producing events into a sink table. Databricks stores provide an insert-only mode when writing to Databricks tables.

Postgres-Specific Parameters

postgresql.db.name

See CDC source parameters for the same parameter name. Required: Yes

postgresql.schema.name

See CDC source parameters for the same parameter name. Required: Yes

postgresql.table.name

See CDC source parameters for the same parameter name. Required: No

Examples - Snowflake

Create a copy of a stream in a Snowflake table

The following creates a replica of the source stream pageviews in the Snowflake table PV_TABLE:

CREATE TABLE "PV_TABLE" WITH (
  'store' = 'sfstore',
  'snowflake.db.name' = 'DELTA_STREAMING',
  'snowflake.schema.name' = 'PUBLIC'
) AS SELECT * FROM pageviews;

Create a stream of changes for a changelog in a Snowflake table

The following CTAS query creates a new Snowflake table to store incremental changes resulting from a grouping aggregation on the transactions stream:

CREATE TABLE "CC_TYPE_USAGE" WITH (
  'store' = 'sfstore',
  'snowflake.db.name' = 'DELTA_STREAMING',
  'snowflake.schema.name' = 'PUBLIC'
) AS SELECT
  cc_type AS "CREDIT_TYPE",
  tx_time AS "TRANSACTION_TS",
  tx_id AS "TRANSACTION_ID"
FROM transactions
GROUP BY cc_type;

This query stores all changes to the grouping column cc_type in the sink table CC_TYPE_USAGE.

Examples - PostgreSQL

Assumptions for the following 2 examples:

  • pageviews is a stream you have already defined.

  • ps_store is a PostgreSQL data store you have already created.

Create a new table in a PostgreSQL data store - example 1

In this example, you create a new table, named pageviews, in the given PostgreSQL data store under the public schema. The table has 3 columns:

  1. viewtime

  2. uid

  3. pageid.

The query writes its results into this table as its sink.

CREATE TABLE pg_table WITH (
   'store'='ps_store', 
   'postgresql.db.name'='demo', 
   'postgresql.schema.name'='public', 
   'postgresql.table.name'='pageviews') AS 
SELECT viewtime, userid AS uid, pageid 
FROM pageviews;

Create a new table in a PostgreSQL data store - example 2

This query creates a new table, named pagevisits, in the given PostgreSQL data store under the public schema. The table has 2 columns:

  1. userid

  2. cnt

The query writes its results into this table.

CREATE TABLE pagevisits WITH (
       'store'='ps_store', 
       'postgresql.db.name'='demo', 
       'postgresql.schema.name'='public') AS 
SELECT userid, count(*) AS cnt 
FROM pageviews 
GROUP BY userid;

Note: Since the query includes a GROUP BY clause, the userid column, which is the grouping column, is considered the primary key for the results. The pagevisits table in PostgreSQL is created accordingly, with userid as its primary key column.
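
As a rough sketch of the resulting table (the column types below are assumptions; the actual types are derived from the schema of the query result):

-- Hypothetical shape of the pagevisits table created in PostgreSQL.
CREATE TABLE public.pagevisits (
  userid VARCHAR PRIMARY KEY,
  cnt    BIGINT
);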

Examples - Databricks

Create a copy of a stream in a Databricks table

The following creates a replica of the source stream pageviews in the Databricks table pageviews_db:

CREATE TABLE pageviews_db WITH (
  'store' = 'databricks_store', 
  'databricks.catalog.name' = 'catalog1', 
  'databricks.schema.name' = 'schema1', 
  'databricks.table.name' = 'pageviews', 
  'table.data.file.location' = 's3://mybucket/test/0/pageviews'
) AS SELECT * FROM pageviews;

Upon issuing this query, a Databricks table named catalog1.schema1.pageviews is created, using s3://mybucket/test/0/pageviews as its external location. The query writes Delta-formatted Parquet files and updates the Delta log in that S3 location.
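
Once the query is running, the table can be read from Databricks like any other table in the catalog, for example:

-- Standard Databricks SQL, run from a notebook or SQL warehouse.
SELECT * FROM catalog1.schema1.pageviews LIMIT 10;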

