
Serializing with JSON



Setup

The following shows, through examples, how a DeltaStream query converts JSON payloads to DeltaStream's Data Types when reading from a Stream or a Changelog.

The following examples use the stream defined below:

CREATE STREAM jsonExample (
  "booleanValue" BOOLEAN,
  "stringValue" VARCHAR,
  "tinyIntValue" TINYINT,
  "smallIntValue" SMALLINT,
  "intValue" INTEGER,
  "bigIntValue" BIGINT,
  "floatValue" FLOAT,
  "doubleValue" DOUBLE,
  "decimalValue" DECIMAL(4, 3),
  "dateValue" DATE,
  "timeValue" TIME,
  "timestampValue" TIMESTAMP(3),
  "timestampLtzValue" TIMESTAMP_LTZ,
  "bytesValue" VARBINARY,
  "arrayValue" ARRAY<VARCHAR>,
  "mapValue" MAP<VARCHAR, BIGINT>,
  "structValue" STRUCT<col1 BIGINT>
) WITH (
  'topic' = 'jsonExample', 'value.format' = 'JSON'
);

Simple Example

With the query:

SELECT * FROM jsonExample;
// input record
{
  "booleanValue": true,
  "stringValue": "howdy",
  "tinyIntValue": 1,
  "smallIntValue": 12,
  "intValue": 1234,
  "bigIntValue": 123456789,
  "floatValue": 12.34,
  "doubleValue": 1234.5678,
  "decimalValue": 1.123,
  "dateValue": "2019-12-26",
  "timeValue": "16:15:14",
  "timestampValue": "2011-12-03 10:15:30",
  "timestampLtzValue": "2021-05-31 16:15:14.528Z",
  "bytesValue": "aG93ZHk=",
  "arrayValue": [
    "News",
    "Travel"
  ],
  "mapValue": {
    "count": 17
  },
  "structValue": {
    "col1": 1234
  }
}
// output record
{
  "booleanValue": true,
  "stringValue": "howdy",
  "tinyIntValue": 1,
  "smallIntValue": 12,
  "intValue": 1234,
  "bigIntValue": 123456789,
  "floatValue": 12.34,
  "doubleValue": 1234.5678,
  "decimalValue": 1.123,
  "dateValue": "2019-12-26",
  "timeValue": "16:15:14",
  "timestampValue": "2011-12-03 10:15:30",
  "timestampLtzValue": "2021-05-31 16:15:14.528Z",
  "bytesValue": "aG93ZHk=",
  "arrayValue": [
    "News",
    "Travel"
  ],
  "mapValue": {
    "count": 17
  },
  "structValue": {
    "col1": 1234
  }
}

Partial Record Example

When a JSON record is missing fields specified in the CREATE STREAM DDL statement, those fields are given the value NULL in the output record. Conversely, fields present in the JSON record but not specified in the CREATE STREAM DDL statement are ignored.

With the query:

SELECT "booleanValue", "stringValue", "intValue" FROM jsonExample;
// input record
{
  "booleanValue": true,
  "stringValue": "howdy",
  "someOtherValue": 123
}

// output record
{
  "booleanValue": true,
  "stringValue": "howdy",
  "intValue": null
}
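The projection rule above can be sketched in a few lines of Python. This is an illustration of the behavior, not DeltaStream's implementation: selected fields that are absent from the record become null, and undeclared fields are dropped.

```python
import json

def project(record: dict, selected_fields: list) -> dict:
    """Keep only the selected fields; missing ones become None (NULL).

    Illustrative sketch only -- not DeltaStream's actual code.
    """
    return {field: record.get(field) for field in selected_fields}

raw = json.loads('{"booleanValue": true, "stringValue": "howdy", "someOtherValue": 123}')
out = project(raw, ["booleanValue", "stringValue", "intValue"])
print(json.dumps(out))
# {"booleanValue": true, "stringValue": "howdy", "intValue": null}
```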

Mismatched Types Example

Boolean

With the query:

SELECT "booleanValue" FROM jsonExample;
// input record
{ "booleanValue": true }

// output record
{ "booleanValue": true }
// input record
{ "booleanValue": false }

// output record
{ "booleanValue": false }
// input record
{ "booleanValue": "true" }

// output record
{ "booleanValue": true }
// input record
{ "booleanValue": "false" }

// output record
{ "booleanValue": false }
// input record
{ "booleanValue": "abc" }

// output record
{ "booleanValue": false }
// input record
{ "booleanValue": 123 }

// output record
{ "booleanValue": false }
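The coercion the examples above imply can be mimicked with a small helper. This is a sketch inferred from the samples, not DeltaStream's parser: JSON booleans pass through, the string "true" parses to true, and any other string or number falls back to false.

```python
def to_boolean(value) -> bool:
    """Coerce a JSON value to BOOLEAN, mirroring the examples above.

    Sketch only: booleans pass through unchanged; the string "true"
    becomes true; every other string or number becomes false.
    """
    if isinstance(value, bool):
        return value
    if isinstance(value, str):
        return value == "true"
    return False

print(to_boolean("true"))  # True
print(to_boolean("abc"))   # False
print(to_boolean(123))     # False
```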

Character String

With the query:

SELECT "stringValue" FROM jsonExample;
// input record
{ "stringValue": "abc" }

// output record
{ "stringValue": "abc" }
// input record
{ "stringValue": true }

// output record
{ "stringValue": "true" }
// input record
{ "stringValue": 123 }

// output record
{ "stringValue": "123" }
// input record
{ "stringValue": 123.456 }

// output record
{ "stringValue": "123.456" }

Numeric

With the query:

SELECT "tinyIntValue", "doubleValue", "decimalValue" FROM jsonExample;
// input record
{ "tinyIntValue": 1, "doubleValue": 123.1, "decimalValue": 1.123 }

// output record
{ "tinyIntValue": 1, "doubleValue": 123.1, "decimalValue": 1.123 }
// input record
{ "tinyIntValue": "1", "doubleValue": "123.1", "decimalValue": "1.123" }

// output record
{ "tinyIntValue": 1, "doubleValue": 123.1, "decimalValue": 1.123 }
// input record ("decimalValue" value is larger than defined precision)
{ "tinyIntValue": 1, "doubleValue": 123.1, "decimalValue": 12.123 }

// output record
{ "tinyIntValue": 1, "doubleValue": 123.1, "decimalValue": null }
// input record
{ "tinyIntValue": "abc", "doubleValue": 123.1, "decimalValue": 1.123 }

// deserialization error because "abc" can't be parsed as a numeric value
// input record
{ "tinyIntValue": 130, "doubleValue": 123.1, "decimalValue": 1.123 }

// deserialization error because 130 is out of range for TINYINT values
// input record
{ "tinyIntValue": 1.1, "doubleValue": 123.1, "decimalValue": 1.123 }

// deserialization error because TINYINT cannot be a floating point value
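The numeric rules above can be sketched as follows. This is an illustration inferred from the examples, not the engine's code: numeric-looking strings are accepted, TINYINT must be an integer in -128..127, and a DECIMAL(4, 3) value whose total digit count exceeds the declared precision becomes null.

```python
from decimal import Decimal

def to_tinyint(value) -> int:
    """Parse a JSON value as TINYINT (-128..127); raise on failure.

    Sketch only: floating-point input, non-numeric strings, and
    out-of-range values are treated as deserialization errors.
    """
    if isinstance(value, float):
        raise ValueError("TINYINT cannot be a floating point value")
    n = int(value)  # raises ValueError for strings like "abc" or "1.1"
    if not -128 <= n <= 127:
        raise ValueError(f"{n} is out of range for TINYINT values")
    return n

def to_decimal(value, precision: int = 4, scale: int = 3):
    """Parse DECIMAL(precision, scale); a value with more total digits
    than the declared precision maps to None (NULL), as shown above."""
    d = Decimal(str(value)).quantize(Decimal(1).scaleb(-scale))
    return None if len(d.as_tuple().digits) > precision else d

print(to_tinyint("1"))       # 1
print(to_decimal(1.123))     # 1.123
print(to_decimal(12.123))    # None (5 digits exceed precision 4)
```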

Date and Time

With the query:

SELECT 
  "dateValue", 
  "timeValue", 
  "timestampValue", 
  "timestampLtzValue" 
FROM 
  jsonExample;
// input record
{
  "dateValue": "2019-12-26",
  "timeValue": "16:15:14",
  "timestampValue": "2011-12-03 10:15:30",
  "timestampLtzValue": "2021-05-31 16:15:14.528Z"
}

// output record
{
  "dateValue": "2019-12-26",
  "timeValue": "16:15:14",
  "timestampValue": "2011-12-03 10:15:30",
  "timestampLtzValue": "2021-05-31 16:15:14.528Z"
}
// input record
{ "dateValue": 1234 }

// deserialization error because date and time values must be parsed from Strings
// input record
{ "dateValue": "2019-04-31" }

// deserialization error due to invalid date (April only has 30 days)
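Based on the samples above, DATE values appear to be parsed from Strings in yyyy-MM-dd form, and non-String or invalid dates are deserialization errors. A Python sketch of that parsing (the accepted format is an assumption inferred from the examples, not documented behavior):

```python
from datetime import date, datetime

def to_date(value) -> date:
    """Parse a DATE column from its String form.

    Sketch only: non-String input and impossible dates such as
    2019-04-31 raise, mirroring the deserialization errors above.
    """
    if not isinstance(value, str):
        raise ValueError("date and time values must be parsed from Strings")
    return datetime.strptime(value, "%Y-%m-%d").date()

print(to_date("2019-12-26"))  # 2019-12-26
```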

Binary String

With the query:

SELECT "bytesValue" FROM jsonExample;
// input record
{ "bytesValue": "aG93ZHk=" }

// output record
{ "bytesValue": "aG93ZHk=" }
// input record
{ "bytesValue": 1 }

// deserialization error because we expect a String value
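The VARBINARY samples suggest binary values travel as base64-encoded Strings ("aG93ZHk=" decodes to the bytes of "howdy"). A sketch of that decoding, with non-String input treated as an error as in the example above:

```python
import base64

def to_varbinary(value) -> bytes:
    """Decode a VARBINARY column from its base64 String form.

    Sketch only: non-String input is a deserialization error.
    """
    if not isinstance(value, str):
        raise ValueError("expected a base64-encoded String value")
    return base64.b64decode(value)

print(to_varbinary("aG93ZHk="))  # b'howdy'
```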

Constructed Data Types

With the query:

SELECT "arrayValue", "mapValue", "structValue" FROM jsonExample;
// input record
{
  "arrayValue": [
    "News",
    "Travel"
  ],
  "mapValue": {
    "count": 17
  },
  "structValue": {
    "col1": 1234
  }
}

// output record
{
  "arrayValue": [
    "News",
    "Travel"
  ],
  "mapValue": {
    "count": 17
  },
  "structValue": {
    "col1": 1234
  }
}

Array

// input record
{ "arrayValue": [] }

// output record
{
  "arrayValue": [],
  "mapValue": null,
  "structValue": null
}
// input record
{
  "arrayValue": [ 17 ]
}

// output record
{
  "arrayValue": [ "17" ],
  "mapValue": null,
  "structValue": null
}

Map

// input record
{
  "mapValue": {
    "count": 17,
    "index": 102
  }
}

// output record
{
  "arrayValue": null,
  "mapValue": {
    "count": 17,
    "index": 102
  },
  "structValue": null
}
// input record
{
  "mapValue": {
    "count": "howdy"
  }
}

// deserialization error because "howdy" can't be parsed as a BIGINT value
// input record
{
  "mapValue": {}
}

// output record
{
  "arrayValue": null,
  "mapValue": {},
  "structValue": null
}

Struct

// input record
{
  "structValue": {
    "col1": 1234,
    "col2": 5678
  }
}

// output record
{
  "arrayValue": null,
  "mapValue": null,
  "structValue": {
    "col1": 1234
  }
}
// input record
{
  "structValue": {
    "col2": 5678
  }
}

// output record
{
  "arrayValue": null,
  "mapValue": null,
  "structValue": {
    "col1": null
  }
}
// input record
{
  "structValue": {}
}

// output record
{
  "arrayValue": null,
  "mapValue": null,
  "structValue": {
    "col1": null
  }
}