Databricks
Databricks is a lakehouse platform in the cloud. Utilizing technologies such as Apache Spark, Delta Lake, and MLflow, Databricks combines the functionality of data warehouses and data lakes to offer an open and unified platform for data and AI.
This document walks you through setting up Databricks as a data store in DeltaStream.
Set up a new Databricks workspace (steps 1 and 2) or use an existing Databricks workspace.
Have an AWS account whose S3 hosts your Delta Lake data. If you don't have an account, you can sign up for one.
Navigate to your Databricks workspace.
In the top right of the screen, click your account name and select User Settings.
In the menu bar that displays, click Developer, and under Access Tokens, click Manage.
Click Generate new token. Add an optional comment for the token and then choose a lifetime for the token. Then click Generate to create the token.
Navigate to your Databricks workspace.
In the lefthand navigation, click SQL Warehouses. A list of the existing SQL warehouses in your workspace displays. Databricks creates a starter warehouse for you.
To create a new SQL warehouse, click Create SQL warehouse. To edit an existing SQL warehouse, to the right of the warehouse you want, click the 3 vertical dots. Then click Edit.
To create a new AWS S3 bucket:
Click Create bucket.
Enter a name for your S3 bucket and then at the bottom click Create bucket to create your new S3 bucket.
Navigate to your Databricks workspace.
At the top of the page, click + Add, and from the list that displays click Add an external location.
Click AWS Quickstart to set up the Databricks and S3 connection, and then click Next. Advanced users can opt to set up their external location manually instead, but this tutorial continues with the AWS Quickstart option.
Enter the name of an existing S3 bucket to link to your Databricks workspace. Then click Generate new token. Copy that token, then click Launch in Quickstart. This brings you back to the AWS console and displays a page called Quick create stack.
On the AWS Quick create stack page, in the Databricks Personal Access Token field, enter the access token you copied in step 5. Then, at the bottom of the page, click to acknowledge that AWS CloudFormation might create IAM resources with custom names. Then click Create stack to launch stack initialization.
In a few minutes, you'll see the stack creation complete.
This step is relevant if you receive an error message such as Metastore Storage Root URL Does Not Exist. In this case:
Ensure you have an S3 bucket to use for metastore-level managed storage in AWS. You can use the bucket created in the previous step, or follow the steps above to create a new S3 bucket.
If you're creating a new metastore, click Create metastore and follow the prompts to set the name, region, S3 path, and workspaces for the metastore.
If you're editing an existing metastore, click on the name of the metastore you wish to edit. From this page you can assign new workspaces, set an S3 path, edit the metastore admin, and take other actions.
From the menu that displays, click Databricks. The Add Store window opens.
Enter the authentication and connection parameters. These include:
Store Name – A unique name to identify your DeltaStream store. (For more details see Store). Store names are limited to a maximum of 255 characters. Only alphanumeric characters, dashes, and underscores are allowed.
Store Type – Databricks.
Access Region – DeltaStream access region to associate with the store; indicates where data is stored or streams through. (For more details see Region).
Warehouse ID – The ID for a Databricks SQL warehouse in your Databricks workspace. (For more details see Add Databricks SQL Warehouse).
Databricks Cloud Region – The AWS region in which the Cloud S3 Bucket exists.
Cloud S3 Bucket – An AWS S3 bucket that is connected as an external location in your Databricks workspace (see Add S3 Bucket as External Location for Data).
App Token – The Databricks access token for your user in your Databricks workspace. (For more details see Create Databricks App Token.)
Access Key ID – Access key associated with the AWS account in which the Cloud S3 Bucket exists.
Secret Access Key – Secret access key associated with the AWS account in which the Cloud S3 Bucket exists.
Click Add.
Your Databricks store displays on the Resources page in your list of stores.
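If you prefer working in SQL, a store can typically also be attached with a CREATE STORE statement rather than the web form. The sketch below is only illustrative: the property names and values (uris, databricks.app_token, databricks.warehouse_id, the S3 bucket properties, and the AWS credentials) are assumptions, so check the Store reference for the exact parameters before running it.

```sql
-- Hedged sketch: attach the Databricks store with a CREATE STORE statement
-- instead of the web form. Every property name below is an assumption made
-- for illustration; consult the DeltaStream Store reference for exact names.
CREATE STORE Databricks_Test_Store WITH (
  'type' = DATABRICKS,
  'access_region' = "AWS us-east-1",
  'uris' = 'https://dbc-12345678-90ab.cloud.databricks.com',   -- workspace URL
  'databricks.app_token' = '<databricks-access-token>',        -- assumed name
  'databricks.warehouse_id' = '<sql-warehouse-id>',            -- assumed name
  'databricks.cloud.s3.bucket' = '<external-location-bucket>', -- assumed name
  'databricks.cloud.region' = 'AWS us-east-1',                 -- assumed name
  'aws.access_key_id' = '<access-key-id>',                     -- assumed name
  'aws.secret_access_key' = '<secret-access-key>'              -- assumed name
);
```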
Click Databricks_Test_Store. The store page displays, with the Catalogs tab active. Here you can view a list of the existing catalogs in your Databricks workspace.
To see the schemas that exist in a particular catalog, click the catalog you want.
(Optional) Create a new schema. To do this:
Select + Add Schema. In the window that displays, enter a name for the new schema and then click Add. The new schema now displays in the list.
To see the tables that exist under a particular schema, click the schema you want.
Click Run.
To view the new table created by the above CTAS, navigate to your Databricks store --> Catalogs --> your target catalog --> your target schema --> pv_table. Of course, if you wrote your CTAS such that the store/catalog/schema/table names are different, navigate accordingly.
To view a sample of the data in your Databricks table, click Print.
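As a cross-check from the Databricks side, you can also sample the table from a notebook or the SQL editor in your Databricks workspace. A minimal sketch, where my_catalog and my_schema are placeholder names for the catalog and schema your query writes to:

```sql
-- Hedged sketch: sample the sink table directly in Databricks.
-- my_catalog and my_schema are placeholders; substitute the catalog and
-- schema that your CTAS query actually targets.
SELECT *
FROM my_catalog.my_schema.pv_table
LIMIT 10;
```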
Save or download the newly generated token value; you will need it when adding Databricks as a DeltaStream store.
For more details on generating access tokens for a workspace, see the Databricks documentation.
Configure your SQL warehouse with your preferred specifications. (To learn more about configuring your SQL warehouse, see the Databricks documentation.) For the best experience, we recommend choosing serverless as the SQL warehouse type.
Click Save to create the SQL warehouse. Record the warehouse ID on the overview page; you will need this ID when adding Databricks as a DeltaStream store. You can also access the warehouse overview by clicking the name of the SQL warehouse on the SQL Warehouses landing page from step 1.
In the AWS Management Console, navigate to the S3 page.
For more details, see the Databricks documentation.
In the lefthand navigation, click Catalog. This displays a view of your catalogs.
For more information on external locations, see the Databricks documentation.
Navigate to the Databricks account console. From here, either create a new metastore or edit existing metastores.
For more information on creating a Unity Catalog metastore, see the Databricks documentation.
Open DeltaStream. In the lefthand navigation, click Resources and then click Add Store +.
URL – The URL of your Databricks workspace. Find this by navigating to the Databricks account console and clicking the workspace you wish to use.
For the steps below, assume you already have a Stream called pageviews defined, which is backed by a topic in Kafka. Assume also there is a Databricks store labeled Databricks_Test_Store. (For more details see Adding Databricks as a DeltaStream Store.) Now perform a simple filter on the pageviews stream and sink the results into Databricks.
In the lefthand navigation, click Resources. This displays a list of the existing stores.
(Optional) Create a new catalog. To do this, click + Add Catalog. When prompted, enter a name for the new catalog and click Add. The new catalog displays in the list.
Important: If you receive the error message Metastore Storage Root URL Does Not Exist, verify that you've properly configured your Unity Catalog metastore.
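If you prefer, you can also create the catalog and schema from the Databricks side with plain SQL in a notebook or the SQL editor; they should then appear in the store's Catalogs tab in DeltaStream. A minimal sketch, with my_catalog and my_schema as placeholder names:

```sql
-- Hedged sketch: create a Unity Catalog catalog and schema directly in Databricks.
-- my_catalog and my_schema are placeholder names used for illustration.
CREATE CATALOG IF NOT EXISTS my_catalog;
CREATE SCHEMA IF NOT EXISTS my_catalog.my_schema;
```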
In the lefthand navigation, click Workspace.
In the SQL pane of your workspace, write the query to ingest from pageviews and output to a new table titled pv_table.
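A minimal sketch of such a query follows. The WITH property names and the pageviews column list (viewtime, userid, pageid) are assumptions for illustration; adjust them to match your own stream definition and the DeltaStream CREATE TABLE AS SELECT reference.

```sql
-- Hedged sketch: filter the pageviews stream and sink the results into a new
-- Databricks table named pv_table. Property names and columns are assumptions;
-- verify them against the DeltaStream CTAS reference before running.
CREATE TABLE pv_table WITH (
  'store' = 'Databricks_Test_Store',        -- the Databricks store created earlier
  'databricks.catalog.name' = 'my_catalog', -- assumed property; target Unity Catalog catalog
  'databricks.schema.name' = 'my_schema',   -- assumed property; target schema
  'table.data.file.location' = 's3://<external-location-bucket>/pv_table'  -- assumed property; path under the external location
) AS
SELECT viewtime, userid, pageid
FROM pageviews
WHERE userid = 'User_2';
```

Once the query is running, matching records from pageviews are continuously written into the Databricks table.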
In the lefthand navigation, click Queries to see the existing queries, including the query from the step immediately prior. It may take a few moments for the query to transition into the Running state. Keep refreshing your screen until the query transitions.
In the lefthand navigation, click Resources. This displays a list of the existing stores.