Configure Databricks Account
Overview
You must create an account to connect to the data sources that you want to use in your pipelines. You can configure your account from the SnapLogic Platform using the Designer or Manager.
| Target Database | Supported Cloud Location | Cloud Location in JDBC URL |
|---|---|---|
| Databricks Lakehouse Platform (DLP) | AWS | jdbc:spark://<your_instance_code>.cloud.databricks.com or jdbc:databricks://<your_instance_code>.cloud.databricks.com |
| Databricks Lakehouse Platform (DLP) | Microsoft Azure | jdbc:spark://<your_instance_code>.azuredatabricks.net or jdbc:databricks://<your_instance_code>.azuredatabricks.net |
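The URL patterns in the table above can be summarized as a small helper that composes the JDBC URL from an instance code and cloud location. This is a minimal sketch in Python; the instance code is a placeholder you would take from your own Databricks workspace, and the choice between the jdbc:spark and jdbc:databricks schemes follows the two forms listed in the table:

```python
def databricks_jdbc_url(instance_code: str, cloud: str, scheme: str = "databricks") -> str:
    """Compose a Databricks JDBC URL for the supported cloud locations.

    `instance_code` is the workspace-specific prefix (shown as
    <your_instance_code> in the table above); `scheme` may be "spark"
    or "databricks", matching the two URL forms the Snaps accept.
    """
    hosts = {
        "AWS": "cloud.databricks.com",
        "Azure": "azuredatabricks.net",
    }
    if cloud not in hosts:
        raise ValueError(f"unsupported cloud location: {cloud}")
    return f"jdbc:{scheme}://{instance_code}.{hosts[cloud]}"

# Example with a hypothetical instance code:
print(databricks_jdbc_url("abc-1234567890", "AWS"))
# jdbc:databricks://abc-1234567890.cloud.databricks.com
```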
Limitations
- With the basic authentication type for the Databricks Lakehouse Platform (DLP) reaching its end of life on July 10, 2024, SnapLogic Databricks pipelines that use this authentication type to connect to DLP instances will fail. We recommend that you reconfigure the corresponding Snap accounts to use the Personal Access Token (PAT) authentication type.
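When migrating from basic authentication, a personal access token is typically passed to the Databricks JDBC driver as connection properties rather than embedded in the URL. The sketch below is an assumption to verify against your driver version: the property names follow the Databricks JDBC driver's convention of AuthMech=3 with the username set to the literal string "token" and the PAT as the password:

```python
def pat_connection_properties(pat: str) -> dict:
    """Connection properties for PAT authentication with the Databricks
    JDBC driver (assumed convention; verify for your driver version).

    AuthMech=3 selects username/password authentication; the username
    is the literal string "token" and the password is the personal
    access token itself.
    """
    return {
        "AuthMech": "3",
        "UID": "token",
        "PWD": pat,  # the personal access token (placeholder value below)
    }

props = pat_connection_properties("dapiEXAMPLE")
print(props["UID"])
# token
```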
Supported JDBC JAR Version
We recommend that you let the Snaps use the listed JDBC JAR file version (databricks-jdbc-2.6.40) in your pipelines. However, you may choose a different JAR file version.
Snap-account compatibility
All Snaps in the Databricks Snap Pack work with the Databricks Account:
| Snaps | Databricks Account |
|---|---|
| Databricks - Select | ✓ |
| Databricks - Insert | ✓ |
| Databricks - Delete | ✓ |
| Databricks - Bulk Load | ✓ |
| Databricks - Merge Into | ✓ |
| Databricks - Execute | ✓ |
Account encryption
| Encryption Type | Description |
|---|---|
| Standard Encryption | If you are using Standard Encryption, follow the High sensitivity settings under Enhanced Encryption. |
| Enhanced Encryption | If you have the Enhanced Account Encryption feature, the following describes which fields are encrypted for each sensitivity level selected for this account: |