Use Databricks as a destination in Mage to load structured data into your Databricks workspace. Mage integrates with Databricks SQL warehouses and uses Unity Catalog for table management. This destination is ideal for exporting transformed pipeline data to Databricks for analytics, ML workloads, or data lakehouse operations.

Required Configuration

Provide the following credentials when configuring Databricks as a destination:
| Key | Description | Example Value | Required |
| --- | --- | --- | --- |
| `access_token` | Personal access token to authenticate with Databricks | `dapi123abc...` | Yes |
| `server_hostname` | Hostname of your Databricks workspace | `dbc-123456.cloud.databricks.com` | Yes |
| `http_path` | HTTP path of the Databricks SQL warehouse or cluster | `/sql/1.0/warehouses/abc123` | Yes |
| `schema` | Name of the schema (database) within the catalog | `analytics` | Yes |
| `table` | Name of the table to write data into | `user_events` | Yes |
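
A minimal configuration might look like the sketch below. The keys come from the table above; the values are placeholders to replace with your own workspace details.

```yaml
access_token: dapi123abc...                      # personal access token from workspace settings
server_hostname: dbc-123456.cloud.databricks.com # workspace hostname, without https://
http_path: /sql/1.0/warehouses/abc123            # from the SQL warehouse connection details
schema: analytics                                # schema (database) within the catalog
table: user_events                               # table the pipeline output is written into
```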

Optional Configuration

| Key | Description | Example Value | Default |
| --- | --- | --- | --- |
| `catalog` | Name of the Unity Catalog to use | `workspace` | - |
| `skip_schema_creation` | If `true`, Mage won't run `CREATE SCHEMA`. Useful if the schema already exists. | `true` | `false` |
| `lower_case` | If `true`, all column names will be lowercased. | `true` | `true` |
| `allow_reserved_words` | If `true`, Mage will allow SQL reserved words as column names. | `false` | `false` |
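Extending the earlier sketch, the optional keys sit alongside the required ones. The values here are illustrative, not settings you must apply.

```yaml
access_token: dapi123abc...
server_hostname: dbc-123456.cloud.databricks.com
http_path: /sql/1.0/warehouses/abc123
catalog: workspace             # Unity Catalog name (optional)
schema: analytics
table: user_events
skip_schema_creation: true     # schema already exists, so skip CREATE SCHEMA
lower_case: true               # lowercase all column names (the default)
allow_reserved_words: false    # disallow SQL reserved words as column names (the default)
```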

Notes

- Unity Catalog: Set `catalog` to the name of your Unity Catalog. This field is optional and determines which catalog Mage uses for table management in Databricks.
- Access Token: Generate a personal access token from your Databricks workspace settings.
- HTTP Path: Find the HTTP path in the connection details of your SQL warehouse or cluster.
- Permissions: Ensure your access token has permission to create schemas and write to tables in the specified catalog and schema.