Prerequisites

| Requirement | Purpose |
| --- | --- |
| Unity Catalog enabled | Exposes Databricks system tables for querying. |
| Metastore admin role | Needed to enable system schemas. |
| Databricks CLI | Used to enable schemas and list workspace metadata. |

Step-by-Step Setup

1. Enable Unity Catalog

Ensure Unity Catalog is active for the target workspace. Follow Databricks’ official documentation if it is not already enabled.
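If you are unsure whether the workspace is attached to a Unity Catalog metastore, the Databricks CLI can confirm it. A sketch assuming the current `databricks` CLI, authenticated against the target workspace:

```shell
# Show the Unity Catalog metastore currently assigned to this workspace.
# Requires prior authentication, e.g. `databricks auth login`.
databricks metastores current

# A response containing a metastore_id means Unity Catalog is enabled;
# an error means no metastore is assigned to the workspace.
```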

2. Enable System Schemas
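System schemas are enabled once per metastore by a metastore admin. A sketch using the Databricks CLI's `system-schemas` command group (the metastore ID is a placeholder; the schema names match the grants in step 6):

```shell
# See which system schemas exist and whether they are enabled.
databricks system-schemas list <metastore-id>

# Enable the schemas Pelanor queries.
databricks system-schemas enable <metastore-id> billing
databricks system-schemas enable <metastore-id> compute
databricks system-schemas enable <metastore-id> lakeflow
```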

3. (Optional) Create a Warehouse

Pelanor can query any existing warehouse, but a small serverless warehouse with auto-stop is recommended for cost efficiency.
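As a sketch, such a warehouse can be created with the CLI; the warehouse name below is illustrative, and flag names follow the current `databricks warehouses create` command and may vary across CLI versions:

```shell
# Create a 2X-Small serverless SQL warehouse that auto-stops after
# 10 idle minutes ("pelanor-billing" is an example name).
databricks warehouses create \
  --name pelanor-billing \
  --cluster-size "2X-Small" \
  --auto-stop-mins 10 \
  --enable-serverless-compute \
  --warehouse-type PRO
```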

4. Create a Service Principal

  1. Open Account Console → Service principals.
  2. Click Add Service principal, assign a clear name, then Generate Secret.
  3. Save the Client ID and Secret—you will enter these in Pelanor.
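The creation step can also be scripted at the account level. A rough sketch assuming an account-scoped CLI profile (`databricks auth login --account-id …`); the display name is an example:

```shell
# Create the service principal in the Databricks account.
databricks account service-principals create --display-name "pelanor-reader"

# Note the application_id (Client ID) in the response; the secret is
# still generated from the Account Console UI as described above.
```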
5. Grant Workspace & Warehouse Access

  1. In Account Console → Workspaces, add the Service Principal to the workspace with User permission.
  2. Inside the workspace, open the warehouse → Permissions → grant Can Use.
  3. Confirm the principal has the Databricks SQL access entitlement.
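The workspace assignment can likewise be scripted; a rough sketch using the account-level `workspace-assignment` command, where both identifiers are placeholders and the principal ID is the service principal's numeric ID, not its Client ID:

```shell
# Assign the service principal to the workspace with USER permission.
databricks account workspace-assignment update <workspace-id> <principal-id> \
  --json '{"permissions": ["USER"]}'
```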
6. Grant System-Table Privileges

Run the following SQL as a user with sufficient privileges (e.g. a metastore admin), replacing the placeholder with the Service Principal's application ID (the Client ID saved in step 4). Note that principals are quoted with backticks, not single quotes:

-- Compute schema
GRANT USE SCHEMA ON SCHEMA system.compute TO `<service_principal_id>`;
GRANT SELECT ON TABLE system.compute.clusters TO `<service_principal_id>`;

-- Billing schema
GRANT USE SCHEMA ON SCHEMA system.billing TO `<service_principal_id>`;
GRANT SELECT ON TABLE system.billing.list_prices TO `<service_principal_id>`;
GRANT SELECT ON TABLE system.billing.usage TO `<service_principal_id>`;

-- Lakeflow schema
GRANT USE SCHEMA ON SCHEMA system.lakeflow TO `<service_principal_id>`;
GRANT SELECT ON TABLE system.lakeflow.job_run_timeline TO `<service_principal_id>`;
GRANT SELECT ON TABLE system.lakeflow.job_task_run_timeline TO `<service_principal_id>`;
GRANT SELECT ON TABLE system.lakeflow.jobs TO `<service_principal_id>`;

Additional Notes

  • Multiple Workspaces – Because system tables are account-wide, granting access in one workspace lets Pelanor collect spend data for the entire Databricks account; no per-workspace connection is needed.
  • Pricing Source – The current adaptor uses Databricks list prices. Custom price books are not yet supported.