Databricks MCP server

Databricks is a data lakehouse platform that unifies data engineering, analytics, and machine learning. With this MCP server, AI agents can manage clusters, orchestrate ETL jobs, run SQL queries, manage workspaces, and automate data pipelines through natural language commands.

Setting up an MCP server

This article covers the standard steps for creating an MCP server in AI Gateway and connecting it to an AI client. The steps are the same for every integration — application-specific details (API credentials, OAuth endpoints, and scopes) are covered in the individual application pages.

Before you begin

You'll need:

  • Access to AI Gateway with permission to create MCP servers
  • API credentials for the application you're connecting (see the relevant application page for what to collect)

Create an MCP server

Find the API in the catalog

  1. Sign in to AI Gateway and select MCP Servers from the left navigation.
  2. Select New MCP Server.
  3. Search for the application you want to connect, then select it from the catalog.

Configure the server

  1. Enter a Name for your server — something descriptive that identifies both the application and its purpose (for example, "Zendesk Support — Prod").
  2. Enter a Description so your team knows what the server is for.
  3. Set the Timeout value. 30 seconds works for most APIs; increase to 60 seconds for APIs that return large payloads.
  4. Toggle Production mode on if this server will be used in a live workflow.
  5. Select Next.

Configure authentication

Enter the authentication details for the application. This varies by service — see the Authentication section of the relevant application page for the specific credentials, OAuth URLs, and scopes to use.

Configure security

  1. Set any Rate limits appropriate for your use case and the API's own limits.
  2. Enable Logging if you want AI Gateway to record requests and responses for auditing.
  3. Select Next.

Deploy

Review the summary, then select Deploy. AI Gateway provisions the server and provides a server URL you'll use when configuring your AI client.


Connect to an AI client

Once your server is deployed, add it to the AI client your team uses. Follow your client's setup instructions to register the server URL from the previous step.

Tips

  • You can create multiple MCP servers for the same application — for example, a read-only server for reporting agents and a read-write server for automation workflows.
  • If you're unsure which OAuth scopes to request, start with the minimum read-only set and add write scopes only when needed. Most application pages include scope recommendations.
  • You can edit a server's name, description, timeout, and security settings after deployment without redeploying.

Authentication

Databricks supports OAuth 2.0 authentication. Configure it in the Databricks Account Console by registering an app connection and obtaining OAuth credentials.

For Databricks on AWS (on GCP, the account host is accounts.gcp.databricks.com instead of accounts.cloud.databricks.com):

  • Authorization URL: https://accounts.cloud.databricks.com/oidc/accounts/{account-id}/v1/authorize
  • Token URL: https://accounts.cloud.databricks.com/oidc/accounts/{account-id}/v1/token
  • Scopes: all-apis offline_access

For Azure Databricks:

  • Authorization URL: https://{workspace-url}/oidc/v1/authorize
  • Token URL: https://{workspace-url}/oidc/v1/token
  • Scopes: user_impersonation offline_access

Replace placeholders with your actual Account ID and workspace URL.
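As a sketch of the machine-to-machine variant of this flow, the snippet below exchanges service-principal credentials for an access token at the AWS token URL above using the OAuth 2.0 client_credentials grant (interactive user flows use the authorization URL instead). The account ID, client ID, and secret are placeholders; stdlib urllib is used to keep the example dependency-free.

```python
import base64
import json
import urllib.parse
import urllib.request

def token_url(account_id: str) -> str:
    """Build the account-level OAuth token endpoint (AWS host)."""
    return (f"https://accounts.cloud.databricks.com"
            f"/oidc/accounts/{account_id}/v1/token")

def fetch_token(account_id: str, client_id: str, client_secret: str) -> str:
    """Exchange service-principal credentials for an access token
    via the client_credentials grant with the all-apis scope."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "scope": "all-apis",
    }).encode()
    creds = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
    req = urllib.request.Request(
        token_url(account_id),
        data=body,
        headers={"Authorization": f"Basic {creds}",
                 "Content-Type": "application/x-www-form-urlencoded"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]
```

The returned access token is what AI Gateway presents as the Bearer token on subsequent REST calls; with offline_access in the interactive flow, a refresh token is issued alongside it.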

Available tools

The Databricks MCP server exposes cluster management, job orchestration, workspace operations, DBFS access, SQL queries, and user management APIs.

  • Cluster Management: Create, start, stop, and configure clusters; monitor status and events; resize and scale clusters
  • Job Orchestration: Create, schedule, and run jobs; monitor runs and get logs; manage job configurations
  • Workspace Operations: Import and export notebooks; manage folders; organize workspace structure
  • DBFS Operations: Upload and download files; manage directories; transfer data between systems
  • SQL Warehouses: Query data with SQL; manage warehouse lifecycle; create dashboards and reports
  • User & Group Management: Create and update users; manage group membership; configure permissions and roles
  • Secrets Management: Create secret scopes; store and manage credentials; control access to secrets
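To make the cluster-management tools concrete, here is a minimal sketch of the underlying REST call that lists clusters (GET /api/2.0/clusters/list). The workspace URL and token are placeholders you would substitute with your own values:

```python
import json
import urllib.request

def api_url(workspace_url: str, path: str) -> str:
    """Join a workspace URL and a REST API path."""
    return workspace_url.rstrip("/") + path

def list_clusters(workspace_url: str, token: str) -> list:
    """Call the Clusters API and return the cluster descriptions.
    Requires a valid OAuth access token (see Authentication above)."""
    req = urllib.request.Request(
        api_url(workspace_url, "/api/2.0/clusters/list"),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp).get("clusters", [])
```

The other tool groups follow the same pattern against their respective endpoints (for example, /api/2.1/jobs for job orchestration and /api/2.0/dbfs for file operations).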

Databricks tips

  • Use auto-scaling to handle variable workloads efficiently, and set auto-termination to avoid paying for idle clusters.
  • Choose instance types suited to your workload.
  • Schedule jobs during off-peak hours when possible.
  • Implement error handling with retries, and use conditional logic to skip steps that aren't needed.
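The retry advice can be sketched as a small wrapper around the Jobs API run-now endpoint (POST /api/2.1/jobs/run-now) that retries transient failures with exponential backoff; the workspace URL, token, and job ID are placeholders:

```python
import json
import time
import urllib.error
import urllib.request

def backoff_schedule(base_s: float, attempts: int) -> list:
    """Exponential backoff delays: base, 2*base, 4*base, ..."""
    return [base_s * 2 ** i for i in range(attempts)]

def run_job_with_retries(workspace_url: str, token: str, job_id: int,
                         max_attempts: int = 3, base_backoff_s: float = 5.0) -> int:
    """Trigger a job run, retrying transient network errors.
    Returns the run_id of the triggered run."""
    payload = json.dumps({"job_id": job_id}).encode()
    delays = backoff_schedule(base_backoff_s, max_attempts - 1)
    for attempt in range(max_attempts):
        try:
            req = urllib.request.Request(
                workspace_url.rstrip("/") + "/api/2.1/jobs/run-now",
                data=payload,
                headers={"Authorization": f"Bearer {token}",
                         "Content-Type": "application/json"},
            )
            with urllib.request.urlopen(req) as resp:
                return json.load(resp)["run_id"]
        except urllib.error.URLError:
            if attempt == max_attempts - 1:
                raise  # out of retries; surface the error
            time.sleep(delays[attempt])
```

Retrying only network-level errors, as here, avoids re-running a job that failed for a deterministic reason such as a bad configuration.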

  • Create consistent folder structures for team collaboration, and use naming conventions for notebooks and jobs.
  • Maintain an archive folder for old work.
  • Batch small files before uploading to improve performance.
  • Use /FileStore/ for files that need to persist.
  • Clean up temporary files regularly.
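For small uploads, the single-call form of the DBFS put endpoint (POST /api/2.0/dbfs/put) takes base64-encoded contents, which is why batching small files helps: each file costs a full request. A sketch of building such a request, with placeholder workspace URL and token:

```python
import base64
import json
import urllib.request

def dbfs_put_request(workspace_url: str, token: str, dbfs_path: str,
                     data: bytes) -> urllib.request.Request:
    """Build a DBFS upload request. The single-call form accepts
    base64 contents for small files; larger files should use the
    streaming create / add-block / close calls instead."""
    payload = json.dumps({
        "path": dbfs_path,
        "contents": base64.b64encode(data).decode(),
        "overwrite": True,
    }).encode()
    return urllib.request.Request(
        workspace_url.rstrip("/") + "/api/2.0/dbfs/put",
        data=payload,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )

# To send the upload:
# urllib.request.urlopen(dbfs_put_request(url, token, "/FileStore/tables/x.csv", b"a,b\n1,2\n"))
```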

  • Store API keys and credentials in secret scopes rather than in notebooks.
  • Use role-based access controls for secrets, and rotate credentials regularly.
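As a sketch of moving a credential out of a notebook and into a secret scope, the Secrets API exposes scope creation (POST /api/2.0/secrets/scopes/create) and secret storage (POST /api/2.0/secrets/put). The helpers below only build the requests; workspace URL and token are placeholders:

```python
import json
import urllib.request

def _secrets_request(workspace_url: str, token: str, path: str,
                     body: dict) -> urllib.request.Request:
    """Build a JSON POST request against a Secrets API path."""
    return urllib.request.Request(
        workspace_url.rstrip("/") + path,
        data=json.dumps(body).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )

def create_scope(workspace_url: str, token: str, scope: str):
    """Request to create a secret scope."""
    return _secrets_request(workspace_url, token,
                            "/api/2.0/secrets/scopes/create", {"scope": scope})

def put_secret(workspace_url: str, token: str, scope: str,
               key: str, value: str):
    """Request to store a string secret in a scope."""
    return _secrets_request(workspace_url, token, "/api/2.0/secrets/put",
                            {"scope": scope, "key": key, "string_value": value})
```

Inside a notebook, a stored secret is then read with dbutils.secrets.get(scope, key) instead of being hard-coded.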