Conversation

@dgokeeffe

Summary

Adds Databricks Foundation Model APIs as a new provider, enabling opencode users to connect to their Databricks workspace's pay-per-token LLM endpoints.

Fixes #7983

Changes

  • Provider implementation (provider.ts): Full Databricks provider with OpenAI-compatible endpoint support at /serving-endpoints
  • Auth guidance (auth.ts): Added Databricks to auth login flow with clear authentication instructions
  • Test cleanup (preload.ts): Clear Databricks env vars between tests
  • Unit tests (databricks.test.ts): 12 tests covering config parsing, auth precedence, URL handling, and model capabilities

Authentication Methods

Supports three auth methods (in priority order):

  1. PAT token via DATABRICKS_TOKEN or opencode auth login
  2. OAuth M2M via DATABRICKS_CLIENT_ID + DATABRICKS_CLIENT_SECRET
  3. Azure AD Service Principal via ARM_CLIENT_ID + ARM_CLIENT_SECRET + ARM_TENANT_ID
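The precedence above can be sketched roughly as follows. This is a hypothetical illustration only; the actual provider.ts is not reproduced in this conversation, so the type and function names here are assumptions:

```typescript
// A minimal sketch of the auth precedence described above; every
// identifier here is an assumption, not the PR's actual code.
type DatabricksAuth =
  | { kind: "pat"; token: string }
  | { kind: "oauth-m2m"; clientId: string; clientSecret: string }
  | { kind: "azure-sp"; clientId: string; clientSecret: string; tenantId: string }

type Env = Record<string, string | undefined>

function resolveAuth(env: Env): DatabricksAuth | undefined {
  // 1. Personal Access Token wins when present
  const token = env["DATABRICKS_TOKEN"]
  if (token) return { kind: "pat", token }

  // 2. OAuth machine-to-machine credentials
  const clientId = env["DATABRICKS_CLIENT_ID"]
  const clientSecret = env["DATABRICKS_CLIENT_SECRET"]
  if (clientId && clientSecret) return { kind: "oauth-m2m", clientId, clientSecret }

  // 3. Azure AD service principal
  const armId = env["ARM_CLIENT_ID"]
  const armSecret = env["ARM_CLIENT_SECRET"]
  const armTenant = env["ARM_TENANT_ID"]
  if (armId && armSecret && armTenant)
    return { kind: "azure-sp", clientId: armId, clientSecret: armSecret, tenantId: armTenant }

  return undefined
}
```

A precedence resolver like this is what the "auth precedence" unit tests mentioned below would exercise: set several credential families at once and assert the highest-priority one wins.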

Default Models

Includes default definitions for common Foundation Model API endpoints (Claude, Llama, GPT-5, Gemini). Users can add custom model endpoints via opencode.json.
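The exact opencode.json shape for a custom endpoint is not shown in this PR; as a rough illustration only (the field names below are assumptions, not the provider's confirmed schema), a custom Databricks serving endpoint might be registered like this:

```json
{
  "provider": {
    "databricks": {
      "models": {
        "my-custom-endpoint": {
          "name": "My Custom Endpoint"
        }
      }
    }
  }
}
```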

Verification

  • All 12 new tests pass: bun test packages/opencode/test/provider/databricks.test.ts
  • Tested locally with PAT authentication against a Databricks workspace

@github-actions
Contributor

The following comment was generated by an LLM and may be inaccurate:

No duplicate PRs found

Add Databricks Foundation Model APIs as a new provider supporting
OpenAI-compatible endpoints via /serving-endpoints.

Authentication methods (in priority order):
- Personal Access Token (DATABRICKS_TOKEN or stored auth)
- OAuth M2M (DATABRICKS_CLIENT_ID + DATABRICKS_CLIENT_SECRET)
- Azure AD Service Principal (ARM_CLIENT_ID/SECRET/TENANT_ID)
- Azure CLI (for Azure Databricks workspaces)

Includes default model definitions for Claude, Llama, GPT-5, and
Gemini models available through Databricks pay-per-token endpoints.

Fixes model transformation to use Provider.Model format with proper
capabilities and API configuration.

Add optional host field to API authentication schema to support
providers like Databricks that require both workspace URL and API key.

Update ProviderAuth.api to accept optional host parameter and modify
Databricks provider loader to prioritize stored auth host over config
or environment variables for workspace URL resolution.
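The resolution order described in this commit (stored auth host first, then config, then environment variables) can be sketched like this. `StoredAuth`, `resolveHost`, and the trailing-slash normalization are illustrative assumptions, not the PR's actual code:

```typescript
// Hypothetical sketch of workspace-URL resolution; identifiers are
// assumptions, not the PR's actual implementation.
interface StoredAuth {
  host?: string // the optional host field added to the API auth schema
  key: string
}

function resolveHost(
  stored: StoredAuth | undefined,
  configHost: string | undefined,
  env: Record<string, string | undefined>,
): string | undefined {
  // Stored auth host takes priority, then config, then environment
  const raw = stored?.host ?? configHost ?? env["DATABRICKS_HOST"]
  if (!raw) return undefined
  // Append the OpenAI-compatible base path, avoiding duplicate slashes
  return raw.replace(/\/+$/, "") + "/serving-endpoints"
}
```

Prioritizing the stored host means a workspace URL captured during `opencode auth login` keeps working even if config or environment later point elsewhere.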

Implement DatabricksApiMethod component that prompts users for both
workspace URL and Personal Access Token in sequence, eliminating the
need for environment variables or custom configuration files.

Enhance 'opencode auth login' to prompt for Databricks workspace URL
before API key, storing both values for complete authentication.

Update generated TypeScript SDK types to reflect optional host field
in API authentication schema.

- Make description prop a function in DatabricksApiMethod
- Add escape key support to cancel dialog
- Prevent default behavior on return key press
@dgokeeffe force-pushed the feat/databricks-provider branch from 0c9f889 to 7ae4802 on January 17, 2026 at 03:26
@cbcoutinho

cbcoutinho commented Jan 17, 2026

@dgokeeffe I am really looking forward to this PR landing, especially after seeing your post on LinkedIn regarding running opencode on a cluster via databricks ssh .... Great work!

I'm interested in running this locally, although without a Databricks PAT if possible. Can you provide a comment regarding auth via azure-cli or databricks-cli?


Development

Successfully merging this pull request may close these issues.

Support for Databricks Foundation Model APIs provider