Providers
Version: 0.4.0
Providers are DecisionBox's plugin system for external services. Instead of hardcoding support for specific LLMs, warehouses, or secret managers, DecisionBox defines interfaces and lets provider packages implement them.
The Pattern
All three provider types follow the same pattern:
- Interface — defined in `libs/go-common/` (e.g., `llm.Provider`)
- Registry — central map of name → factory function
- Registration — provider packages call `Register()` in their `init()` function
- Selection — services create providers by name at runtime
```go
// 1. Interface (libs/go-common/llm/provider.go)
type Provider interface {
    Chat(ctx context.Context, req ChatRequest) (*ChatResponse, error)
    Validate(ctx context.Context) error
}

// 2. Registry (libs/go-common/llm/registry.go)
func Register(name string, factory ProviderFactory) { ... }
func NewProvider(name string, cfg ProviderConfig) (Provider, error) { ... }

// 3. Registration (providers/llm/claude/provider.go)
func init() {
    llm.Register("claude", func(cfg llm.ProviderConfig) (llm.Provider, error) {
        return NewClaudeProvider(cfg["api_key"], cfg["model"])
    })
}

// 4. Selection (services/agent/main.go)
import _ "github.com/decisionbox-io/decisionbox/providers/llm/claude"

provider, err := llm.NewProvider("claude", llm.ProviderConfig{
    "api_key": apiKey,
    "model":   "claude-sonnet-4-20250514",
})
```
The blank import (`import _ "..."`) triggers the `init()` function, which registers the provider. The service then creates it by name.
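Internally, such a registry is little more than a package-level map guarded by a mutex. A minimal sketch follows (the real go-common implementation may differ; the interface is trimmed to `Validate` for brevity):

```go
import (
	"context"
	"fmt"
	"sync"
)

// Provider is trimmed to one method for this sketch; the real
// interface also carries Chat.
type Provider interface {
	Validate(ctx context.Context) error
}

type ProviderConfig map[string]string
type ProviderFactory func(cfg ProviderConfig) (Provider, error)

var (
	mu        sync.RWMutex
	factories = map[string]ProviderFactory{}
)

// Register stores a factory under a provider name; called from init()
// in each provider package.
func Register(name string, factory ProviderFactory) {
	mu.Lock()
	defer mu.Unlock()
	factories[name] = factory
}

// NewProvider looks up the factory by name and invokes it.
func NewProvider(name string, cfg ProviderConfig) (Provider, error) {
	mu.RLock()
	factory, ok := factories[name]
	mu.RUnlock()
	if !ok {
		return nil, fmt.Errorf("unknown provider %q", name)
	}
	return factory(cfg)
}
```

Because registration happens in `init()`, the map is fully populated before any service code runs, so lookup order never matters.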
Provider Metadata
Each provider registers metadata alongside its factory function. This metadata powers the dashboard's dynamic forms — no hardcoded provider lists.
```go
llm.RegisterWithMeta("claude", factory, llm.ProviderMeta{
    Name:        "Claude (Anthropic)",
    Description: "Anthropic Claude API - direct access",
    ConfigFields: []llm.ConfigField{
        {Key: "api_key", Label: "API Key", Required: true, Type: "string", Placeholder: "sk-ant-..."},
        {Key: "model", Label: "Model", Required: true, Type: "string", Default: "claude-sonnet-4-6"},
    },
    Models: []llm.ModelEntry{
        {
            ID:              "claude-opus-4-7",
            Aliases:         []string{"opus-4-7"},
            DisplayName:     "Claude Opus 4.7",
            Wire:            llm.WireAnthropic,
            MaxOutputTokens: 128000,
            Pricing:         llm.TokenPricing{InputPerMillion: 5.0, OutputPerMillion: 25.0},
        },
        {
            ID:              "claude-sonnet-4-6",
            Aliases:         []string{"sonnet-4-6"},
            DisplayName:     "Claude Sonnet 4.6",
            Wire:            llm.WireAnthropic,
            MaxOutputTokens: 64000,
            Pricing:         llm.TokenPricing{InputPerMillion: 3.0, OutputPerMillion: 15.0},
        },
    },
    DefaultMaxOutputTokens: 16384,
    SupportsTools:          true,
})
```
The API returns this metadata via `GET /api/v1/providers/llm` and `GET /api/v1/providers/warehouse`. The dashboard renders dynamic configuration forms from the `ConfigFields` array — no UI code changes needed when a new provider is added.
Each `ModelEntry` is the single source of truth for the model's wire format, output-token cap, list pricing, and dashboard display name. The `Aliases` field lets one entry be reached by many ID strings — used to capture every cloud-side variant of the same underlying model. On Bedrock, for example, a single Opus 4.7 entry has its canonical `anthropic.claude-opus-4-7-v1:0` plus 21 aliases covering every cross-region inference profile (`us.` / `eu.` / `apac.` / `jp.` / `au.` / `global.`), every Bedrock version-suffix variant (`-v1:0`, `-v1`, no suffix), and the family-only short forms (`claude-opus-4-7`, `opus-4-7`).
The agent uses `gollm.GetMaxOutputTokens(providerName, model)` to cap completions at the model's published output limit. The lookup walks the catalog, matching against ID and aliases; misses fall back to `DefaultMaxOutputTokens`, then to a global 8192 floor.
The dashboard combobox shows one row per canonical ID — aliases stay internal to the resolver so the picker isn't doubled.
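The fallback chain can be sketched directly from the catalog types (names simplified here; the real lookup lives in gollm):

```go
type ModelEntry struct {
	ID              string
	Aliases         []string
	MaxOutputTokens int
}

const globalTokenFloor = 8192

// maxOutputTokens resolves a model's output cap: exact ID match, then
// aliases, then the provider default, then the global floor.
func maxOutputTokens(models []ModelEntry, defaultMax int, model string) int {
	for _, m := range models {
		if m.ID == model {
			return m.MaxOutputTokens
		}
		for _, a := range m.Aliases {
			if a == model {
				return m.MaxOutputTokens
			}
		}
	}
	if defaultMax > 0 {
		return defaultMax
	}
	return globalTokenFloor
}
```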
Three Provider Types
LLM Providers
Interface: `llm.Provider` — two methods: `Chat(ctx, request)` → response and `Validate(ctx)` → error
Purpose: Send prompts to an AI model and get text responses.
Validate checks credentials and model access using lightweight API calls (e.g., list models) without consuming tokens.
Used by the "Test Connection" button in the dashboard.
| Provider | ID | Auth | Models |
|---|---|---|---|
| Anthropic Claude | claude | API key | claude-sonnet-4-6, claude-opus-4-6, claude-haiku-4-5 |
| OpenAI | openai | API key | gpt-5, gpt-4.1, gpt-4o, o3, o4-mini |
| Ollama | ollama | None (local) | Any model: llama3.1, qwen2.5, mistral, etc. |
| Google Vertex AI | vertex-ai | GCP ADC | Gemini, Claude, Llama/Qwen/DeepSeek/Mistral MaaS |
| AWS Bedrock | bedrock | AWS credentials | Claude, Qwen, DeepSeek, Mistral, Llama |
| Azure AI Foundry | azure-foundry | API key | Claude, GPT-5/4.1/4o, Mistral |
Model catalog and wire dispatch
Cloud providers (Bedrock, Vertex AI, Azure AI Foundry) serve many models behind a single endpoint — but different model families speak different wire formats. Each provider owns its catalog inline as ProviderMeta.Models []ModelEntry:
- Every shipped model is registered with its wire (`WireAnthropic`, `WireOpenAICompat`, or `WireGoogleNative`), max output tokens, list pricing, and aliases.
- Provider `Chat()` calls `meta.ResolveWire(model, wireOverride)`, which walks the catalog (canonical ID + aliases), then falls back to the project-level `llm.config.wire_override`, then to a per-provider `FamilyInferrer` (prefix table) for newly released models in known families. No pattern matching on model names in the dispatch path.
- Models that miss every tier return an actionable error naming the provider, the model, and the supported wires for that cloud.
Shared helpers live alongside the catalog:
- `libs/go-common/llm/openaicompat` — request/response types plus a typed `APIError` for any provider speaking the OpenAI `/chat/completions` wire. Used by `openai`, `azure-foundry` (OpenAI path), `bedrock` (non-Anthropic path), and `vertex-ai` (MaaS path).
Adding a new model to an existing cloud is a single `ModelEntry` in the provider's `catalog.go` — no provider code change. See Adding LLM Providers.
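The three dispatch tiers can be sketched as follows (function and constant names are illustrative, not the exact go-common API):

```go
import (
	"fmt"
	"strings"
)

type Wire string

const (
	WireAnthropic    Wire = "anthropic"
	WireOpenAICompat Wire = "openai-compat"
)

// resolveWire walks the dispatch tiers: catalog lookup (canonical ID and
// aliases flattened into one map), project-level override, then a
// per-provider family inferrer for newly released models.
func resolveWire(catalog map[string]Wire, override string, infer func(string) (Wire, bool), model string) (Wire, error) {
	if w, ok := catalog[model]; ok { // tier 1: catalog
		return w, nil
	}
	if override != "" { // tier 2: llm.config.wire_override
		return Wire(override), nil
	}
	if infer != nil { // tier 3: FamilyInferrer prefix table
		if w, ok := infer(model); ok {
			return w, nil
		}
	}
	return "", fmt.Errorf("model %q: unknown wire format; supported wires: anthropic, openai-compat", model)
}

// claudeFamily is a sample prefix inferrer for newly released Claude models.
func claudeFamily(model string) (Wire, bool) {
	if strings.HasPrefix(model, "claude-") {
		return WireAnthropic, true
	}
	return "", false
}
```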
Location: providers/llm/{provider-name}/
Config: Passed as `map[string]string`. Common fields:
- `api_key` — API key (Claude, OpenAI)
- `model` — Model identifier
- `timeout_seconds` — Per-call timeout (default: 300)
- Provider-specific: `project_id` + `location` (Vertex AI), `region` (Bedrock), `host` (Ollama)
See Adding LLM Providers to implement your own.
Warehouse Providers
Interface: warehouse.Provider — Query execution, table listing, schema inspection.
Purpose: Execute SQL queries against a data warehouse (read-only).
| Provider | ID | Auth Methods | SQL Dialect |
|---|---|---|---|
| Google BigQuery | bigquery | ADC, Service Account Key | BigQuery Standard SQL |
| Amazon Redshift | redshift | IAM Role, Access Keys, Assume Role | PostgreSQL-compatible |
| Snowflake | snowflake | Password, Key Pair (JWT) | Snowflake SQL |
| PostgreSQL | postgres | Password, Connection String | PostgreSQL |
| Databricks | databricks | PAT, OAuth M2M | Databricks SQL |
| Microsoft SQL Server | mssql | SQL Login, Connection String | T-SQL (SQL Server 2016+, Azure SQL) |
Location: providers/warehouse/{provider-name}/
Interface methods:
| Method | Purpose |
|---|---|
| `Query(ctx, sql, params)` | Execute a SQL query, return rows |
| `ListTables(ctx)` | List all tables in the default dataset |
| `ListTablesInDataset(ctx, dataset)` | List tables in a specific dataset |
| `GetTableSchema(ctx, table)` | Get column names, types, nullability |
| `GetTableSchemaInDataset(ctx, dataset, table)` | Get the schema for a specific dataset.table |
| `GetDataset()` | Return the default dataset name |
| `SQLDialect()` | Return a SQL dialect description (rendered into exploration / verification prompts via the {{DIALECT}} placeholder so the LLM emits dialect-correct SQL on the first try). |
| `QuoteRef(parts...)` | Return a dialect-correct fully-qualified identifier (e.g. `` `dataset`.`table` `` for BigQuery / Databricks, "dataset"."table" for PostgreSQL / Redshift / Snowflake, [dataset].[table] for SQL Server). Used by the orchestrator to render {{REF:tablename}} placeholders in prompts, by schema_discovery.go's fallback sample query, and by the insight validator to render example table refs. Each part is quoted individually with the dialect's native delimiter; the helper `warehouse.QuotePartsWith(open, close, parts)` colocated in provider.go is the recommended implementation. |
| `SQLFixPrompt()` | Return warehouse-specific SQL fix instructions. The template must declare the {{DATASET}}, {{ORIGINAL_SQL}}, {{ERROR_MESSAGE}}, {{SCHEMA_INFO}}, {{FILTER}}, and {{CONVERSATION_HISTORY}} placeholders, plus a conditional {{#VERIFICATION_CONTEXT}}…{{/VERIFICATION_CONTEXT}} block carrying any warehouse-specific phrasing of the column-grounding rule (the section is stripped from the rendered prompt when the validator-side fixer call passes empty FixOpts). The provider's provider_test.go should assert all these markers are present so a missed template never silently strips column grounding for that warehouse. |
| `ValidateReadOnly(ctx)` | Verify read-only access works |
| `HealthCheck(ctx)` | Check warehouse connectivity |
| `Close()` | Clean up connections |
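The `QuoteRef` behavior described above can be sketched with the `QuotePartsWith` helper from the table (its signature is assumed from the description):

```go
import "strings"

// QuotePartsWith quotes each identifier part with the dialect's
// delimiters and joins the parts with dots.
func QuotePartsWith(open, close string, parts ...string) string {
	quoted := make([]string, len(parts))
	for i, p := range parts {
		quoted[i] = open + p + close
	}
	return strings.Join(quoted, ".")
}

// BigQueryProvider stands in for a real provider; its QuoteRef simply
// delegates to the helper with backtick delimiters.
type BigQueryProvider struct{}

func (BigQueryProvider) QuoteRef(parts ...string) string {
	return QuotePartsWith("`", "`", parts...)
}
```

A PostgreSQL provider would pass `"` for both delimiters, and a SQL Server provider `[` and `]`, so the per-dialect code stays one line.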
Optional interface: warehouse.CostEstimator — For providers that support dry-run cost estimation (BigQuery does, Redshift partially).
```go
type CostEstimator interface {
    DryRun(ctx context.Context, query string) (*DryRunResult, error)
}
```
See Adding Warehouse Providers to implement your own.
Secret Providers
Interface: secrets.Provider — Get, Set, List secrets (no Delete).
Purpose: Store and retrieve encrypted per-project secrets (API keys, credentials).
| Provider | ID | Auth | Storage |
|---|---|---|---|
| MongoDB (default) | mongodb | Encryption key env var | Encrypted MongoDB collection |
| Google Cloud | gcp | GCP ADC | GCP Secret Manager |
| AWS | aws | AWS credentials | AWS Secrets Manager |
| Azure | azure | Azure DefaultAzureCredential | Azure Key Vault |
Location: providers/secrets/{provider-name}/
Interface methods:
| Method | Purpose |
|---|---|
| `Get(ctx, projectID, key)` | Retrieve a secret value |
| `Set(ctx, projectID, key, value)` | Create or update a secret |
| `List(ctx, projectID)` | List secret keys with masked values |
Design decisions:
- No Delete — secrets are removed manually via cloud console, CLI, or direct database access. This is intentional: it prevents accidental deletion via the API.
- Per-project scoping — all secrets are namespaced as `{namespace}/{projectID}/{key}`. The namespace (default: `decisionbox`) prevents conflicts in shared cloud accounts.
- Masked listing — `List()` returns keys with masked values (first 6 + last 4 characters with `***` in between). Full values are never returned via the API.
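The masking rule can be sketched as follows (the length cutoff for short values is an assumption, not documented behavior):

```go
// maskSecret keeps the first 6 and last 4 characters with *** between;
// values too short to mask safely are fully redacted (assumed behavior).
func maskSecret(v string) string {
	if len(v) <= 10 {
		return "***"
	}
	return v[:6] + "***" + v[len(v)-4:]
}
```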
See Adding Secret Providers to implement your own.
How Services Use Providers
Agent
The agent imports all providers and selects based on project configuration:
```go
// services/agent/main.go

// Import all providers (triggers init() registration)
import (
    _ "github.com/decisionbox-io/decisionbox/providers/llm/claude"
    _ "github.com/decisionbox-io/decisionbox/providers/llm/openai"
    _ "github.com/decisionbox-io/decisionbox/providers/llm/ollama"
    _ "github.com/decisionbox-io/decisionbox/providers/llm/vertex-ai"
    _ "github.com/decisionbox-io/decisionbox/providers/llm/bedrock"
    _ "github.com/decisionbox-io/decisionbox/providers/llm/azure-foundry"
    _ "github.com/decisionbox-io/decisionbox/providers/warehouse/bigquery"
    _ "github.com/decisionbox-io/decisionbox/providers/warehouse/redshift"
    _ "github.com/decisionbox-io/decisionbox/providers/secrets/mongodb"
    _ "github.com/decisionbox-io/decisionbox/providers/secrets/gcp"
    _ "github.com/decisionbox-io/decisionbox/providers/secrets/aws"
    _ "github.com/decisionbox-io/decisionbox/providers/secrets/azure"
)

// Create providers from project config
secretProv, _ := secrets.NewProvider(secretsCfg)
apiKey, _ := secretProv.Get(ctx, projectID, "llm-api-key")

llmProv, _ := llm.NewProvider(project.LLM.Provider, llm.ProviderConfig{
    "api_key": apiKey,
    "model":   project.LLM.Model,
})

whProv, _ := warehouse.NewProvider(project.Warehouse.Provider, warehouse.ProviderConfig{
    "project_id": project.Warehouse.ProjectID,
    "dataset":    project.Warehouse.Datasets[0],
})
```
API
The API imports providers for two reasons:
- Metadata — Returns provider lists with config fields for the dashboard
- Pricing — Seeds default pricing from provider registrations
```go
// services/api/main.go
import (
    _ "github.com/decisionbox-io/decisionbox/providers/llm/claude"
    // ... same imports as agent
)

// GET /api/v1/providers/llm returns:
// [
//   { "id": "claude", "name": "Claude (Anthropic)",
//     "config_fields": [{"key": "api_key", ...}, {"key": "model", ...}],
//     "default_pricing": {"claude-sonnet-4": {"input_per_million": 3.0, ...}} },
//   { "id": "openai", ... },
//   ...
// ]
```
Middleware Hooks
In addition to providers, DecisionBox exposes two middleware registration hooks for wrapping existing functionality:
Warehouse Middleware
Wraps the warehouse Provider with custom logic that intercepts calls like ListTables, GetTableSchema, and Query.
```go
warehouse.RegisterMiddleware("my-plugin", func(p warehouse.Provider) warehouse.Provider {
    return &myWrapper{inner: p}
})
```
The agent applies all registered middleware after creating the warehouse provider via warehouse.ApplyMiddleware(provider).
Use cases: query logging, access controls, cost tracking, result redaction.
HTTP Middleware
Wraps all API HTTP requests with custom handlers.
```go
apiserver.RegisterGlobalMiddleware(func(next http.Handler) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        // pre-processing
        next.ServeHTTP(w, r)
        // post-processing
    })
})
```
The API server applies all registered middleware via apiserver.ApplyGlobalMiddlewares(handler).
Use cases: request audit logging, custom authentication, route interception.
Project Context
Warehouse middleware can access the project ID via context helpers:
```go
// Set by the agent before calling provider methods
ctx = warehouse.WithProjectID(ctx, projectID)

// Read by middleware in provider method implementations
projectID := warehouse.ProjectIDFromContext(ctx)
```
Custom Builds
To use middleware hooks, create a custom binary that imports your middleware package and calls the exported Run() function:
```go
package main

import (
    _ "my-org/my-plugin" // registers middleware via init()

    "github.com/decisionbox-io/decisionbox/services/agent/agentserver"
)

func main() {
    agentserver.Run() // or apiserver.Run() for the API
}
```
Adding a New Provider
To add support for a new LLM, warehouse, or secret manager:
- Create a package in the appropriate `providers/` directory
- Implement the interface
- Register with metadata in `init()`
- Import in the agent and API `main.go`
- Write tests
No other code changes needed — the dashboard automatically shows the new provider in dropdowns and renders config forms from metadata.
See the implementation guides:
Next Steps
- Prompts — Template variables and prompt customization
- Domain Packs — How domain-specific analysis works
- Architecture — System overview