Crush supports multiple LLM providers including Anthropic, OpenAI, Google Gemini, AWS Bedrock, Azure, and many more. You can configure providers using environment variables or configuration files.
Documentation Index
Fetch the complete documentation index at: https://mintlify.com/charmbracelet/crush/llms.txt
Use this file to discover all available pages before exploring further.
Quick Start with Environment Variables
The quickest way to get started is to set an API key for your preferred provider. Crush recognizes these environment variables:

| Environment Variable | Provider |
|---|---|
| `ANTHROPIC_API_KEY` | Anthropic |
| `OPENAI_API_KEY` | OpenAI |
| `VERCEL_API_KEY` | Vercel AI Gateway |
| `GEMINI_API_KEY` | Google Gemini |
| `SYNTHETIC_API_KEY` | Synthetic |
| `ZAI_API_KEY` | Z.ai |
| `MINIMAX_API_KEY` | MiniMax |
| `HF_TOKEN` | Hugging Face Inference |
| `CEREBRAS_API_KEY` | Cerebras |
| `OPENROUTER_API_KEY` | OpenRouter |
| `IONET_API_KEY` | io.net |
| `GROQ_API_KEY` | Groq |
| `VERTEXAI_PROJECT` | Google Cloud VertexAI (Gemini) |
| `VERTEXAI_LOCATION` | Google Cloud VertexAI (Gemini) |
| `AWS_ACCESS_KEY_ID` | Amazon Bedrock (Claude) |
| `AWS_SECRET_ACCESS_KEY` | Amazon Bedrock (Claude) |
| `AWS_REGION` | Amazon Bedrock (Claude) |
| `AWS_PROFILE` | Amazon Bedrock (Custom Profile) |
| `AWS_BEARER_TOKEN_BEDROCK` | Amazon Bedrock |
| `AZURE_OPENAI_API_ENDPOINT` | Azure OpenAI models |
| `AZURE_OPENAI_API_KEY` | Azure OpenAI models (optional when using Entra ID) |
| `AZURE_OPENAI_API_VERSION` | Azure OpenAI models |
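For example, to use Anthropic you could export the key before launching Crush (the key value below is a placeholder):

```shell
# Placeholder key; Crush picks it up automatically at startup.
export ANTHROPIC_API_KEY="sk-ant-example"
# Then launch Crush:
# crush
```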
Custom Provider Configuration
For advanced scenarios, you can configure custom providers in your `crush.json` file.
OpenAI-Compatible Providers
Crush supports two OpenAI provider types:
- `openai` - For proxying/routing requests through OpenAI
- `openai-compat` - For non-OpenAI providers with OpenAI-compatible APIs
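As an illustration, a `crush.json` entry for an OpenAI-compatible provider might look like the sketch below; apart from the `openai-compat` type mentioned above, the provider name, URL, and field names are assumptions, not values confirmed by this page:

```json
{
  "providers": {
    "myprovider": {
      "type": "openai-compat",
      "base_url": "https://api.example.com/v1",
      "api_key": "$MYPROVIDER_API_KEY",
      "models": [
        {
          "id": "example-model",
          "name": "Example Model"
        }
      ]
    }
  }
}
```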
Anthropic-Compatible Providers
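For providers that use the Anthropic API format, a `crush.json` entry might look like the sketch below; the `anthropic` type value, URL, and field names are assumptions, not confirmed by this page:

```json
{
  "providers": {
    "my-anthropic-compat": {
      "type": "anthropic",
      "base_url": "https://api.example.com",
      "api_key": "$MY_ANTHROPIC_COMPAT_API_KEY",
      "models": [
        {
          "id": "example-model",
          "name": "Example Model"
        }
      ]
    }
  }
}
```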
Anthropic-format providers are declared in `crush.json` just like OpenAI-compatible ones.
Local Models
You can run Crush with local models using OpenAI-compatible servers.
Ollama
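For instance, an Ollama server on its default port (11434) exposes an OpenAI-compatible API under `/v1`; the provider entry and model below are illustrative:

```json
{
  "providers": {
    "ollama": {
      "type": "openai-compat",
      "base_url": "http://localhost:11434/v1",
      "models": [
        {
          "id": "llama3.3",
          "name": "Llama 3.3"
        }
      ]
    }
  }
}
```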
LM Studio
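LM Studio's local server also speaks the OpenAI API, by default on port 1234; again, the provider entry and model name below are illustrative:

```json
{
  "providers": {
    "lmstudio": {
      "type": "openai-compat",
      "base_url": "http://localhost:1234/v1",
      "models": [
        {
          "id": "qwen2.5-7b-instruct",
          "name": "Qwen 2.5 7B Instruct"
        }
      ]
    }
  }
}
```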
Cloud Providers
Amazon Bedrock
Crush supports running Anthropic models through Bedrock (caching disabled):
- Configure AWS credentials: `aws configure`
- Set the `AWS_REGION` or `AWS_DEFAULT_REGION` environment variable
- (Optional) Use a specific profile: `AWS_PROFILE=myprofile crush`
- Alternative: Set `AWS_BEARER_TOKEN_BEDROCK` instead of running `aws configure`
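The environment-variable route can be sketched like this; the region and profile values are placeholders:

```shell
# Placeholder region and profile; substitute your own values.
export AWS_REGION="us-east-1"
export AWS_PROFILE="myprofile"
# Then launch Crush:
# crush
```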
Vertex AI Platform
Vertex AI appears when `VERTEXAI_PROJECT` and `VERTEXAI_LOCATION` are set:
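For example (both values are placeholders for your own project ID and region):

```shell
# Placeholder project and region; substitute your own values.
export VERTEXAI_PROJECT="my-project-id"
export VERTEXAI_LOCATION="us-central1"
# Then launch Crush:
# crush
```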
Model Configuration Options
When defining models in your provider configuration, you can specify:
- `id` (required) - Model identifier used by the provider API
- `name` (required) - Human-readable model name
- `cost_per_1m_in` - Cost per 1M input tokens
- `cost_per_1m_out` - Cost per 1M output tokens
- `cost_per_1m_in_cached` - Cost per 1M cached input tokens
- `cost_per_1m_out_cached` - Cost per 1M cached output tokens
- `context_window` - Maximum context window size
- `default_max_tokens` - Default maximum tokens for responses
- `can_reason` - Whether the model supports extended reasoning
- `supports_attachments` - Whether the model supports file attachments
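Putting these options together, a single model definition might look like the sketch below; the provider name, model id, and all cost figures are illustrative, not real pricing:

```json
{
  "providers": {
    "myprovider": {
      "models": [
        {
          "id": "example-model",
          "name": "Example Model",
          "cost_per_1m_in": 3.0,
          "cost_per_1m_out": 15.0,
          "cost_per_1m_in_cached": 0.3,
          "cost_per_1m_out_cached": 1.5,
          "context_window": 200000,
          "default_max_tokens": 8192,
          "can_reason": false,
          "supports_attachments": true
        }
      ]
    }
  }
}
```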
Provider Auto-Updates
By default, Crush automatically updates the provider database from Catwalk, the open source Crush provider database.
Disabling Auto-Updates
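You can switch auto-updates off in `crush.json`; the option name below is an assumption based on Crush's README and may change:

```json
{
  "options": {
    "disable_provider_auto_update": true
  }
}
```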
This is especially useful in air-gapped or restricted environments.
Manual Updates
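A manual refresh might look like this; the subcommand name is an assumption based on Crush's README, so check `crush --help`:

```shell
# Assumed subcommand; fetches the latest provider definitions.
crush update-providers
```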
Manual updates fetch the latest provider definitions from Catwalk.
Environment Variable Expansion
In configuration files, you can reference environment variables using the `$VARIABLE_NAME` syntax:
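For example, pointing a hypothetical provider's API key at an environment variable (the provider and field names are illustrative):

```json
{
  "providers": {
    "myprovider": {
      "api_key": "$MYPROVIDER_API_KEY"
    }
  }
}
```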
Next Steps
- Model Selection - Learn how to list and select models
- Permissions - Configure tool permissions