OpenRouter Config Skill
Description
This skill helps users configure and enable OpenRouter as the AI provider in Claude Code. It lets you select your preferred OpenRouter model, sets up your API key, configures the Anthropic-compatible API endpoint, and ensures Claude Code uses OpenRouter exclusively, with provider failover for reliability.
Documentation Source
Based on official OpenRouter documentation: https://openrouter.ai/docs/guides/claude-code-integration
Why Use OpenRouter with Claude Code?
1. Provider Failover for High Availability
Anthropic's API occasionally experiences outages or rate limiting. When you route Claude Code through OpenRouter, your requests automatically fail over between multiple Anthropic providers. If one provider is unavailable or rate-limited, OpenRouter seamlessly routes to another, keeping your coding sessions uninterrupted.
2. Access to 100+ Models
OpenRouter gives you access to models from multiple providers including:
- Anthropic: Claude 3.5 Sonnet, Claude 3 Opus, Claude 3 Haiku
- OpenAI: GPT-4, GPT-4 Turbo, GPT-3.5
- Google: Gemini Pro, Gemini Flash
- Meta: Llama 2, Llama 3
- Mistral: Mixtral, Mistral Large
- And many more: xAI, Perplexity, Cohere, etc.
3. Organizational Budget Controls
For teams and organizations, OpenRouter provides centralized budget management. You can set spending limits, allocate credits across team members, and prevent unexpected cost overruns.
4. Usage Visibility and Analytics
OpenRouter gives you complete visibility into how Claude Code is being used across your team. Track usage patterns, monitor costs in real-time, and understand which projects or team members are consuming the most resources. All of this data is available in your OpenRouter Activity Dashboard.
Usage
To use this skill, simply ask Claude Code to configure OpenRouter. The skill will:
- Ask you which OpenRouter model you want to use
- Provide a link to browse available models
- Guide you to copy the model name from OpenRouter's model catalog
- Set up your OpenRouter API key
- Configure Anthropic-compatible environment variables
- Ensure Claude Code uses OpenRouter exclusively
- Set up proper provider priority
- Verify the configuration is working
Prerequisites
- An OpenRouter API key (get one from https://openrouter.ai/keys)
- Claude Code installed
Model Selection
Step 1: Browse Available Models
Visit the OpenRouter models catalog to choose your preferred model:
- OpenRouter Models: https://openrouter.ai/models
Step 2: Find and Copy Your Model
Browse through the available models and click on any model to see:
- Model name (click to copy)
- Pricing (input/output tokens)
- Context length
- Features (function calling, vision, etc.)
- Provider information
Popular Model Examples:
| Model | Provider | Best For |
|---|---|---|
| `anthropic/claude-3.5-sonnet` | Anthropic | General coding, reasoning |
| `anthropic/claude-3.5-sonnet:beta` | Anthropic | Latest Claude features |
| `openai/gpt-4-turbo` | OpenAI | Fast, cost-effective |
| `google/gemini-pro-1.5` | Google | Long context |
| `meta-llama/llama-3.1-70b-instruct:free` | Meta | Free tier available |
| `mistralai/mistral-large` | Mistral | Multilingual |
Step 3: Configure Your Chosen Model
The skill will ask you to paste the model name you selected. You can also change the model later by editing your Claude Code settings.
Configuration Steps
Step 1: Install Claude Code (if not already installed)
# macOS, Linux, WSL:
curl -fsSL https://claude.ai/install.sh | bash
# Windows PowerShell:
irm https://claude.ai/install.ps1 | iex
Step 2: Configure Environment Variables
The skill will help you set these environment variables in your shell profile:
# Add to ~/.zshrc (or ~/.bashrc for Bash, ~/.config/fish/config.fish for Fish)
export OPENROUTER_API_KEY="<your-openrouter-api-key>"
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY="" # Important: Must be explicitly empty
Important Notes:
- Do NOT put these in a project-level `.env` file; the native Claude Code installer does not read standard `.env` files
- You must explicitly blank out `ANTHROPIC_API_KEY` to prevent conflicts
- If you previously logged in to Claude Code with Anthropic, run `/logout` in a Claude Code session to clear cached credentials
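After updating your profile, a quick check in a new shell confirms the variables are wired up as intended. This is a minimal sketch; the key value below is a placeholder standing in for your real exports:

```shell
# Placeholder values stand in for the real exports from your shell profile.
export OPENROUTER_API_KEY="sk-or-example-key"
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY=""   # must be set AND empty, not merely unset

# The auth token should mirror your OpenRouter key...
[ "$ANTHROPIC_AUTH_TOKEN" = "$OPENROUTER_API_KEY" ] && echo "auth token OK"
# ...and ANTHROPIC_API_KEY should exist but be blank.
[ -n "${ANTHROPIC_API_KEY+set}" ] && [ -z "$ANTHROPIC_API_KEY" ] && echo "api key blanked OK"
```

If either check prints nothing, re-source your profile (or open a fresh terminal) before starting Claude Code.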
Step 3: Configure Your Model
The skill will add your chosen model to your Claude Code settings at ~/.claude/settings.json:
{
"provider": {
"baseUrl": "https://openrouter.ai/api",
"apiKey": "${ANTHROPIC_AUTH_TOKEN}",
"defaultModel": "your-chosen-model-name"
}
}
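If you prefer to script this step, a here-doc can write the settings file directly. This sketch overwrites any existing `~/.claude/settings.json` (merge by hand if you have other settings), and the model name is just an example:

```shell
MODEL="anthropic/claude-3.5-sonnet"        # example; paste your chosen model
SETTINGS="$HOME/.claude/settings.json"
mkdir -p "$(dirname "$SETTINGS")"
# WARNING: this overwrites the file rather than merging into it.
cat > "$SETTINGS" <<EOF
{
  "provider": {
    "baseUrl": "https://openrouter.ai/api",
    "apiKey": "\${ANTHROPIC_AUTH_TOKEN}",
    "defaultModel": "$MODEL"
  }
}
EOF
echo "wrote $SETTINGS"
```

The backslash in `\${ANTHROPIC_AUTH_TOKEN}` keeps the placeholder literal in the JSON so Claude Code, not your shell, resolves it.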
Step 4: Start Claude Code
cd /path/to/your/project
claude
Step 5: Verify Configuration
Run the /status command inside Claude Code:
/status
Expected output:
Auth token: ANTHROPIC_AUTH_TOKEN
Anthropic base URL: https://openrouter.ai/api
You can also check the OpenRouter Activity Dashboard to see your requests appearing in real-time.
API Configuration
Base URL
https://openrouter.ai/api
Authentication
Use your OpenRouter API key as the auth token.
Model Selection
Choose your model from the OpenRouter model catalog:
- Browse Models: https://openrouter.ai/models
- Click on any model to see details and copy the model name
- Paste the model name when prompted by this skill
Provider Priority
Important: Claude Code with OpenRouter is only guaranteed to work with the Anthropic first-party provider. For maximum compatibility, we recommend setting Anthropic 1P as the top priority provider when using Claude Code.
Supported Models
OpenRouter supports 100+ models from various providers. Browse the full catalog:
- OpenRouter Models: https://openrouter.ai/models
Popular Anthropic Models:
- `anthropic/claude-3.5-sonnet` - Latest, most capable
- `anthropic/claude-3-opus` - Best for complex tasks
- `anthropic/claude-3-haiku` - Fast, cost-effective
Popular Alternative Models:
- `openai/gpt-4-turbo` - GPT-4 with Turbo speed
- `google/gemini-pro-1.5` - Long context (up to 1M tokens)
- `meta-llama/llama-3.1-70b-instruct:free` - Free Llama 3.1 model
- `mistralai/mistral-large` - Strong multilingual capabilities
How It Works
Direct Connection
When you set ANTHROPIC_BASE_URL to https://openrouter.ai/api, Claude Code speaks its native protocol directly to OpenRouter. No local proxy server is required.
Anthropic Skin
OpenRouter's "Anthropic Skin" behaves exactly like the Anthropic API. It handles model mapping and passes through advanced features like "Thinking" blocks and native tool use.
Model Routing
When you specify a model name from the OpenRouter catalog, OpenRouter routes your request to the appropriate provider automatically, handling all API differences transparently.
Billing
You are billed using your OpenRouter credits. Usage (including reasoning tokens) appears in your OpenRouter dashboard.
Security
The skill handles your API key securely, storing it in your shell profile with appropriate file permissions.
Privacy Note: OpenRouter does not log your source code prompts unless you explicitly opt-in to prompt logging in your account settings. See their Privacy Policy for details.
Changing Models
You can change your model at any time by:
Option 1: Update Claude Code Settings
Edit ~/.claude/settings.json:
{
"provider": {
"baseUrl": "https://openrouter.ai/api",
"apiKey": "${ANTHROPIC_AUTH_TOKEN}",
"defaultModel": "new-model-name"
}
}
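A one-line `sed` substitution can swap the model without opening an editor. The sketch below runs against a scratch copy so nothing real is modified; point it at `~/.claude/settings.json` once you are happy with the result. The replacement model name is an example:

```shell
# Demo on a temp file so the real settings are untouched.
tmp=$(mktemp)
printf '%s\n' '{ "provider": { "defaultModel": "anthropic/claude-3.5-sonnet" } }' > "$tmp"
# Replace whatever follows "defaultModel": with the new name.
sed -E 's#("defaultModel": *")[^"]*#\1openai/gpt-4-turbo#' "$tmp"
# prints: { "provider": { "defaultModel": "openai/gpt-4-turbo" } }
```

Using `#` as the `sed` delimiter avoids clashing with the `/` in model names. For anything beyond this simple layout, prefer a JSON-aware tool over regex substitution.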
Option 2: Use the Skill Again
Run the configuration process again and select a different model when prompted.
Option 3: Per-Session Model Selection
In Claude Code, you can sometimes specify models inline:
/model openrouter:your-model-name
Advanced Features
Cost Tracking Statusline
You can add a custom statusline to Claude Code that tracks your OpenRouter API costs in real-time. The statusline displays provider, model, cumulative cost, and cache discounts for your session.
Download the statusline scripts from the openrouter-examples repository, make them executable, and add the following to your ~/.claude/settings.json:
{
"statusLine": {
"type": "command",
"command": "/path/to/statusline.sh"
}
}
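For a feel of what such a script does: Claude Code pipes session JSON to the statusline command on stdin and renders whatever the command prints. The field names used below (`model.display_name`, `cost.total_cost_usd`) are assumptions for illustration; check the downloaded scripts for the real schema. Requires `jq`:

```shell
# Minimal statusline sketch: read session JSON on stdin, print "model | $cost".
# NOTE: field names are assumed for illustration, not the official schema.
statusline() {
  jq -r '"\(.model.display_name // "model?") | $\(.cost.total_cost_usd // 0)"'
}

# Feed a fake session payload through it:
echo '{"model":{"display_name":"claude-3.5-sonnet"},"cost":{"total_cost_usd":0.42}}' | statusline
# prints: claude-3.5-sonnet | $0.42
```

The `// 0` fallbacks keep the line rendering even when a field is missing from the payload.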
GitHub Action Integration
You can use OpenRouter with the official Claude Code GitHub Action. To adapt the example workflow for OpenRouter, make two changes to the action step:
- Pass your OpenRouter API key via `anthropic_api_key` (store it as a GitHub secret named `OPENROUTER_API_KEY`)
- Set the `ANTHROPIC_BASE_URL` environment variable to `https://openrouter.ai/api`
Example:
- name: Run Claude Code
  uses: anthropics/claude-code-action@v1
  with:
    anthropic_api_key: ${{ secrets.OPENROUTER_API_KEY }}
  env:
    ANTHROPIC_BASE_URL: https://openrouter.ai/api
Agent SDK Integration
The Anthropic Agent SDK lets you build AI agents programmatically using Python or TypeScript. Since the Agent SDK uses Claude Code as its runtime, you can connect it to OpenRouter using the same environment variables described above.
Troubleshooting
Model Not Found Errors
- Verify the model name is copied exactly from OpenRouter's model catalog
- Check that the model is currently available on OpenRouter
- Some models may have different naming conventions (e.g., `:free` or `:beta` suffixes)
Auth Errors
- Ensure `ANTHROPIC_API_KEY` is set to an empty string (`""`)
- If it is unset (null), Claude Code might fall back to its default behavior and try to authenticate with Anthropic servers
Context Length Errors
- If you hit context limits, consider switching to a model with a larger context window
- Break your task into smaller chunks or start a new session
- Check the model's context limit at https://openrouter.ai/models
Previous Anthropic Login
- If you were previously logged in to Claude Code with Anthropic, run `/logout` in a Claude Code session to clear cached credentials before the OpenRouter configuration takes effect
Connection Issues
- Verify your OpenRouter API key is correct
- Check that environment variables are set in your shell profile (not just in current session)
- Ensure you've restarted your terminal after updating your shell profile
- Run `/status` to verify configuration
Related Skills
- API Key Management
- Provider Configuration
- Model Selection
- Cost Tracking
Example Commands
- "Configure OpenRouter with my API key"
- "Set up OpenRouter as my AI provider for Claude Code"
- "Enable OpenRouter in Claude Code"
- "Configure OpenRouter with Claude 3.5 Sonnet"
- "I want to use GPT-4 through OpenRouter"
- "Help me select and configure an OpenRouter model"
Notes
- This skill is designed specifically for Claude Code
- It requires write access to your shell profile (~/.bashrc, ~/.zshrc, or ~/.config/fish/config.fish)
- The skill ensures OpenRouter is used exclusively with Anthropic 1P as top priority
- You can browse and select from 100+ models at https://openrouter.ai/models
- You can change your model at any time by editing `~/.claude/settings.json` or running this skill again
- You can revert to using Anthropic directly by removing these environment variables and running `/logout`
- Always refer to the official OpenRouter documentation for the most up-to-date information