OpenRouter Config Skill
Description
This skill helps you configure and enable OpenRouter as the AI provider in Claude Code. It prompts you to select your preferred OpenRouter model, sets up your API key, configures Anthropic-compatible API endpoints, and ensures Claude Code uses OpenRouter exclusively, with proper failover capabilities.
Documentation Source
Based on official OpenRouter documentation: https://openrouter.ai/docs/guides/claude-code-integration
Why Use OpenRouter with Claude Code?
1. Provider Failover for High Availability
Anthropic's API occasionally experiences outages or rate limiting. When you route Claude Code through OpenRouter, your requests automatically fail over between multiple Anthropic providers. If one provider is unavailable or rate-limited, OpenRouter seamlessly routes to another, keeping your coding sessions uninterrupted.
2. Access to 100+ Models
OpenRouter gives you access to models from multiple providers including:
- Anthropic: the Claude family
- OpenAI: the GPT and o-series families
- Google: Gemini models
- Meta: the Llama family
- Mistral: Mistral Large, Mixtral
- And many more: xAI, Perplexity, Cohere, DeepSeek, Qwen, etc.
3. Organizational Budget Controls
For teams and organizations, OpenRouter provides centralized budget management. You can set spending limits, allocate credits across team members, and prevent unexpected cost overruns.
4. Usage Visibility and Analytics
OpenRouter gives you complete visibility into how Claude Code is being used across your team. Track usage patterns, monitor costs in real-time, and understand which projects or team members are consuming the most resources. All of this data is available in your OpenRouter Activity Dashboard.
Usage
To use this skill, simply ask Claude Code to configure OpenRouter. The skill will:
- Ask you which OpenRouter model you want to use
- Provide a link to browse available models
- Guide you to copy the model name from OpenRouter's model catalog
- Set up your OpenRouter API key
- Configure Anthropic-compatible environment variables
- Ensure Claude Code uses OpenRouter exclusively
- Set up proper provider priority
- Verify the configuration is working
Prerequisites
- An OpenRouter API key (get one from https://openrouter.ai/keys)
- Claude Code installed
Model Selection
Step 1: Browse Available Models
Visit the OpenRouter models catalog to choose your preferred model:
- OpenRouter Models: https://openrouter.ai/models
Step 2: Find and Copy Your Model
Browse through the available models and click on any model to see:
- Model name (click to copy)
- Pricing (input/output tokens)
- Context length
- Features (function calling, vision, etc.)
- Provider information
Model Selection Tips:
Visit the live model catalog: https://openrouter.ai/models
On the model catalog page, you can:
- Filter by provider: Anthropic, OpenAI, Google, Meta, Mistral, DeepSeek, Qwen, Perplexity, and more
- Compare pricing: See input/output token costs for each model
- Check context length: View max tokens supported
- Filter by features: Function calling, vision, free tier, etc.
- View popularity: See what other developers are using
- Click to copy: Easily copy the exact model ID you want to use
To select a model:
- Visit https://openrouter.ai/models
- Browse and filter models by your needs (provider, features, price, etc.)
- Click on any model to see detailed information
- Click the model name to copy it to your clipboard
- Paste the model name when prompted by this skill
Step 3: Configure Your Chosen Model
The skill will ask you to paste the model name you selected. You can also change the model later by editing your Claude Code settings.
Configuration Steps
Step 1: Install Claude Code (if not already installed)
# macOS, Linux, WSL:
curl -fsSL https://claude.ai/install.sh | bash
# Windows PowerShell:
irm https://claude.ai/install.ps1 | iex
Step 2: Configure Environment Variables
The skill will help you set these environment variables in your shell profile:
# Add to ~/.zshrc (or ~/.bashrc for Bash, ~/.config/fish/config.fish for Fish)
export OPENROUTER_API_KEY="<your-openrouter-api-key>"
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY="" # Important: Must be explicitly empty
Important Notes:
- Do NOT put these in a project-level .env file; the native Claude Code installer does not read standard .env files
- Must explicitly blank out ANTHROPIC_API_KEY to prevent conflicts
- If previously logged in to Claude Code with Anthropic, run /logout in a Claude Code session to clear cached credentials
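As a sanity check, the exports can be verified in a fresh shell. This is a sketch with a placeholder key value; substitute your real key:

```shell
# Sketch: verify the OpenRouter environment variables resolve as expected.
# The key below is a placeholder, not a real credential.
export OPENROUTER_API_KEY="sk-or-v1-placeholder"
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY=""   # must be explicitly empty, not unset

# Both checks should print their message
[ "$ANTHROPIC_AUTH_TOKEN" = "$OPENROUTER_API_KEY" ] && echo "auth token mirrors API key"
[ -n "${ANTHROPIC_API_KEY+x}" ] && [ -z "$ANTHROPIC_API_KEY" ] && echo "ANTHROPIC_API_KEY set but blank"
```

Note the distinction the second check makes: ANTHROPIC_API_KEY must be set *and* empty, because an unset variable is not the same as a blank one.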
Step 3: Configure Your Model
The skill will add your chosen model to your Claude Code settings at ~/.claude/settings.json:
{
"provider": {
"baseUrl": "https://openrouter.ai/api",
"apiKey": "${ANTHROPIC_AUTH_TOKEN}",
"defaultModel": "your-chosen-model-name"
}
}
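If you already have a ~/.claude/settings.json with other keys, the provider block can be merged in rather than overwriting the whole file. A minimal sketch using python3 (the key names mirror the example above; a temp file stands in for the real settings path):

```shell
# Sketch: merge the provider block into an existing settings file without
# clobbering other keys. Point SETTINGS at ~/.claude/settings.json for real use.
SETTINGS=$(mktemp)
echo '{"theme": "dark"}' > "$SETTINGS"   # pretend this is your existing config

SETTINGS_PATH="$SETTINGS" python3 - <<'EOF'
import json, os
path = os.environ["SETTINGS_PATH"]
with open(path) as f:
    cfg = json.load(f)
# Add or replace only the provider block
cfg["provider"] = {
    "baseUrl": "https://openrouter.ai/api",
    "apiKey": "${ANTHROPIC_AUTH_TOKEN}",
    "defaultModel": "your-chosen-model-name",
}
with open(path, "w") as f:
    json.dump(cfg, f, indent=2)
EOF

cat "$SETTINGS"   # existing keys (theme) survive alongside the new provider block
```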
Step 4: Start Claude Code
cd /path/to/your/project
claude
Step 5: Verify Configuration
Run the /status command inside Claude Code:
/status
Expected output:
Auth token: ANTHROPIC_AUTH_TOKEN
Anthropic base URL: https://openrouter.ai/api
You can also check the OpenRouter Activity Dashboard to see your requests appearing in real-time.
API Configuration
Base URL
https://openrouter.ai/api
Authentication
Use your OpenRouter API key as the auth token.
Model Selection
Choose your model from the OpenRouter model catalog:
- Browse Models: https://openrouter.ai/models
- Click on any model to see details and copy the model name
- Paste the model name when prompted by this skill
Provider Priority
Important: Claude Code with OpenRouter is only guaranteed to work with the Anthropic first-party provider. For maximum compatibility, we recommend setting Anthropic 1P as the top priority provider when using Claude Code.
Supported Models
OpenRouter provides 100+ models from leading AI providers. The catalog is constantly updated with new models.
Always check the live catalog for the most current models:
- OpenRouter Models: https://openrouter.ai/models
Major Providers Available:
- Anthropic: Latest Claude models
- OpenAI: GPT-4 series, o1, o3
- Google: Gemini models
- Meta: Llama family
- Mistral: Mistral Large, Pixtral
- DeepSeek: DeepSeek Chat, R1
- Qwen: Qwen 2.5, Qwen Coder
- Perplexity: Sonar models with web search
- And many more: xAI, Cohere, etc.
Model Features You Can Filter By:
- ✅ Free tier availability
- ✅ Vision/multimodal capabilities
- ✅ Function calling/tool use
- ✅ Long context windows (up to 1M+ tokens)
- ✅ Coding specialization
- ✅ Web search integration
- ✅ Price range
Note: Models are added and updated regularly. Always visit https://openrouter.ai/models to see the complete, up-to-date catalog with real-time pricing and availability.
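The catalog can also be queried programmatically. Assuming OpenRouter's public model-list endpoint (https://openrouter.ai/api/v1/models) returns entries shaped like the sample below, a quick filter by context length might look like:

```shell
# Sketch: filter a model-catalog response by context length. Sample data is
# inlined here; in practice you would fetch the real list, e.g.:
#   curl -s https://openrouter.ai/api/v1/models > /tmp/models.json
cat > /tmp/models.json <<'EOF'
{"data": [
  {"id": "provider/model-a", "context_length": 200000},
  {"id": "provider/model-b", "context_length": 8192}
]}
EOF

python3 - <<'EOF'
import json
with open("/tmp/models.json") as f:
    models = json.load(f)["data"]
# Keep only models with at least 100k tokens of context
for m in models:
    if m["context_length"] >= 100_000:
        print(m["id"])
EOF
```

With the sample data this prints only provider/model-a; the same pattern extends to filtering on pricing or feature fields in the real response.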
How It Works
Direct Connection
When you set ANTHROPIC_BASE_URL to https://openrouter.ai/api, Claude Code speaks its native protocol directly to OpenRouter. No local proxy server is required.
Anthropic Skin
OpenRouter's "Anthropic Skin" behaves exactly like the Anthropic API. It handles model mapping and passes through advanced features like "Thinking" blocks and native tool use.
Model Routing
When you specify a model name from the OpenRouter catalog, OpenRouter routes your request to the appropriate provider automatically, handling all API differences transparently.
Billing
You are billed using your OpenRouter credits. Usage (including reasoning tokens) appears in your OpenRouter dashboard.
Security
The skill handles your API key securely and stores it in your shell environment with appropriate file permissions.
Privacy Note: OpenRouter does not log your source code prompts unless you explicitly opt-in to prompt logging in your account settings. See their Privacy Policy for details.
Changing Models
You can change your model at any time by:
Option 1: Update Claude Code Settings
Edit ~/.claude/settings.json:
{
"provider": {
"baseUrl": "https://openrouter.ai/api",
"apiKey": "${ANTHROPIC_AUTH_TOKEN}",
"defaultModel": "new-model-name"
}
}
Option 2: Use the Skill Again
Run the configuration process again and select a different model when prompted.
Option 3: Per-Session Model Selection
In Claude Code, you can sometimes specify models inline:
/model openrouter:your-model-name
Advanced Features
Cost Tracking Statusline
You can add a custom statusline to Claude Code that tracks your OpenRouter API costs in real-time. The statusline displays provider, model, cumulative cost, and cache discounts for your session.
Download the statusline scripts from the openrouter-examples repository, make them executable, and add the following to your ~/.claude/settings.json:
{
"statusLine": {
"type": "command",
"command": "/path/to/statusline.sh"
}
}
GitHub Action Integration
You can use OpenRouter with the official Claude Code GitHub Action. To adapt the example workflow for OpenRouter, make two changes to the action step:
- Pass your OpenRouter API key via anthropic_api_key (store it as a GitHub secret named OPENROUTER_API_KEY)
- Set the ANTHROPIC_BASE_URL environment variable to https://openrouter.ai/api
Example:
- name: Run Claude Code
  uses: anthropics/claude-code-action@v1
  with:
    anthropic_api_key: ${{ secrets.OPENROUTER_API_KEY }}
  env:
    ANTHROPIC_BASE_URL: https://openrouter.ai/api
Agent SDK Integration
The Anthropic Agent SDK lets you build AI agents programmatically using Python or TypeScript. Since the Agent SDK uses Claude Code as its runtime, you can connect it to OpenRouter using the same environment variables described above.
Troubleshooting
Model Not Found Errors
- Verify the model name is copied exactly from OpenRouter's model catalog
- Check that the model is currently available on OpenRouter
- Some models may have different naming conventions (e.g., with :free or :beta suffixes)
Auth Errors
- Ensure ANTHROPIC_API_KEY is set to an empty string ("")
- If it is unset (null), Claude Code might fall back to its default behavior and try to authenticate with Anthropic servers
Context Length Errors
- If you hit context limits, consider switching to a model with a larger context window
- Break your task into smaller chunks or start a new session
- Check the model's context limit at https://openrouter.ai/models
Previous Anthropic Login
- If you were previously logged in to Claude Code with Anthropic, run /logout in a Claude Code session to clear cached credentials before the OpenRouter configuration takes effect
Connection Issues
- Verify your OpenRouter API key is correct
- Check that environment variables are set in your shell profile (not just in current session)
- Ensure you've restarted your terminal after updating your shell profile
- Run /status to verify configuration
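A common failure mode is exports that never reached the shell profile. The loop below checks a profile file for each expected export line (a temp file with two of the three exports stands in for ~/.zshrc in this sketch):

```shell
# Sketch: confirm each expected export line exists in your shell profile.
# Replace PROFILE with ~/.zshrc (or ~/.bashrc) for real use.
PROFILE=$(mktemp)
cat >> "$PROFILE" <<'EOF'
export OPENROUTER_API_KEY="sk-or-v1-placeholder"
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
EOF

for v in OPENROUTER_API_KEY ANTHROPIC_BASE_URL ANTHROPIC_AUTH_TOKEN; do
  if grep -q "^export $v=" "$PROFILE"; then
    echo "$v: found"
  else
    echo "$v: MISSING"
  fi
done
```

With the sample profile above, the loop reports ANTHROPIC_AUTH_TOKEN as MISSING, which is exactly the signal you want when diagnosing a half-finished setup.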
Related Skills
- API Key Management
- Provider Configuration
- Model Selection
- Cost Tracking
Example Commands
- "Configure OpenRouter with my API key"
- "Set up OpenRouter as my AI provider for Claude Code"
- "Enable OpenRouter in Claude Code"
- "Configure OpenRouter for Claude Code"
- "Help me select and configure an OpenRouter model"
- "Show me the available models on OpenRouter"
- "I want to browse OpenRouter models and pick one"
Notes
- This skill is designed specifically for Claude Code
- It requires write access to your shell profile (~/.bashrc, ~/.zshrc, or ~/.config/fish/config.fish)
- The skill ensures OpenRouter is used exclusively with Anthropic 1P as top priority
- You can browse and select from 100+ models at https://openrouter.ai/models
- You can change your model at any time by editing ~/.claude/settings.json or running this skill again
- You can revert to using Anthropic directly by removing these environment variables and running /logout
- Always refer to the official OpenRouter documentation for the most up-to-date information