| name | description | version |
|---|---|---|
| openrouter-config | Use this skill when the user asks to "configure OpenRouter", "set up OpenRouter API key", "enable OpenRouter as AI provider", "configure OpenRouter in Claude Code", "select an OpenRouter model", "use a specific model with OpenRouter", "connect Claude Code to OpenRouter", or mentions configuring OpenRouter for Claude Code. | 1.0.0 |
# OpenRouter Config Skill

Helps users configure and enable OpenRouter as the AI provider for Claude Code: selecting a preferred model from the OpenRouter catalog, setting up the API key, configuring Anthropic-compatible environment variables, and ensuring proper provider priority and failover.
## Documentation Source

Official OpenRouter Claude Code Integration Guide: https://openrouter.ai/docs/guides/claude-code-integration
## What It Does

- Model Selection: Prompts the user to choose a model from OpenRouter's 100+ model catalog
- Model Catalog Guidance: Provides a link to https://openrouter.ai/models and instructions to copy the model name
- API Key Setup: Securely configures the OpenRouter API key
- Environment Configuration: Sets up Anthropic-compatible environment variables
- Model Configuration: Stores the chosen model in Claude Code settings
- Provider Priority: Ensures Anthropic 1P is set as the top-priority provider
- API Endpoint Configuration: Configures `https://openrouter.ai/api` as the base URL
- Configuration Verification: Validates that the configuration is working correctly
- Failover Setup: Enables automatic provider failover for high availability
## Key Benefits

### 1. Provider Failover for High Availability

Anthropic's API occasionally experiences outages or rate limiting. OpenRouter automatically fails over between multiple Anthropic providers, keeping coding sessions uninterrupted.

### 2. Access to 100+ Models

Choose from models by Anthropic, OpenAI, Google, Meta, Mistral, xAI, and many more providers. Browse the complete catalog at https://openrouter.ai/models

### 3. Organizational Budget Controls

Centralized budget management with spending limits, credit allocation, and cost overrun prevention for teams.

### 4. Usage Visibility and Analytics

Complete visibility into Claude Code usage across teams, including usage patterns, real-time cost monitoring, and resource consumption tracking via the OpenRouter Activity Dashboard.
## Model Selection Process

### Step 1: Ask for Model Preference

When the skill starts, it will ask:

"Which OpenRouter model would you like to use with Claude Code?"

### Step 2: Provide Model Catalog

The skill will provide a link to browse available models:

- OpenRouter Models: https://openrouter.ai/models

### Step 3: Guide User Selection

The skill will instruct the user to:

- Visit https://openrouter.ai/models
- Browse through the available models
- Click on a model to see details (pricing, context length, features)
- Copy the model name (click the model name to copy it)
- Paste the model name back into the chat

### Step 4: Store Model Configuration

The skill will configure the chosen model in Claude Code settings at `~/.claude/settings.json`:
```json
{
  "provider": {
    "baseUrl": "https://openrouter.ai/api",
    "apiKey": "${ANTHROPIC_AUTH_TOKEN}",
    "defaultModel": "user-selected-model-name"
  }
}
```
## Popular Model Recommendations

| Model | Provider | Best For | Price Level |
|---|---|---|---|
| `anthropic/claude-3.5-sonnet` | Anthropic | General coding, reasoning | High |
| `anthropic/claude-3.5-sonnet:beta` | Anthropic | Latest Claude features | High |
| `openai/gpt-4-turbo` | OpenAI | Fast, cost-effective | Medium-High |
| `google/gemini-pro-1.5` | Google | Long context tasks | Medium |
| `meta-llama/llama-3.1-70b-instruct:free` | Meta | Free tier usage | Free |
| `mistralai/mistral-large` | Mistral | Multilingual tasks | Medium |
## Quick Start

### Environment Variables to Configure

```bash
export OPENROUTER_API_KEY="<your-openrouter-api-key>"
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY="" # Important: Must be explicitly empty
```

### Shell Profile Locations

Add to one of these files:

- `~/.bashrc` for Bash
- `~/.zshrc` for Zsh
- `~/.config/fish/config.fish` for Fish

Important: Do NOT add these to a project-level `.env` file - Claude Code's native installer does not read standard `.env` files.
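The exports above can be appended to the profile with a small idempotence guard, so re-running setup does not duplicate them. This is a sketch of our own (not part of the skill); the profile path and marker line are our choices.

```shell
# Append the OpenRouter exports to the shell profile only once.
# Adjust "profile" for zsh (~/.zshrc); fish uses `set -gx` instead of
# export, so this sketch does not cover it.
profile="$HOME/.bashrc"
marker='ANTHROPIC_BASE_URL="https://openrouter.ai/api"'
if ! grep -qF "$marker" "$profile" 2>/dev/null; then
  {
    echo 'export OPENROUTER_API_KEY="<your-openrouter-api-key>"'
    echo 'export ANTHROPIC_BASE_URL="https://openrouter.ai/api"'
    echo 'export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"'
    echo 'export ANTHROPIC_API_KEY=""'
  } >> "$profile"
fi
```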
### Verification

Run `/status` inside Claude Code to verify:

```
/status
```

Expected output:

```
Auth token: ANTHROPIC_AUTH_TOKEN
Anthropic base URL: https://openrouter.ai/api
```
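Before launching `claude`, the same expectations can be checked from the shell. This preflight script is a sketch of our own (not part of the skill): the exports mirror the Quick Start above, and the key value is a placeholder.

```shell
# Preflight check: confirm the shell environment is wired for OpenRouter.
export OPENROUTER_API_KEY="sk-or-example-key"   # placeholder, not a real key
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY=""

ok=true
[ "$ANTHROPIC_BASE_URL" = "https://openrouter.ai/api" ] || { echo "base URL is not OpenRouter"; ok=false; }
[ -n "$ANTHROPIC_AUTH_TOKEN" ] || { echo "auth token is empty"; ok=false; }
# must be set to an empty string, not unset
{ [ -n "${ANTHROPIC_API_KEY+x}" ] && [ -z "$ANTHROPIC_API_KEY" ]; } || { echo "ANTHROPIC_API_KEY must be \"\""; ok=false; }
"$ok" && echo "environment looks ready for claude"
```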
## API Configuration

### Base URL

```
https://openrouter.ai/api
```

### Authentication

OpenRouter API key (get from https://openrouter.ai/keys)

### Model Selection

- Browse models at: https://openrouter.ai/models
- Click on any model to see details and copy the model name
- Paste the model name when prompted by the skill
- Model is stored in `~/.claude/settings.json` as the default model

### Provider Priority

Critical: Set Anthropic 1P as the top priority provider for maximum compatibility with Claude Code.
## Usage

- "Configure OpenRouter with my API key"
- "Set up OpenRouter as my AI provider"
- "Enable OpenRouter in Claude Code"
- "Configure OpenRouter for Claude Code"
- "Connect Claude Code to OpenRouter"
- "I want to use Claude 3.5 Sonnet through OpenRouter"
- "Help me select and configure an OpenRouter model"
- "Set up OpenRouter with GPT-4"
## How It Works

### Direct Connection

Claude Code speaks its native protocol directly to OpenRouter when `ANTHROPIC_BASE_URL` is set to `https://openrouter.ai/api`. No local proxy server is required.

### Anthropic Skin

OpenRouter's "Anthropic Skin" behaves exactly like the Anthropic API, handling model mapping and passing through advanced features like "Thinking" blocks and native tool use.

### Model Routing

When a user selects a model from the OpenRouter catalog, OpenRouter routes all requests to that specific model, handling provider differences transparently.

### Billing

Billed using OpenRouter credits. Usage (including reasoning tokens) appears in the OpenRouter dashboard.
## Important Notes

### Previous Anthropic Login

If previously logged in to Claude Code with Anthropic, run `/logout` in a Claude Code session to clear cached credentials before the OpenRouter configuration takes effect.

### Explicit Empty API Key

`ANTHROPIC_API_KEY` must be set to an empty string (`""`), not unset. If unset (null), Claude Code may fall back to default behavior and authenticate with Anthropic servers directly.
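The set-but-empty distinction can be seen directly in the shell: `${VAR+x}` expands only when the variable is set, even to an empty value (standard POSIX parameter expansion; Claude Code's own fallback logic is not shown here).

```shell
# An unset variable and an empty-but-set variable expand identically,
# but the shell tells them apart with ${VAR+x}.
unset ANTHROPIC_API_KEY
[ -z "${ANTHROPIC_API_KEY+x}" ] && echo "unset"        # prints "unset"

export ANTHROPIC_API_KEY=""
[ -n "${ANTHROPIC_API_KEY+x}" ] && echo "set (empty)"  # prints "set (empty)"
```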
### Shell Profile Persistence

Environment variables should be added to the user's shell profile so they persist across sessions.

### Model Name Accuracy

Model names must be copied exactly from OpenRouter's model catalog (https://openrouter.ai/models), including any suffixes such as `:free` or `:beta`.
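An exact-match check can catch a missing suffix before it causes a "model not found" error. In this sketch, the inline JSON stands in for the response of OpenRouter's list-models endpoint (assumed: `GET https://openrouter.ai/api/v1/models`); the helper itself is ours.

```shell
# Check that a model ID (including any :free/:beta suffix) matches a
# catalog listing exactly.
model_in_catalog() {
  # $1 = model id; stdin = catalog JSON
  grep -qF "\"id\":\"$1\""
}

# Stand-in for the live catalog response.
catalog='{"data":[{"id":"anthropic/claude-3.5-sonnet"},{"id":"meta-llama/llama-3.1-70b-instruct:free"}]}'

printf '%s' "$catalog" | model_in_catalog "anthropic/claude-3.5-sonnet" && echo "found"
printf '%s' "$catalog" | model_in_catalog "meta-llama/llama-3.1-70b-instruct" || echo "not found (missing :free suffix)"
```

For real use, pipe `curl -s https://openrouter.ai/api/v1/models` into the helper instead of the sample string.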
### Claude Code Models

When using Claude Code through OpenRouter, users can access any of the 100+ models available, including Anthropic's Claude 3.5 Sonnet, Claude 3 Opus, and Claude 3 Haiku, as well as models from OpenAI, Google, Meta, Mistral, and more.
## Advanced Features

### Cost Tracking Statusline

Download statusline scripts from the openrouter-examples repository and add to `~/.claude/settings.json`:

```json
{
  "statusLine": {
    "type": "command",
    "command": "/path/to/statusline.sh"
  }
}
```
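The configured command receives session JSON on stdin and its stdout becomes the status line. The sketch below is our own minimal stand-in for the downloadable scripts, and the `display_name` field it reads is an assumption about the payload shape — verify against the statusline examples repository.

```shell
# Minimal status line sketch: extracts a model name from the JSON that
# Claude Code pipes on stdin (field name assumed, see note above).
cat > ./statusline.sh <<'EOF'
#!/bin/sh
input=$(cat)
model=$(printf '%s' "$input" | sed -n 's/.*"display_name": *"\([^"]*\)".*/\1/p')
printf '[%s]' "${model:-unknown}"
EOF
chmod +x ./statusline.sh

# Dry run with a sample payload:
printf '{"model":{"display_name":"Claude 3.5 Sonnet"}}' | ./statusline.sh  # -> [Claude 3.5 Sonnet]
```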
### GitHub Action Integration

```yaml
- name: Run Claude Code
  uses: anthropics/claude-code-action@v1
  with:
    anthropic_api_key: ${{ secrets.OPENROUTER_API_KEY }}
  env:
    ANTHROPIC_BASE_URL: https://openrouter.ai/api
```
### Agent SDK Integration

The Anthropic Agent SDK uses the same environment variables for OpenRouter integration.

## Security Notes

- API keys are stored in the shell environment with appropriate file permissions
- OpenRouter does not log source code prompts unless you explicitly opt in to prompt logging
- See the OpenRouter Privacy Policy for details
## Troubleshooting

### Model Not Found Errors

- Verify the model name is copied exactly from OpenRouter's model catalog
- Check that the model is currently available
- Some models may have different naming conventions (e.g., with `:free` or `:beta` suffixes)
- Check https://openrouter.ai/models for current availability

### Auth Errors

- Ensure `ANTHROPIC_API_KEY` is set to an empty string (`""`)
- Check that environment variables are set in the shell profile (not just the current session)
- Restart the terminal after updating the shell profile

### Context Length Errors

- Switch to a model with a larger context window
- Check model context limits at https://openrouter.ai/models
- Break tasks into smaller chunks
- Start a new session

### Connection Issues

- Verify the OpenRouter API key is correct
- Check that all environment variables are set correctly
- Run `/status` to verify the configuration
- Check the OpenRouter Activity Dashboard to confirm requests are appearing

### Previous Credentials

- Run `/logout` to clear cached Anthropic credentials
- Restart Claude Code after environment variable changes
## Example Workflow

1. User requests: "Configure OpenRouter for Claude Code"
2. Skill asks: "Which OpenRouter model would you like to use?"
3. Skill provides the catalog link: https://openrouter.ai/models
4. User browses models, clicks on one, and copies the model name
5. User pastes the model name back to the skill
6. Skill prompts for the OpenRouter API key
7. Skill adds environment variables to the shell profile
8. Skill configures the chosen model in `~/.claude/settings.json`
9. Skill reminds the user to restart the terminal
10. User restarts the terminal and runs `claude`
11. User runs `/status` to verify the configuration
12. User can now use Claude Code with their chosen OpenRouter model
## Changing Models

Users can change their model at any time.

### Option 1: Update Claude Code Settings

Edit `~/.claude/settings.json`:

```json
{
  "provider": {
    "baseUrl": "https://openrouter.ai/api",
    "apiKey": "${ANTHROPIC_AUTH_TOKEN}",
    "defaultModel": "new-model-name"
  }
}
```

### Option 2: Run the Skill Again

Re-run the configuration process and select a different model when prompted.
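Option 1 can also be scripted. This is a naive `sed` sketch of our own that assumes the exact `"defaultModel": "..."` layout shown above (a JSON-aware tool is safer for hand-edited files). The sample file keeps the example self-contained; point `settings` at `~/.claude/settings.json` for real use.

```shell
# Switch the default model in a Claude Code settings file in place.
settings="./settings.json"
cat > "$settings" <<'EOF'
{
  "provider": {
    "baseUrl": "https://openrouter.ai/api",
    "apiKey": "${ANTHROPIC_AUTH_TOKEN}",
    "defaultModel": "anthropic/claude-3.5-sonnet"
  }
}
EOF

new_model="google/gemini-pro-1.5"
# Naive substitution; keeps a .bak copy of the previous settings.
sed -i.bak "s|\"defaultModel\": *\"[^\"]*\"|\"defaultModel\": \"$new_model\"|" "$settings"
grep '"defaultModel"' "$settings"
```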
## Configuration File Location

Environment variables are added to the shell profile:

- `~/.bashrc`, `~/.zshrc`, or `~/.config/fish/config.fish`

Claude Code settings (includes model configuration):

- `~/.claude/settings.json`
## Model Catalog

The complete OpenRouter model catalog is available at:

- Browse Models: https://openrouter.ai/models

Users can filter by:

- Provider (Anthropic, OpenAI, Google, Meta, etc.)
- Features (function calling, vision, etc.)
- Context length
- Price range
- Free tier availability
## Supported Models

OpenRouter supports 100+ models from various providers. Users browse and select their preferred model at https://openrouter.ai/models

Popular Anthropic Models:

- `anthropic/claude-3.5-sonnet` - Latest, most capable
- `anthropic/claude-3-opus` - Best for complex tasks
- `anthropic/claude-3-haiku` - Fast, cost-effective

Popular Alternative Models:

- `openai/gpt-4-turbo` - GPT-4 with Turbo speed
- `google/gemini-pro-1.5` - Long context (up to 1M tokens)
- `meta-llama/llama-3.1-70b-instruct:free` - Free Llama 3.1 model
- `mistralai/mistral-large` - Strong multilingual capabilities