ClaudeCode-Custom-Skills/skills/openrouter-config/SKILL.md

---
name: openrouter-config
description: Use this skill when the user asks to "configure OpenRouter", "set up OpenRouter API key", "enable OpenRouter as AI provider", "configure OpenRouter in Claude Code", "select an OpenRouter model", "use a specific model with OpenRouter", "connect Claude Code to OpenRouter", or mentions configuring OpenRouter for Claude Code.
version: 1.0.0
---

OpenRouter Config Skill

Helps users configure and enable OpenRouter as the AI provider for Claude Code: selecting a preferred model from the OpenRouter catalog, setting up the API key, configuring Anthropic-compatible environment variables, and ensuring proper provider priority and failover.

Documentation Source

Official OpenRouter Claude Code Integration Guide: https://openrouter.ai/docs/guides/claude-code-integration

What It Does

  1. Model Selection: Prompts user to choose a model from OpenRouter's 100+ model catalog
  2. Model Catalog Guidance: Provides link to https://openrouter.ai/models and instructions to copy model name
  3. API Key Setup: Securely configures OpenRouter API key
  4. Environment Configuration: Sets up Anthropic-compatible environment variables
  5. Model Configuration: Stores chosen model in Claude Code settings
  6. Provider Priority: Ensures Anthropic 1P is set as top priority provider
  7. API Endpoint Configuration: Configures https://openrouter.ai/api as the base URL
  8. Configuration Verification: Validates that the configuration is working correctly
  9. Failover Setup: Enables automatic provider failover for high availability

Key Benefits

1. Provider Failover for High Availability

Anthropic's API occasionally experiences outages or rate limiting. OpenRouter automatically fails over between multiple Anthropic providers, keeping coding sessions uninterrupted.

2. Access to 100+ Models

Choose from models by Anthropic, OpenAI, Google, Meta, Mistral, xAI, and many more providers. Browse the complete catalog at https://openrouter.ai/models

3. Organizational Budget Controls

Centralized budget management with spending limits, credit allocation, and cost overrun prevention for teams.

4. Usage Visibility and Analytics

Complete visibility into Claude Code usage across teams, including usage patterns, real-time cost monitoring, and resource consumption tracking via the OpenRouter Activity Dashboard.

Model Selection Process

Step 1: Ask for Model Preference

When the skill starts, it will ask:

"Which OpenRouter model would you like to use with Claude Code?"

Step 2: Provide Model Catalog

The skill will provide a link to browse the available models: https://openrouter.ai/models

Step 3: Guide User Selection

The skill will instruct the user to:

  1. Visit https://openrouter.ai/models
  2. Browse through the available models
  3. Click on a model to see details (pricing, context length, features)
  4. Copy the model name (click the model name to copy it)
  5. Paste the model name back into the chat

Step 4: Store Model Configuration

The skill will configure the chosen model in Claude Code settings at ~/.claude/settings.json:

{
  "provider": {
    "baseUrl": "https://openrouter.ai/api",
    "apiKey": "${ANTHROPIC_AUTH_TOKEN}",
    "defaultModel": "user-selected-model-name"
  }
}

Model Selection Guidance

Always use the live model catalog: https://openrouter.ai/models

The OpenRouter catalog shows:

  • All current models from every provider
  • Real-time pricing for input/output tokens
  • Context limits for each model
  • Available features: vision, function calling, free tier, etc.
  • Provider information: Which provider hosts each model

How to find and select a model:

  1. Visit https://openrouter.ai/models
  2. Use filters to narrow down by provider, features, price range
  3. Click on any model to see detailed information
  4. Click the model name to copy it to clipboard
  5. Paste the exact model name when the skill prompts you

The catalog is kept current: models are added and updated regularly, so checking the live catalog ensures you see the latest available models.

Quick Start

Environment Variables to Configure

export OPENROUTER_API_KEY="<your-openrouter-api-key>"
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY="" # Important: Must be explicitly empty

Shell Profile Locations

Add to one of these files:

  • ~/.bashrc for Bash
  • ~/.zshrc for Zsh
  • ~/.config/fish/config.fish for Fish

Important: Do NOT add these to a project-level .env file - Claude Code's native installer does not read standard .env files.
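As a concrete sketch, the exports above can be appended to a shell profile idempotently. The PROFILE variable (defaulting to ~/.bashrc) and the placeholder key are illustrative assumptions; adjust for your shell:

```shell
# Sketch: append the OpenRouter variables to a shell profile exactly once.
# PROFILE defaults to ~/.bashrc here purely for illustration.
PROFILE="${PROFILE:-$HOME/.bashrc}"

# Only append if the block is not already present (grep -s ignores a missing file).
if ! grep -qs 'ANTHROPIC_BASE_URL="https://openrouter.ai/api"' "$PROFILE"; then
  cat >> "$PROFILE" <<'EOF'
export OPENROUTER_API_KEY="<your-openrouter-api-key>"
export ANTHROPIC_BASE_URL="https://openrouter.ai/api"
export ANTHROPIC_AUTH_TOKEN="$OPENROUTER_API_KEY"
export ANTHROPIC_API_KEY=""   # must be explicitly empty, not unset
EOF
fi
```

Re-running the snippet is safe: the grep guard prevents duplicate blocks in the profile.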

Verification

Run /status inside Claude Code to verify:

/status

Expected output:

Auth token: ANTHROPIC_AUTH_TOKEN
Anthropic base URL: https://openrouter.ai/api

API Configuration

Base URL

https://openrouter.ai/api

Authentication

OpenRouter API key (get from https://openrouter.ai/keys)

Model Selection

  • Browse models at: https://openrouter.ai/models
  • Click on any model to see details and copy the model name
  • Paste the model name when prompted by the skill
  • Model is stored in ~/.claude/settings.json as the default model

Provider Priority

Critical: Set Anthropic 1P (first-party) as the top-priority provider for maximum compatibility with Claude Code.

Usage

"Configure OpenRouter with my API key"
"Set up OpenRouter as my AI provider"
"Enable OpenRouter in Claude Code"
"Configure OpenRouter for Claude Code"
"Connect Claude Code to OpenRouter"
"Help me select and configure an OpenRouter model"
"Show me the available models on OpenRouter"
"I want to browse OpenRouter models and configure one"

How It Works

Direct Connection

Claude Code speaks its native protocol directly to OpenRouter when ANTHROPIC_BASE_URL is set to https://openrouter.ai/api. No local proxy server is required.

Anthropic Skin

OpenRouter's "Anthropic Skin" behaves exactly like the Anthropic API, handling model mapping and passing through advanced features like "Thinking" blocks and native tool use.

Model Routing

When a user selects a model from the OpenRouter catalog, OpenRouter routes all requests to that specific model, handling provider differences transparently.

Billing

Billed using OpenRouter credits. Usage (including reasoning tokens) appears in the OpenRouter dashboard.

Important Notes

Previous Anthropic Login

If you previously logged in to Claude Code with an Anthropic account, run /logout in a Claude Code session to clear cached credentials before the OpenRouter configuration takes effect.

Explicit Empty API Key

ANTHROPIC_API_KEY must be set to an empty string (""), not unset. If unset (null), Claude Code may fall back to default behavior and authenticate with Anthropic servers directly.
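A quick shell check (a sketch using the POSIX `${VAR+set}` idiom) distinguishes the two states:

```shell
# Set the variable to an empty string, then verify it is set-but-empty.
export ANTHROPIC_API_KEY=""

# ${VAR+set} expands to "set" when VAR exists (even empty), to "" when unset.
if [ "${ANTHROPIC_API_KEY+set}" = "set" ] && [ -z "$ANTHROPIC_API_KEY" ]; then
  echo "ANTHROPIC_API_KEY is set and empty (correct)"
else
  echo "ANTHROPIC_API_KEY is unset or non-empty (fix your profile)"
fi
```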

Shell Profile Persistence

Environment variables should be added to the user's shell profile for persistence across sessions.

Model Name Accuracy

Model names must be copied exactly from OpenRouter's model catalog (https://openrouter.ai/models). Include any suffixes like :free, :beta, etc.

Claude Code Models

When using Claude Code through OpenRouter, users can access any of the 100+ available models, including Anthropic's latest Claude models as well as models from OpenAI, Google, Meta, Mistral, and more. Check https://openrouter.ai/models for the current list.

Advanced Features

Cost Tracking Statusline

Download statusline scripts from the openrouter-examples repository and reference one in ~/.claude/settings.json:

{
  "statusLine": {
    "type": "command",
    "command": "/path/to/statusline.sh"
  }
}

GitHub Action Integration

- name: Run Claude Code
  uses: anthropics/claude-code-action@v1
  with:
    anthropic_api_key: ${{ secrets.OPENROUTER_API_KEY }}
  env:
    ANTHROPIC_BASE_URL: https://openrouter.ai/api

Agent SDK Integration

The Anthropic Agent SDK uses the same environment variables for OpenRouter integration.

Security Notes

  • API keys live in your shell profile; restrict that file's permissions (e.g. chmod 600)
  • OpenRouter does not log source code prompts unless you explicitly opt in to prompt logging
  • See OpenRouter Privacy Policy for details

Troubleshooting

Model Not Found Errors

  • Verify the model name is copied exactly from OpenRouter's model catalog
  • Check that the model is currently available
  • Some models may have different naming conventions (e.g., with :free, :beta suffixes)
  • Check https://openrouter.ai/models for current availability

Auth Errors

  • Ensure ANTHROPIC_API_KEY is set to empty string ("")
  • Check that environment variables are set in shell profile (not just current session)
  • Restart terminal after updating shell profile
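The checks above can be scripted. This diagnostic sketch reports which of the relevant variables are visible in the current shell (printenv exits non-zero only when a variable is truly unset, so an empty-but-set ANTHROPIC_API_KEY correctly shows as set):

```shell
# Report which of the relevant variables are visible in the current shell.
env_report=""
for v in ANTHROPIC_BASE_URL ANTHROPIC_AUTH_TOKEN ANTHROPIC_API_KEY OPENROUTER_API_KEY; do
  if printenv "$v" > /dev/null 2>&1; then
    env_report="$env_report $v=set"
  else
    env_report="$env_report $v=unset"
  fi
done
echo "$env_report"
```

If anything shows as unset after you updated your profile, restart the terminal (or source the profile) and re-check.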

Context Length Errors

  • Switch to a model with larger context window
  • Check model context limits at https://openrouter.ai/models
  • Break tasks into smaller chunks
  • Start a new session

Connection Issues

  • Verify OpenRouter API key is correct
  • Check all environment variables are set correctly
  • Run /status to verify configuration
  • Check the OpenRouter Activity Dashboard to confirm requests are arriving
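One way to confirm the key itself is a small helper like the sketch below. It assumes curl is installed and uses OpenRouter's GET /api/v1/key endpoint, which returns metadata about the calling key; run the function manually once OPENROUTER_API_KEY is exported, since it needs network access:

```shell
# Define a helper that asks OpenRouter about the current key.
# Call check_openrouter_key manually after exporting OPENROUTER_API_KEY.
check_openrouter_key() {
  curl -fsS https://openrouter.ai/api/v1/key \
    -H "Authorization: Bearer $OPENROUTER_API_KEY" > /dev/null \
    && echo "key accepted" \
    || echo "request failed (bad key, or no network)"
}
```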

Previous Credentials

  • Run /logout to clear cached Anthropic credentials
  • Restart Claude Code after environment variable changes

Example Workflow

  1. User requests: "Configure OpenRouter for Claude Code"
  2. Skill asks: "Which OpenRouter model would you like to use?"
  3. Skill provides link: https://openrouter.ai/models
  4. User browses models, clicks on one, and copies the model name
  5. User pastes model name back to the skill
  6. Skill prompts for OpenRouter API key
  7. Skill adds environment variables to shell profile
  8. Skill configures chosen model in ~/.claude/settings.json
  9. Skill reminds user to restart terminal
  10. User restarts terminal and runs claude
  11. User runs /status to verify configuration
  12. User can now use Claude Code with their chosen OpenRouter model

Changing Models

Users can change their model at any time in either of two ways:

Option 1: Update Claude Code Settings

Edit ~/.claude/settings.json:

{
  "provider": {
    "baseUrl": "https://openrouter.ai/api",
    "apiKey": "${ANTHROPIC_AUTH_TOKEN}",
    "defaultModel": "new-model-name"
  }
}
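Equivalently, the edit can be scripted. This sketch assumes jq is installed; the model name is a placeholder, so paste the exact name copied from the catalog:

```shell
# Sketch: point Claude Code at a different default model via jq.
SETTINGS="$HOME/.claude/settings.json"
NEW_MODEL="provider/model-name"   # placeholder; paste the exact catalog name

# Create a minimal stub if the settings file does not exist yet (illustration only).
mkdir -p "$(dirname "$SETTINGS")"
[ -f "$SETTINGS" ] || echo '{"provider":{}}' > "$SETTINGS"

# Rewrite the file with the new default model.
tmp=$(mktemp)
jq --arg m "$NEW_MODEL" '.provider.defaultModel = $m' "$SETTINGS" > "$tmp" \
  && mv "$tmp" "$SETTINGS"
```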

Option 2: Run the Skill Again

Re-run the configuration process and select a different model when prompted.

Configuration File Location

Environment variables are added to shell profile:

  • ~/.bashrc, ~/.zshrc, or ~/.config/fish/config.fish

Claude Code settings (includes model configuration):

  • ~/.claude/settings.json

Model Catalog

The complete OpenRouter model catalog is available at: https://openrouter.ai/models

Users can filter by:

  • Provider (Anthropic, OpenAI, Google, Meta, etc.)
  • Features (function calling, vision, etc.)
  • Context length
  • Price range
  • Free tier availability

Supported Models

OpenRouter provides 100+ models from leading AI providers. The catalog is constantly updated with new models.

Always check the live catalog for the most current models: https://openrouter.ai/models

Major Providers Available:

  • Anthropic: Latest Claude models
  • OpenAI: GPT-4 series, o1, o3
  • Google: Gemini models
  • Meta: Llama family
  • Mistral: Mistral Large, Pixtral
  • DeepSeek: DeepSeek Chat, R1
  • Qwen: Qwen 2.5, Qwen Coder
  • Perplexity: Sonar models with web search
  • And many more: xAI, Cohere, etc.

Model Features You Can Filter By:

  • Free tier availability
  • Vision/multimodal capabilities
  • Function calling/tool use
  • Long context windows (up to 1M+ tokens)
  • Coding specialization
  • Web search integration
  • Price range

Note: Models are added and updated regularly. Always visit https://openrouter.ai/models to see the complete, up-to-date catalog with real-time pricing and availability.