
---
name: claw-setup
description: Use this skill when the user asks to "setup openclaw", "install nanobot", "deploy zeroclaw", "configure picoclaw", "AI agent setup", "personal AI assistant", "claw framework", or mentions setting up any AI agent/assistant platform from the Claw family (OpenClaw, NanoBot, PicoClaw, ZeroClaw, NanoClaw).
version: 1.0.0
---

# Claw Setup Skill

End-to-end setup of AI agent platforms from the Claw family, covering security hardening, multi-provider configuration, and personal customization through an interactive brainstorming session.

## Supported Platforms

| Platform | Language | Memory | Startup | Best For |
|----------|----------|--------|---------|----------|
| OpenClaw | TypeScript | >1GB | ~500s | Full-featured, plugin ecosystem |
| NanoBot | Python | ~100MB | ~30s | Research, easy customization |
| PicoClaw | Go | <10MB | ~1s | Low-resource, embedded |
| ZeroClaw | Rust | <5MB | <10ms | Maximum performance, security |
| NanoClaw | TypeScript | ~50MB | ~5s | WhatsApp integration |

## AI Providers (OpenCode Compatible - 25+ Providers)

### Built-in Providers

| Provider | SDK Package | Key Models | Features |
|----------|-------------|------------|----------|
| Anthropic | @ai-sdk/anthropic | Claude 3.5/4/Opus | Extended thinking, PDF support |
| OpenAI | @ai-sdk/openai | GPT-4o, o1, o3, GPT-5 | Function calling, structured output |
| Azure OpenAI | @ai-sdk/azure | GPT-5, GPT-4o | Enterprise Azure integration, custom endpoints |
| Google AI | @ai-sdk/google | Gemini 2.5, Gemini 3 Pro | Multimodal, Google Cloud |
| Google Vertex | @ai-sdk/google-vertex | Claude, Gemini on GCP | Anthropic on Google infra |
| Amazon Bedrock | @ai-sdk/amazon-bedrock | Nova, Claude, Llama 3 | AWS credentials, regional prefixes |
| OpenRouter | @openrouter/ai-sdk-provider | 100+ models | Multi-provider gateway |
| xAI | @ai-sdk/xai | Grok models | Real-time data integration |
| Mistral AI | @ai-sdk/mistral | Mistral Large, Codestral | Code-focused models |
| Groq | @ai-sdk/groq | Llama 3, Mixtral | Ultra-low latency inference |
| DeepInfra | @ai-sdk/deepinfra | Open source models | Cost-effective hosting |
| Cerebras | @ai-sdk/cerebras | Llama 3 variants | Hardware-accelerated inference |
| Cohere | @ai-sdk/cohere | Command R+, Embed | Enterprise RAG capabilities |
| Together AI | @ai-sdk/togetherai | Open source models | Fine-tuning and hosting |
| Perplexity | @ai-sdk/perplexity | Sonar models | Real-time web search |
| Vercel AI | @ai-sdk/vercel | Multi-provider gateway | Edge hosting, rate limiting |
| GitLab | @gitlab/gitlab-ai-provider | GitLab Duo | CI/CD AI integration |
| GitHub Copilot | Custom | GPT-5 series | IDE integration, OAuth |

### Custom Loader Providers

| Provider | Auth Method | Use Case |
|----------|-------------|----------|
| GitHub Copilot Enterprise | OAuth + API Key | Enterprise IDE integration |
| Google Vertex Anthropic | GCP Service Account | Claude on Google Cloud |
| Azure Cognitive Services | Azure AD | Azure AI services |
| Cloudflare AI Gateway | Gateway Token | Unified billing, rate limiting |
| SAP AI Core | Service Key | SAP enterprise integration |
| OpenCode Free | None | Free public models |

### Local/Self-Hosted

| Provider | Base URL | Use Case |
|----------|----------|----------|
| Ollama | localhost:11434 | Local model hosting |
| LM Studio | localhost:1234 | GUI local models |
| vLLM | localhost:8000 | High-performance serving |
| LocalAI | localhost:8080 | OpenAI-compatible local |
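Before wiring a local provider into the config, it can help to check which endpoints are actually listening. A minimal probe, assuming the default ports from the table above (the function name and two-second timeout are our choices):

```shell
# Quick health probe for the local providers above. Ports are the
# defaults from the table; adjust if your services listen elsewhere.
probe_local_providers() {
  for entry in "ollama=http://localhost:11434/api/tags" \
               "lmstudio=http://localhost:1234/v1/models" \
               "vllm=http://localhost:8000/v1/models" \
               "localai=http://localhost:8080/v1/models"; do
    name="${entry%%=*}"
    url="${entry#*=}"
    if curl -sf --max-time 2 "$url" >/dev/null 2>&1; then
      echo "$name: up"
    else
      echo "$name: down"
    fi
  done
}

probe_local_providers
```

Anything reported as `down` either is not running or listens on a non-default port.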

### Fetch Available Models

```bash
# OpenRouter - All models
curl -s https://openrouter.ai/api/v1/models \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" | jq '.data[].id'

# OpenAI - GPT models
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" | jq '.data[].id'

# Anthropic (static list)
# claude-opus-4-5-20250219, claude-sonnet-4-5-20250219, claude-3-5-sonnet-20241022

# Google Gemini
curl -s "https://generativelanguage.googleapis.com/v1/models?key=$GOOGLE_API_KEY"

# Groq
curl -s https://api.groq.com/openai/v1/models \
  -H "Authorization: Bearer $GROQ_API_KEY"

# Together AI
curl -s https://api.together.xyz/v1/models \
  -H "Authorization: Bearer $TOGETHER_API_KEY"

# Ollama (local)
curl -s http://localhost:11434/api/tags

# models.dev - Universal model list
curl -s https://models.dev/api/models.json
```
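Repeated setup runs do not need to re-fetch these lists every time. One way to add a simple cache, shown here for OpenRouter; the cache path and the 60-minute TTL are illustrative choices, not part of any Claw tool:

```shell
# Cache the OpenRouter model list; refresh only when the cached copy
# is missing or older than 60 minutes.
cache="${XDG_CACHE_HOME:-$HOME/.cache}/claw/openrouter-models.json"
mkdir -p "$(dirname "$cache")"
if [ ! -f "$cache" ] || [ -n "$(find "$cache" -mmin +60 2>/dev/null)" ]; then
  curl -sf https://openrouter.ai/api/v1/models \
    -H "Authorization: Bearer $OPENROUTER_API_KEY" \
    -o "$cache" || echo "fetch failed; keeping stale cache if any" >&2
fi
jq -r '.data[].id' "$cache" 2>/dev/null | head -5
```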

## Multi-Provider Configuration

```json
{
  "providers": {
    "anthropic": {
      "apiKey": "${ANTHROPIC_API_KEY}",
      "baseURL": "https://api.anthropic.com"
    },
    "openai": {
      "apiKey": "${OPENAI_API_KEY}",
      "baseURL": "https://api.openai.com/v1"
    },
    "azure": {
      "apiKey": "${AZURE_OPENAI_API_KEY}",
      "baseURL": "${AZURE_OPENAI_ENDPOINT}",
      "deployment": "gpt-4o"
    },
    "google": {
      "apiKey": "${GOOGLE_API_KEY}",
      "baseURL": "https://generativelanguage.googleapis.com/v1"
    },
    "vertex": {
      "projectId": "${GOOGLE_CLOUD_PROJECT}",
      "location": "${GOOGLE_CLOUD_LOCATION}",
      "credentials": "${GOOGLE_APPLICATION_CREDENTIALS}"
    },
    "bedrock": {
      "region": "us-east-1",
      "accessKeyId": "${AWS_ACCESS_KEY_ID}",
      "secretAccessKey": "${AWS_SECRET_ACCESS_KEY}"
    },
    "openrouter": {
      "apiKey": "${OPENROUTER_API_KEY}",
      "baseURL": "https://openrouter.ai/api/v1",
      "headers": {
        "HTTP-Referer": "https://yourapp.com",
        "X-Title": "YourApp"
      }
    },
    "xai": {
      "apiKey": "${XAI_API_KEY}",
      "baseURL": "https://api.x.ai/v1"
    },
    "mistral": {
      "apiKey": "${MISTRAL_API_KEY}",
      "baseURL": "https://api.mistral.ai/v1"
    },
    "groq": {
      "apiKey": "${GROQ_API_KEY}",
      "baseURL": "https://api.groq.com/openai/v1"
    },
    "cerebras": {
      "apiKey": "${CEREBRAS_API_KEY}",
      "baseURL": "https://api.cerebras.ai/v1"
    },
    "deepinfra": {
      "apiKey": "${DEEPINFRA_API_KEY}",
      "baseURL": "https://api.deepinfra.com/v1"
    },
    "cohere": {
      "apiKey": "${COHERE_API_KEY}",
      "baseURL": "https://api.cohere.ai/v1"
    },
    "together": {
      "apiKey": "${TOGETHER_API_KEY}",
      "baseURL": "https://api.together.xyz/v1"
    },
    "perplexity": {
      "apiKey": "${PERPLEXITY_API_KEY}",
      "baseURL": "https://api.perplexity.ai"
    },
    "vercel": {
      "apiKey": "${VERCEL_AI_KEY}",
      "baseURL": "https://api.vercel.ai/v1"
    },
    "gitlab": {
      "token": "${GITLAB_TOKEN}",
      "baseURL": "${GITLAB_URL}/api/v4"
    },
    "github": {
      "token": "${GITHUB_TOKEN}",
      "baseURL": "https://api.github.com"
    },
    "cloudflare": {
      "accountId": "${CF_ACCOUNT_ID}",
      "gatewayId": "${CF_GATEWAY_ID}",
      "token": "${CF_AI_TOKEN}"
    },
    "sap": {
      "serviceKey": "${AICORE_SERVICE_KEY}",
      "deploymentId": "${AICORE_DEPLOYMENT_ID}"
    },
    "ollama": {
      "baseURL": "http://localhost:11434/v1",
      "apiKey": "ollama"
    }
  },
  "agents": {
    "defaults": {
      "model": "anthropic/claude-sonnet-4-5",
      "temperature": 0.7,
      "maxTokens": 4096
    },
    "fast": {
      "model": "groq/llama-3.3-70b-versatile"
    },
    "coding": {
      "model": "anthropic/claude-sonnet-4-5"
    },
    "research": {
      "model": "perplexity/sonar-pro"
    },
    "local": {
      "model": "ollama/llama3.2:70b"
    }
  }
}
```
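The `${VAR}` placeholders in this config only work if each variable is actually exported. A small pre-flight check can surface missing keys before the agent starts; this is a sketch (the function name is ours) that assumes the config lives at `~/.config/claw/config.json`:

```shell
# Report which ${VAR} placeholders in a config file are unset in the
# current environment.
check_config_env() {
  grep -o '\${[A-Z_][A-Z0-9_]*}' "$1" 2>/dev/null | sort -u | while read -r ref; do
    # Strip the ${ } wrapper to get the bare variable name
    var="$(printf '%s' "$ref" | tr -d '${}')"
    if [ -n "$(printenv "$var")" ]; then
      echo "ok:      $var"
    else
      echo "MISSING: $var"
    fi
  done
}

check_config_env "$HOME/.config/claw/config.json"
```

Run it after editing the config; any `MISSING` line means that provider will fail to authenticate.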

## Custom Model Support

```json
{
  "customModels": {
    "my-fine-tuned-gpt": {
      "provider": "openai",
      "modelId": "ft:gpt-4o:my-org:custom:suffix",
      "displayName": "My Custom GPT-4o"
    },
    "local-llama": {
      "provider": "ollama",
      "modelId": "llama3.2:70b",
      "displayName": "Local Llama 3.2 70B"
    },
    "openrouter-custom": {
      "provider": "openrouter",
      "modelId": "custom-org/my-model",
      "displayName": "Custom via OpenRouter"
    }
  }
}
```

## Installation Commands

### OpenClaw

```bash
git clone https://github.com/openclaw/openclaw.git
cd openclaw && npm install && npm run setup
```

### NanoBot

```bash
pip install nanobot-ai
nanobot onboard
```

### PicoClaw

```bash
wget https://github.com/sipeed/picoclaw/releases/latest/download/picoclaw-linux-amd64
chmod +x picoclaw-linux-amd64 && sudo mv picoclaw-linux-amd64 /usr/local/bin/picoclaw
```

### ZeroClaw

```bash
wget https://github.com/zeroclaw-labs/zeroclaw/releases/latest/download/zeroclaw-linux-amd64
chmod +x zeroclaw-linux-amd64 && sudo mv zeroclaw-linux-amd64 /usr/local/bin/zeroclaw
```
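Binaries downloaded by hand are worth verifying before they land in `/usr/local/bin`. A sketch of a SHA-256 check (the function name is ours, and the digest below is a placeholder; take the real value from the project's release page):

```shell
# Compare a downloaded binary against a published SHA-256 digest and
# refuse to install on mismatch or missing file.
verify_sha256() {
  actual="$(sha256sum "$1" 2>/dev/null | awk '{print $1}')"
  [ "$actual" = "$2" ]
}

if verify_sha256 zeroclaw-linux-amd64 "<digest-from-release-page>"; then
  sudo mv zeroclaw-linux-amd64 /usr/local/bin/zeroclaw
else
  echo "checksum mismatch or file missing: refusing to install" >&2
fi
```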

## Security Hardening

```bash
# Secrets in environment variables
export ANTHROPIC_API_KEY="your-key"
export OPENAI_API_KEY="your-key"

# Restricted config permissions
chmod 600 ~/.config/claw/config.json
```

Systemd hardening (add to the service unit):

```ini
NoNewPrivileges=true
PrivateTmp=true
ProtectSystem=strict
```
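To make the `chmod 600` requirement enforceable rather than a one-off step, a startup check like the following can warn when the config has drifted. This is a sketch (the function name is ours; the `stat -f` fallback covers BSD/macOS):

```shell
# Print a file's permission bits in octal, portably across GNU and BSD stat.
file_mode() { stat -c '%a' "$1" 2>/dev/null || stat -f '%Lp' "$1"; }

config="$HOME/.config/claw/config.json"
if [ -f "$config" ] && [ "$(file_mode "$config")" != "600" ]; then
  echo "WARNING: $config should be mode 600 (run: chmod 600 $config)" >&2
fi
```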

## Brainstorm Session Topics

1. Use case: coding, research, productivity, automation?
2. Model selection: Claude, GPT, Gemini, local?
3. Integrations: Telegram, Discord, calendar, storage?
4. Deployment: local, VPS, cloud?
5. Custom agents: personality, memory, proactivity?