feat: Add Claw Setup skill for AI Agent deployment
End-to-end professional setup of AI Agent platforms:
- OpenClaw (full-featured, 215K stars)
- NanoBot (Python, lightweight)
- PicoClaw (Go, ultra-light)
- ZeroClaw (Rust, minimal)
- NanoClaw (WhatsApp focused)

Features:
- Platform selection with comparison
- Security hardening (secrets, network, systemd)
- Interactive brainstorming for customization
- AI provider configuration with 12+ providers
- Model fetching from provider APIs
- Custom model input support

Providers supported: Anthropic, OpenAI, Google, OpenRouter, Groq, Cerebras, Together AI, DeepSeek, Mistral, xAI, Ollama

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
skills/claw-setup/SKILL.md (new file, 464 lines)
---
name: claw-setup
description: Use this skill when the user asks to "setup openclaw", "install nanobot", "deploy zeroclaw", "configure picoclaw", "AI agent setup", "personal AI assistant", "claw framework", or mentions setting up any AI agent/assistant platform from the Claw family (OpenClaw, NanoBot, PicoClaw, ZeroClaw, NanoClaw).
version: 1.0.0
---

# Claw Setup Skill

End-to-end professional setup of AI Agent platforms from the Claw family with security hardening and personal customization through interactive brainstorming.

## Supported Platforms

| Platform | Language | Memory | Startup | Best For |
|----------|----------|--------|---------|----------|
| **OpenClaw** | TypeScript | >1GB | ~500s | Full-featured, plugin ecosystem |
| **NanoBot** | Python | ~100MB | ~30s | Research, easy customization |
| **PicoClaw** | Go | <10MB | ~1s | Low-resource, embedded |
| **ZeroClaw** | Rust | <5MB | <10ms | Maximum performance, security |
| **NanoClaw** | TypeScript | ~50MB | ~5s | WhatsApp integration |

## What This Skill Does

### Phase 1: Platform Selection
- Interactive comparison of all platforms
- Hardware requirements check
- Use case matching

### Phase 2: Secure Installation
- Clone from official GitHub repos
- Security hardening (secrets management, network isolation)
- Environment configuration
- API key setup with best practices

### Phase 3: Personal Customization
- Interactive brainstorming session
- Custom agent templates
- Integration setup (messaging, calendar, etc.)
- Memory and context configuration

### Phase 4: Verification & Deployment
- Health checks
- Test runs
- Production deployment options

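The Phase 4 health check can be sketched as a tiny script. The `/health` path and port 3000 are assumptions about the gateway's HTTP interface, not documented Claw behavior; adjust for your platform:

```shell
#!/bin/bash
# health-check.sh - minimal Phase 4 verification sketch.
# Assumes the gateway exposes an HTTP health endpoint on
# 127.0.0.1:3000 (hypothetical); adjust the URL for your setup.

check_health() {
  local url="$1"
  if curl -fsS --max-time 5 "$url" > /dev/null 2>&1; then
    echo "OK   $url"
  else
    echo "FAIL $url (gateway not responding)"
    return 1
  fi
}

# Example: check_health "http://127.0.0.1:3000/health"
```

A failing check usually means the gateway is not running or is bound to a different port; if you use the systemd unit below, `journalctl -u claw` is the next place to look.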
## GitHub Repositories

```
OpenClaw:  https://github.com/openclaw/openclaw
NanoBot:   https://github.com/HKUDS/nanobot
PicoClaw:  https://github.com/sipeed/picoclaw
ZeroClaw:  https://github.com/zeroclaw-labs/zeroclaw
NanoClaw:  https://github.com/nanoclaw/nanoclaw
```

## Usage Examples

```
"Setup OpenClaw on my server"
"I want to install NanoBot for personal use"
"Help me choose between ZeroClaw and PicoClaw"
"Deploy an AI assistant with security best practices"
"Setup Claw framework with my custom requirements"
```

## Installation Commands by Platform

### OpenClaw (Full Featured)
```bash
# Prerequisites
sudo apt install -y nodejs npm

# Clone and setup
git clone https://github.com/openclaw/openclaw.git
cd openclaw
npm install
npm run setup

# Configure
cp .env.example .env
# Edit .env with your API keys

# Run
npm run start
```

### NanoBot (Python Lightweight)
```bash
# Quick install
pip install nanobot-ai

# Or from source
git clone https://github.com/HKUDS/nanobot.git
cd nanobot
pip install -e .

# Setup
nanobot onboard
nanobot gateway
```

### PicoClaw (Go Ultra-Light)
```bash
# Download binary
wget https://github.com/sipeed/picoclaw/releases/latest/download/picoclaw-linux-amd64
chmod +x picoclaw-linux-amd64
sudo mv picoclaw-linux-amd64 /usr/local/bin/picoclaw

# Or build from source
git clone https://github.com/sipeed/picoclaw.git
cd picoclaw
go build -o picoclaw

# Run
picoclaw gateway
```

### ZeroClaw (Rust Minimal)
```bash
# Download binary
wget https://github.com/zeroclaw-labs/zeroclaw/releases/latest/download/zeroclaw-linux-amd64
chmod +x zeroclaw-linux-amd64
sudo mv zeroclaw-linux-amd64 /usr/local/bin/zeroclaw

# Or from source
git clone https://github.com/zeroclaw-labs/zeroclaw.git
cd zeroclaw
cargo build --release
# (built binary lands at target/release/zeroclaw)

# Run
zeroclaw gateway
```

## Security Hardening

### Secrets Management
```bash
# Never commit .env files
echo ".env" >> .gitignore
echo "*.pem" >> .gitignore

# Use environment variables
export ANTHROPIC_API_KEY="your-key"
export OPENROUTER_API_KEY="your-key"

# Or use secret files with restricted permissions
mkdir -p ~/.config/claw
cat > ~/.config/claw/config.json << 'CONFIG'
{
  "providers": {
    "openrouter": { "apiKey": "${OPENROUTER_API_KEY}" }
  }
}
CONFIG
chmod 600 ~/.config/claw/config.json
```

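The `chmod 600` step above is easy to forget on later edits; a small audit sketch can flag secret files that drifted to looser permissions. The helper name is illustrative and `stat -c` is GNU/Linux-specific:

```shell
#!/bin/bash
# audit-secrets.sh - permission audit sketch (helper name is
# illustrative, not part of any Claw platform).
# Flags secret files readable by group or other.

audit_perms() {
  local file="$1"
  [ -f "$file" ] || { echo "SKIP $file (missing)"; return 0; }
  local mode
  mode=$(stat -c '%a' "$file")   # GNU stat; use `stat -f '%Lp'` on BSD/macOS
  if [ "$mode" != "600" ] && [ "$mode" != "400" ]; then
    echo "INSECURE $file (mode $mode, expected 600)"
    return 1
  fi
  echo "OK $file (mode $mode)"
}

audit_perms ~/.config/claw/config.json || true
```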
### Network Security
```nginx
# Bind the gateway to localhost only; in its config, set:
#   "server": { "host": "127.0.0.1", "port": 3000 }

# Then use a reverse proxy for external access. nginx example:
server {
    listen 443 ssl;
    server_name claw.yourdomain.com;

    ssl_certificate /etc/letsencrypt/live/yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/yourdomain.com/privkey.pem;

    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```

### Systemd Service
```ini
# /etc/systemd/system/claw.service
[Unit]
Description=Claw AI Assistant
After=network.target

[Service]
Type=simple
User=claw
Group=claw
WorkingDirectory=/opt/claw
ExecStart=/usr/local/bin/claw gateway
Restart=on-failure
RestartSec=10

# Security hardening
NoNewPrivileges=true
PrivateTmp=true
ProtectSystem=strict
ProtectHome=true
ReadWritePaths=/opt/claw/data

[Install]
WantedBy=multi-user.target
```

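Installing and enabling the unit follows the usual systemd flow; a sketch with a dry-run mode (the `claw` user and `/opt/claw` paths match the unit above, everything else is standard systemd tooling):

```shell
#!/bin/bash
# install-service.sh - install and enable claw.service (sketch).
# Defaults to dry-run: prints commands instead of running them.
# Set DRY_RUN=0 to actually execute (requires sudo).

DRY_RUN="${DRY_RUN:-1}"

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "+ $*"
  else
    sudo "$@"
  fi
}

run useradd --system --home /opt/claw --shell /usr/sbin/nologin claw
run mkdir -p /opt/claw/data
run chown -R claw:claw /opt/claw
run cp claw.service /etc/systemd/system/claw.service
run systemctl daemon-reload
run systemctl enable --now claw.service
```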
## Brainstorm Session Topics

1. **Use Case Discovery**
   - What tasks should the AI handle?
   - Which platforms/channels to integrate?
   - Automation vs. interactive preferences?

2. **Model Selection**
   - Claude, GPT, Gemini, or local models?
   - Cost vs. performance tradeoffs?
   - Privacy requirements?

3. **Integration Planning**
   - Messaging: Telegram, Discord, WhatsApp, Slack?
   - Calendar: Google, Outlook, Apple?
   - Storage: Local, cloud, hybrid?
   - APIs to connect?

4. **Custom Agent Design**
   - Personality and tone?
   - Domain expertise areas?
   - Memory and context preferences?
   - Proactive vs. reactive behavior?

5. **Deployment Strategy**
   - Local machine, VPS, or cloud?
   - High availability requirements?
   - Backup and recovery needs?

## AI Provider Configuration

### Supported Providers

| Provider | Type | API Base | Models |
|----------|------|----------|--------|
| **Anthropic** | Direct | api.anthropic.com | Claude 3.5/4/Opus |
| **OpenAI** | Direct | api.openai.com | GPT-4, GPT-4o, o1, o3 |
| **Google** | Direct | generativelanguage.googleapis.com | Gemini 2.0/1.5 |
| **OpenRouter** | Gateway | openrouter.ai/api | 200+ models |
| **Together AI** | Direct | api.together.xyz | Llama, Mistral, Qwen |
| **Groq** | Direct | api.groq.com | Llama, Mixtral (fast) |
| **Cerebras** | Direct | api.cerebras.ai | Llama (fastest) |
| **DeepSeek** | Direct | api.deepseek.com | DeepSeek V3/R1 |
| **Mistral** | Direct | api.mistral.ai | Mistral, Codestral |
| **xAI** | Direct | api.x.ai | Grok |
| **Replicate** | Gateway | api.replicate.com | Various |
| **Local** | Self-hosted | localhost | Ollama, LM Studio |

### Fetch Available Models

```bash
# OpenRouter - List all models
curl -s https://openrouter.ai/api/v1/models \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" | jq '.data[].id'

# OpenAI - List models
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" | jq '.data[].id'

# Anthropic - Available models (static list)
# claude-opus-4-5-20250219
# claude-sonnet-4-5-20250219
# claude-3-5-sonnet-20241022
# claude-3-5-haiku-20241022

# Google Gemini
curl -s "https://generativelanguage.googleapis.com/v1/models?key=$GOOGLE_API_KEY" | jq '.models[].name'

# Groq - List models
curl -s https://api.groq.com/openai/v1/models \
  -H "Authorization: Bearer $GROQ_API_KEY" | jq '.data[].id'

# Together AI
curl -s https://api.together.xyz/v1/models \
  -H "Authorization: Bearer $TOGETHER_API_KEY" | jq '.data[].id'

# Ollama (local)
curl -s http://localhost:11434/api/tags | jq '.models[].name'
```

### Configuration Templates

#### Multi-Provider Config
```json
{
  "providers": {
    "anthropic": {
      "apiKey": "${ANTHROPIC_API_KEY}",
      "baseURL": "https://api.anthropic.com"
    },
    "openai": {
      "apiKey": "${OPENAI_API_KEY}",
      "baseURL": "https://api.openai.com/v1"
    },
    "google": {
      "apiKey": "${GOOGLE_API_KEY}",
      "baseURL": "https://generativelanguage.googleapis.com/v1"
    },
    "openrouter": {
      "apiKey": "${OPENROUTER_API_KEY}",
      "baseURL": "https://openrouter.ai/api/v1"
    },
    "groq": {
      "apiKey": "${GROQ_API_KEY}",
      "baseURL": "https://api.groq.com/openai/v1"
    },
    "together": {
      "apiKey": "${TOGETHER_API_KEY}",
      "baseURL": "https://api.together.xyz/v1"
    },
    "deepseek": {
      "apiKey": "${DEEPSEEK_API_KEY}",
      "baseURL": "https://api.deepseek.com/v1"
    },
    "mistral": {
      "apiKey": "${MISTRAL_API_KEY}",
      "baseURL": "https://api.mistral.ai/v1"
    },
    "xai": {
      "apiKey": "${XAI_API_KEY}",
      "baseURL": "https://api.x.ai/v1"
    },
    "ollama": {
      "baseURL": "http://localhost:11434/v1",
      "apiKey": "ollama"
    }
  },
  "agents": {
    "defaults": {
      "model": "anthropic/claude-sonnet-4-5",
      "temperature": 0.7,
      "maxTokens": 4096
    }
  }
}
```

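The `${VAR}` placeholders in the template above are not expanded by JSON itself. Unless the platform expands environment variables internally (check its docs), render the template at deploy time; a sketch using `sed` for a single key (`envsubst` from gettext is the more general tool):

```shell
#!/bin/bash
# render-config.sh - expand a ${OPENROUTER_API_KEY} placeholder.
# Sketch only; assumes the platform reads plain JSON and does not
# expand env vars itself. The key value here is a dummy.

export OPENROUTER_API_KEY="sk-or-example"

cat > config.template.json << 'EOF'
{ "providers": { "openrouter": { "apiKey": "${OPENROUTER_API_KEY}" } } }
EOF

# Substitute the literal ${OPENROUTER_API_KEY} token with its value
sed "s|\${OPENROUTER_API_KEY}|${OPENROUTER_API_KEY}|g" \
  config.template.json > config.json
chmod 600 config.json
```

Keep the template (with placeholders) in version control and the rendered `config.json` out of it, so the real key never lands in git.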
#### Custom Model Configuration
```json
{
  "customModels": {
    "my-fine-tuned-model": {
      "provider": "openai",
      "modelId": "ft:gpt-4o:my-org:custom:suffix",
      "displayName": "My Custom GPT-4o"
    },
    "local-llama": {
      "provider": "ollama",
      "modelId": "llama3.2:70b",
      "displayName": "Local Llama 3.2 70B"
    },
    "openrouter-model": {
      "provider": "openrouter",
      "modelId": "meta-llama/llama-3.3-70b-instruct",
      "displayName": "Llama 3.3 70B via OpenRouter"
    }
  }
}
```

### Provider Selection Flow

```
1. Ask user which providers they have API keys for:
   □ Anthropic (Claude)
   □ OpenAI (GPT)
   □ Google (Gemini)
   □ OpenRouter (Multi-model)
   □ Together AI
   □ Groq (Fast inference)
   □ Cerebras (Fastest)
   □ DeepSeek
   □ Mistral
   □ xAI (Grok)
   □ Local (Ollama/LM Studio)

2. For each selected provider:
   - Prompt for API key
   - Fetch available models (if the API supports listing)
   - Let user select or input a custom model

3. Generate secure configuration:
   - Store keys in environment variables
   - Create config.json with model selections
   - Set up key rotation reminders

4. Test connectivity:
   - Send a test prompt to each configured provider
   - Verify the response
```

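Step 2's "fetch available models, let user select" can be wired into a simple menu. A sketch that parses the OpenAI-style `/v1/models` response shape; the fixture JSON below is illustrative data standing in for a real `curl` call (see "Fetch Available Models"):

```shell
#!/bin/bash
# select-model.sh - turn a /v1/models response into a numbered menu.
# The canned response below is a stand-in for a live API call;
# the model ids are examples, not guaranteed to exist.

models_json='{"data":[{"id":"llama-3.3-70b"},{"id":"mixtral-8x7b"},{"id":"gemma2-9b"}]}'

# Extract model ids into a bash array
mapfile -t models < <(echo "$models_json" | jq -r '.data[].id')

# Present a numbered menu
for i in "${!models[@]}"; do
  printf '%2d) %s\n' "$((i + 1))" "${models[$i]}"
done

# Non-interactive default: first entry (swap for `read -p` in practice)
selected="${models[0]}"
echo "Selected model: $selected"
```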
### Model Fetching Script

```bash
#!/bin/bash
# fetch-models.sh - Fetch available models from providers

echo "=== AI Provider Model Fetcher ==="

# OpenRouter
if [ -n "$OPENROUTER_API_KEY" ]; then
  echo -e "\n📦 OpenRouter Models:"
  curl -s https://openrouter.ai/api/v1/models \
    -H "Authorization: Bearer $OPENROUTER_API_KEY" | \
    jq -r '.data[] | "  • \(.id) - \(.name // .id)"' | head -20
fi

# OpenAI
if [ -n "$OPENAI_API_KEY" ]; then
  echo -e "\n📦 OpenAI Models:"
  curl -s https://api.openai.com/v1/models \
    -H "Authorization: Bearer $OPENAI_API_KEY" | \
    jq -r '.data[] | select(.id | contains("gpt")) | "  • \(.id)"' | sort -u
fi

# Groq
if [ -n "$GROQ_API_KEY" ]; then
  echo -e "\n📦 Groq Models:"
  curl -s https://api.groq.com/openai/v1/models \
    -H "Authorization: Bearer $GROQ_API_KEY" | \
    jq -r '.data[].id' | sed 's/^/  • /'
fi

# Ollama (local) - check the curl itself so the fallback actually
# fires when the daemon is down (a bare `|| echo` after sed never would)
echo -e "\n📦 Ollama Models (local):"
if ollama_tags=$(curl -s --max-time 2 http://localhost:11434/api/tags 2>/dev/null); then
  echo "$ollama_tags" | jq -r '.models[].name' | sed 's/^/  • /'
else
  echo "  Ollama not running"
fi

# Together AI
if [ -n "$TOGETHER_API_KEY" ]; then
  echo -e "\n📦 Together AI Models:"
  curl -s https://api.together.xyz/v1/models \
    -H "Authorization: Bearer $TOGETHER_API_KEY" | \
    jq -r '.data[].id' | head -20 | sed 's/^/  • /'
fi

echo -e "\n✅ Model fetch complete"
```

### Custom Model Input

When the user selects "Custom Model", prompt for:
1. **Provider**: Which provider hosts this model
2. **Model ID**: Exact model identifier
3. **Display Name**: Friendly name for UI
4. **Context Window**: Max tokens (optional)
5. **Capabilities**: Text, vision, code, etc. (optional)

Example custom model entry:
```json
{
  "provider": "openrouter",
  "modelId": "custom-org/my-fine-tuned-v2",
  "displayName": "My Fine-Tuned Model v2",
  "contextWindow": 128000,
  "capabilities": ["text", "code"]
}
```
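The prompted answers can be assembled into a valid entry with `jq` rather than string concatenation, which keeps quoting correct. A sketch; the function name is illustrative, not part of any Claw platform:

```shell
#!/bin/bash
# make-model-entry.sh - build a custom model JSON entry from answers.
# Usage: make_model_entry <provider> <modelId> <displayName> [contextWindow]

make_model_entry() {
  local provider="$1" model_id="$2" display_name="$3" context_window="${4:-null}"
  jq -n \
    --arg provider "$provider" \
    --arg modelId "$model_id" \
    --arg displayName "$display_name" \
    --argjson contextWindow "$context_window" \
    '{provider: $provider, modelId: $modelId, displayName: $displayName}
     + (if $contextWindow != null then {contextWindow: $contextWindow} else {} end)'
}

# Example (matches the entry above):
make_model_entry openrouter "custom-org/my-fine-tuned-v2" "My Fine-Tuned Model v2" 128000
```

In practice the arguments would come from `read -p` prompts during the interactive session; optional fields are simply omitted when unanswered.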