# 🦞 Claw Setup
### Professional AI Agent Deployment Made Simple
**End-to-end setup of Claw platforms with 25+ AI providers, security hardening, and personal customization**
---
✨ Autonomously developed by GLM 5 Advanced Coding Model
⚠️ Disclaimer: Validate in a test environment before using on any live system.
---
## Overview
Claw Setup handles the complete deployment of AI Agent platforms with **25+ AI provider integrations** (OpenCode-compatible).
```
┌─────────────────────────────────────────────────────────────────┐
│                       CLAW SETUP WORKFLOW                       │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│   Phase 1         Phase 2         Phase 3         Phase 4       │
│   ───────         ───────         ───────         ───────       │
│                                                                 │
│  ┌─────────┐     ┌─────────┐     ┌─────────┐     ┌─────────┐    │
│  │ SELECT  │────►│ INSTALL │────►│CUSTOMIZE│────►│ DEPLOY  │    │
│  │ Platform│     │& Secure │     │Providers│     │ & Run   │    │
│  └─────────┘     └─────────┘     └─────────┘     └─────────┘    │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```
## Platforms Supported
| Platform | Language | Memory | Startup | Best For |
|----------|----------|--------|---------|----------|
| **OpenClaw** | TypeScript | >1GB | ~500s | Full-featured, 1700+ plugins |
| **NanoBot** | Python | ~100MB | ~30s | Research, customization |
| **PicoClaw** | Go | <10MB | ~1s | Embedded, $10 hardware |
| **ZeroClaw** | Rust | <5MB | <10ms | Maximum performance |
| **NanoClaw** | TypeScript | ~50MB | ~5s | WhatsApp integration |
## AI Providers (25+ Supported)
### Tier 1: Major AI Labs
| Provider | Models | Features |
|----------|--------|----------|
| **Anthropic** | Claude 3.5/4/Opus | Extended thinking, PDF support |
| **OpenAI** | GPT-4o, o1, o3, GPT-5 | Function calling, structured output |
| **Google AI** | Gemini 2.5, Gemini 3 Pro | Multimodal, long context |
| **xAI** | Grok | Real-time data integration |
| **Mistral** | Mistral Large, Codestral | Code-focused models |
### Tier 2: Cloud Platforms
| Provider | Models | Features |
|----------|--------|----------|
| **Azure OpenAI** | GPT-5, GPT-4o Enterprise | Azure integration |
| **Google Vertex** | Claude, Gemini on GCP | Anthropic on Google |
| **Amazon Bedrock** | Nova, Claude, Llama 3 | AWS regional prefixes |
### Tier 3: Aggregators & Gateways
| Provider | Models | Features |
|----------|--------|----------|
| **OpenRouter** | 100+ models | Multi-provider gateway |
| **Vercel AI** | Multi-provider | Edge hosting, rate limiting |
| **Together AI** | Open source | Fine-tuning, hosting |
| **DeepInfra** | Open source | Cost-effective |
### Tier 4: Fast Inference
| Provider | Speed | Models |
|----------|-------|--------|
| **Groq** | Ultra-fast | Llama 3, Mixtral |
| **Cerebras** | Fastest | Llama 3 variants |
### Tier 5: Specialized
| Provider | Use Case |
|----------|----------|
| **Perplexity** | Web search integration |
| **Cohere** | Enterprise RAG |
| **GitLab Duo** | CI/CD integration |
| **GitHub Copilot** | IDE integration |
| **Cloudflare AI** | Gateway, rate limiting |
| **SAP AI Core** | SAP enterprise |
### Local/Self-Hosted
| Provider | Use Case |
|----------|----------|
| **Ollama** | Local model hosting |
| **LM Studio** | GUI local models |
| **vLLM** | High-performance serving |
## Model Selection
**Option A: Fetch from Provider**
```bash
# Fetch available models
curl -s https://openrouter.ai/api/v1/models -H "Authorization: Bearer $KEY" | jq '.data[].id'
curl -s https://api.groq.com/openai/v1/models -H "Authorization: Bearer $KEY" | jq '.data[].id'
curl -s http://localhost:11434/api/tags # Ollama
```
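Agent configs later in this README reference models as `provider/model` IDs. A minimal sketch of bridging the two (the `to_claw_ids` helper is hypothetical, not part of Claw; it assumes the OpenAI-style `{"data": [{"id": ...}]}` payload returned by the endpoints above):

```python
# Hypothetical helper: normalize an OpenAI-compatible /models response
# into the "provider/model" identifiers used in Claw agent configs.
def to_claw_ids(provider: str, models_response: dict) -> list[str]:
    """Map {"data": [{"id": ...}, ...]} to ["provider/id", ...]."""
    return [f"{provider}/{entry['id']}" for entry in models_response.get("data", [])]

# Example with a payload shaped like the Groq /models response
sample = {"data": [{"id": "llama-3.3-70b-versatile"}, {"id": "mixtral-8x7b-32768"}]}
print(to_claw_ids("groq", sample))
# → ['groq/llama-3.3-70b-versatile', 'groq/mixtral-8x7b-32768']
```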
**Option B: Custom Model Input**
```json
{
  "provider": "openai",
  "modelId": "ft:gpt-4o:org:custom:suffix",
  "displayName": "My Fine-Tuned Model"
}
```
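A typo in a hand-entered model ID only surfaces at request time, so it is worth checking the entry up front. A minimal sketch, assuming only the field names shown above (`provider`, `modelId`); Claw's real schema may require more:

```python
# Hypothetical validator for a custom model entry. Field names are
# assumed from the example above; Claw's actual schema may differ.
REQUIRED_FIELDS = {"provider", "modelId"}

def validate_custom_model(entry: dict) -> list[str]:
    """Return a list of problems; an empty list means the entry looks usable."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - entry.keys())]
    if "modelId" in entry and not entry["modelId"].strip():
        problems.append("modelId must be non-empty")
    return problems

entry = {"provider": "openai", "modelId": "ft:gpt-4o:org:custom:suffix"}
print(validate_custom_model(entry))  # → []
```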
## Quick Start
```
"Setup OpenClaw with Anthropic and OpenAI providers"
"Install NanoBot with all available providers"
"Deploy ZeroClaw with Groq for fast inference"
"Configure Claw with local Ollama models"
```
## Configuration Example
```json
{
  "providers": {
    "anthropic": { "apiKey": "${ANTHROPIC_API_KEY}" },
    "openai": { "apiKey": "${OPENAI_API_KEY}" },
    "google": { "apiKey": "${GOOGLE_API_KEY}" },
    "openrouter": { "apiKey": "${OPENROUTER_API_KEY}" },
    "groq": { "apiKey": "${GROQ_API_KEY}" },
    "ollama": { "baseURL": "http://localhost:11434" }
  },
  "agents": {
    "defaults": { "model": "anthropic/claude-sonnet-4-5" },
    "fast": { "model": "groq/llama-3.3-70b-versatile" },
    "local": { "model": "ollama/llama3.2:3b" }
  }
}
```
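The `${VAR}` placeholders above are resolved from the environment at load time, so keys never appear in the file itself. A minimal sketch of that expansion (how Claw actually resolves placeholders is an assumption here):

```python
import os
import re

# Sketch of "${VAR}" expansion as used in the config above.
# The exact resolution logic Claw uses is an assumption.
def expand_env(value: str) -> str:
    """Replace ${NAME} with its environment value, or raise KeyError if unset."""
    def repl(match: re.Match) -> str:
        name = match.group(1)
        if name not in os.environ:
            raise KeyError(f"required environment variable not set: {name}")
        return os.environ[name]
    return re.sub(r"\$\{([A-Z0-9_]+)\}", repl, value)

os.environ["ANTHROPIC_API_KEY"] = "sk-test"  # demo value only
print(expand_env("${ANTHROPIC_API_KEY}"))  # → sk-test
```

Failing fast on a missing variable is deliberate: a silently empty API key produces confusing auth errors much later.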
## Security
- API keys via environment variables
- Restricted config permissions (chmod 600)
- Systemd hardening (NoNewPrivileges, PrivateTmp)
- Network binding to localhost
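The "restricted config permissions" item above means the config file should be readable by its owner only. A small sketch of writing a config with owner-only (0600) permissions and verifying them (file path and contents are illustrative):

```python
import os
import stat
import tempfile

# Write a config file and restrict it to owner read/write only (chmod 600),
# matching the "restricted config permissions" hardening step above.
fd, path = tempfile.mkstemp(suffix=".json")
with os.fdopen(fd, "w") as f:
    f.write('{"providers": {}}')  # illustrative placeholder content
os.chmod(path, 0o600)

mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # → 0o600
```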
---