<div align="center">

# 🦞 Claw Setup

### Professional AI Agent Deployment Made Simple

**End-to-end setup of Claw platforms with 25+ AI providers, security hardening, and personal customization**

---

<p align="center">
  <a href="https://z.ai/subscribe?ic=R0K78RJKNW">
    <img src="https://img.shields.io/badge/Designed%20by-GLM%205%20Advanced%20Coding%20Model-blue?style=for-the-badge" alt="Designed by GLM 5">
  </a>
</p>

<p align="center">
  <i>✨ Autonomously developed by <a href="https://z.ai/subscribe?ic=R0K78RJKNW"><strong>GLM 5 Advanced Coding Model</strong></a></i>
</p>

<p align="center">
  <b>⚠️ Disclaimer: Test in a staging environment before using on any live system</b>
</p>

---

</div>
## Overview

Claw Setup handles the complete deployment of AI agent platforms with **25+ AI provider integrations** (OpenCode compatible).
```
┌─────────────────────────────────────────────────────────────────┐
│                       CLAW SETUP WORKFLOW                       │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│     Phase 1        Phase 2        Phase 3        Phase 4        │
│     ───────        ───────        ───────        ───────        │
│                                                                 │
│   ┌─────────┐    ┌─────────┐    ┌─────────┐    ┌─────────┐      │
│   │ SELECT  │───►│ INSTALL │───►│CUSTOMIZE│───►│ DEPLOY  │      │
│   │ Platform│    │& Secure │    │Providers│    │  & Run  │      │
│   └─────────┘    └─────────┘    └─────────┘    └─────────┘      │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘
```
## Platforms Supported

| Platform | Language | Memory | Startup | Best For |
|----------|----------|--------|---------|----------|
| **OpenClaw** | TypeScript | >1GB | ~500s | Full-featured, 1700+ plugins |
| **NanoBot** | Python | ~100MB | ~30s | Research, customization |
| **PicoClaw** | Go | <10MB | ~1s | Embedded, $10 hardware |
| **ZeroClaw** | Rust | <5MB | <10ms | Maximum performance |
| **NanoClaw** | TypeScript | ~50MB | ~5s | WhatsApp integration |
## AI Providers (25+ Supported)

### Tier 1: Major AI Labs

| Provider | Models | Features |
|----------|--------|----------|
| **Anthropic** | Claude 3.5/4/Opus | Extended thinking, PDF support |
| **OpenAI** | GPT-4o, o1, o3, GPT-5 | Function calling, structured output |
| **Google AI** | Gemini 2.5, Gemini 3 Pro | Multimodal, long context |
| **xAI** | Grok | Real-time data integration |
| **Mistral** | Mistral Large, Codestral | Code-focused models |
### Tier 2: Cloud Platforms

| Provider | Models | Features |
|----------|--------|----------|
| **Azure OpenAI** | GPT-5, GPT-4o Enterprise | Azure integration |
| **Google Vertex** | Claude, Gemini on GCP | Anthropic on Google |
| **Amazon Bedrock** | Nova, Claude, Llama 3 | AWS regional prefixes |
### Tier 3: Aggregators & Gateways

| Provider | Models | Features |
|----------|--------|----------|
| **OpenRouter** | 100+ models | Multi-provider gateway |
| **Vercel AI** | Multi-provider | Edge hosting, rate limiting |
| **Together AI** | Open source | Fine-tuning, hosting |
| **DeepInfra** | Open source | Cost-effective |
### Tier 4: Fast Inference

| Provider | Speed | Models |
|----------|-------|--------|
| **Groq** | Ultra-fast | Llama 3, Mixtral |
| **Cerebras** | Fastest | Llama 3 variants |
### Tier 5: Specialized

| Provider | Use Case |
|----------|----------|
| **Perplexity** | Web search integration |
| **Cohere** | Enterprise RAG |
| **GitLab Duo** | CI/CD integration |
| **GitHub Copilot** | IDE integration |
| **Cloudflare AI** | Gateway, rate limiting |
| **SAP AI Core** | SAP enterprise |
### Local/Self-Hosted

| Provider | Use Case |
|----------|----------|
| **Ollama** | Local model hosting |
| **LM Studio** | GUI local models |
| **vLLM** | High-performance serving |
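Before pointing an agent at a local backend, it helps to confirm the endpoint is actually reachable. A minimal sketch for Ollama's default port 11434 (LM Studio and vLLM expose their own OpenAI-compatible ports; adjust the URL accordingly):

```shell
# Probe the default Ollama endpoint; reports status either way instead
# of failing, so it is safe to run before any daemon is installed.
if curl -sf --max-time 2 http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "ollama: up"
else
  echo "ollama: down"
fi
```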
## Model Selection

**Option A: Fetch from Provider**

```bash
# Fetch available models
curl -s https://openrouter.ai/api/v1/models -H "Authorization: Bearer $KEY" | jq '.data[].id'
curl -s https://api.groq.com/openai/v1/models -H "Authorization: Bearer $KEY"
curl -s http://localhost:11434/api/tags  # Ollama
```
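The raw listings can be narrowed down with further `jq` filtering. A sketch using an inline sample payload in place of a live API response (`SAMPLE` mimics the `data[].id` shape returned by the OpenRouter call):

```shell
# SAMPLE stands in for a live /models response; the jq filter keeps
# only model IDs carrying one vendor's prefix.
SAMPLE='{"data":[{"id":"openai/gpt-4o"},{"id":"anthropic/claude-sonnet-4-5"}]}'
echo "$SAMPLE" | jq -r '.data[].id | select(startswith("anthropic/"))'
# → anthropic/claude-sonnet-4-5
```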
**Option B: Custom Model Input**

```json
{
  "provider": "openai",
  "modelId": "ft:gpt-4o:org:custom:suffix",
  "displayName": "My Fine-Tuned Model"
}
```
## Quick Start

```
"Setup OpenClaw with Anthropic and OpenAI providers"
"Install NanoBot with all available providers"
"Deploy ZeroClaw with Groq for fast inference"
"Configure Claw with local Ollama models"
```
## Configuration Example

```json
{
  "providers": {
    "anthropic": { "apiKey": "${ANTHROPIC_API_KEY}" },
    "openai": { "apiKey": "${OPENAI_API_KEY}" },
    "google": { "apiKey": "${GOOGLE_API_KEY}" },
    "openrouter": { "apiKey": "${OPENROUTER_API_KEY}" },
    "groq": { "apiKey": "${GROQ_API_KEY}" },
    "ollama": { "baseURL": "http://localhost:11434" }
  },
  "agents": {
    "defaults": { "model": "anthropic/claude-sonnet-4-5" },
    "fast": { "model": "groq/llama-3.3-70b-versatile" },
    "local": { "model": "ollama/llama3.3:70b" }
  }
}
```
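Since every key in the config is an environment-variable reference, a quick preflight check before launching an agent avoids failures at runtime. A sketch (variable names mirror the example above; extend the list to match your config):

```shell
# Report any referenced API keys that are not exported in the current
# shell; each missing name is printed on its own line.
for var in ANTHROPIC_API_KEY OPENAI_API_KEY GOOGLE_API_KEY OPENROUTER_API_KEY GROQ_API_KEY; do
  if ! printenv "$var" >/dev/null; then
    echo "missing: $var"
  fi
done
```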
## Security

- API keys via environment variables
- Restricted config permissions (chmod 600)
- Systemd hardening (NoNewPrivileges, PrivateTmp)
- Network binding to localhost
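The first two points can be sketched as a small setup step. The config path here is hypothetical (a temp directory for the demo); substitute your platform's real config location:

```shell
# Hypothetical config path for this sketch; substitute your platform's
# actual config location.
CONFIG="${TMPDIR:-/tmp}/claw-demo/providers.json"
mkdir -p "$(dirname "$CONFIG")"

# Keys stay as ${VAR} references and are resolved from the environment
# at runtime, so the file itself never contains a secret.
cat > "$CONFIG" <<'EOF'
{ "providers": { "anthropic": { "apiKey": "${ANTHROPIC_API_KEY}" } } }
EOF

# Owner-only read/write; nothing for group or world.
chmod 600 "$CONFIG"
```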
---

<p align="center">
  <a href="https://z.ai/subscribe?ic=R0K78RJKNW">Learn more about GLM 5 Advanced Coding Model</a>
</p>