Commit Graph

10 Commits

  • docs: add Qwen OAuth auth URL (portal.qwen.ai)
    - Add browser auth URL for re-authentication
    - Clarify auth vs refresh vs API endpoints
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • docs: set qwen as default provider with coder-model
    - Mark Qwen OAuth as recommended default provider
    - Update model reference to coder-model (qwen3-coder-plus)
    - Add default provider setup instructions
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • feat: use correct DashScope API endpoint for Qwen OAuth
    Based on a study of the ZeroClaw implementation:
    - Change API endpoint from api.qwen.ai to dashscope.aliyuncs.com/compatible-mode/v1
    - Update credentials file reference to oauth_creds.json
    - Add ZeroClaw native qwen-oauth provider documentation
    - Add API endpoints and models reference table
    - Update import script with correct endpoint and platform support
    - Add PicoClaw and NanoClaw platform configurations
    
    Key findings from ZeroClaw binary:
    - Native qwen-oauth provider with auto token refresh
    - Uses DashScope OpenAI-compatible endpoint
    - Reads ~/.qwen/oauth_creds.json directly
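    The findings above can be sketched as a shell fragment. This is
    illustrative only: the "access_token" field name and the sed-based
    parsing are assumptions, not a verified schema; a real script should
    use jq.

```shell
# Illustrative sketch: extract the Qwen OAuth access token and point an
# OpenAI-compatible client at the DashScope endpoint named above.
# An inline sample stands in for ~/.qwen/oauth_creds.json; in practice:
#   CREDS="$(cat ~/.qwen/oauth_creds.json)"
CREDS='{"access_token": "qwen-example-token", "refresh_token": "qwen-example-refresh"}'
# sed-based JSON parsing is fragile; prefer jq in a real script.
TOKEN="$(printf '%s' "$CREDS" | sed -n 's/.*"access_token": *"\([^"]*\)".*/\1/p')"
export OPENAI_API_KEY="$TOKEN"
export OPENAI_BASE_URL="https://dashscope.aliyuncs.com/compatible-mode/v1"
export OPENAI_MODEL="qwen3-coder-plus"
```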
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • docs: clarify ZeroClaw native qwen-oauth vs OpenAI-compatible import
    - Document ZeroClaw's native qwen-oauth provider with auto token refresh
    - Explain two import methods: Native vs OpenAI-compatible
    - Add OAuth credentials structure documentation
    - Add comparison table showing feature differences
    - Update platform table to show ZeroClaw has Native (not Import) OAuth support
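    For illustration, the OAuth credentials file documented above might
    look like the following. Field names and the millisecond expiry
    format are assumptions based on common OAuth token layouts, not a
    verified schema:

```json
{
  "access_token": "…",
  "refresh_token": "…",
  "token_type": "Bearer",
  "expiry_date": 1735689600000
}
```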
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • docs: Comprehensive claw-setup skill documentation
    Added complete documentation covering all features:
    
    FEATURES DOCUMENTED:
    1. FREE Qwen OAuth Cross-Platform Import
       - 2,000 requests/day free tier
       - Works with ALL Claw platforms
       - Platform-specific import guides
    
    2. 25+ OpenCode-Compatible AI Providers
       - Tier 1: FREE (Qwen OAuth)
       - Tier 2: Major Labs (Anthropic, OpenAI, Google, xAI, Mistral)
       - Tier 3: Cloud (Azure, Bedrock, Vertex)
       - Tier 4: Gateways (OpenRouter 100+, Together AI)
       - Tier 5: Fast (Groq, Cerebras)
       - Tier 6: Specialized (Perplexity, Cohere, GitLab)
       - Tier 7: Local (Ollama, LM Studio, vLLM)
    
    3. Customization Options
       - Model selection (fetch or custom)
       - Security hardening
       - Interactive brainstorming
       - Multi-provider configuration
    
    4. Installation Guides
       - All 6 platforms with step-by-step instructions
    
    5. Configuration Examples
       - Multi-provider setup
       - Environment variables
       - Custom models
    
    6. Usage Examples
       - Basic, advanced, and provider-specific
    
    7. Troubleshooting
       - Common issues and solutions
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • docs: Comprehensive documentation for 25+ providers + Qwen OAuth
    Restructured documentation to highlight both key features:
    
    FEATURE 1: Qwen OAuth Cross-Platform Import (FREE)
    - 2,000 requests/day free tier
    - Works with ALL Claw platforms
    - Browser OAuth via qwen.ai
    - Model: Qwen3-Coder
    
    FEATURE 2: 25+ OpenCode-Compatible Providers
    - Major AI Labs: Anthropic, OpenAI, Google, xAI, Mistral
    - Cloud Platforms: Azure, AWS Bedrock, Google Vertex
    - Fast Inference: Groq, Cerebras
    - Gateways: OpenRouter (100+ models), Together AI
    - Local: Ollama, LM Studio, vLLM
    
    Provider Tiers:
    1. FREE: Qwen OAuth
    2. Major Labs: Anthropic, OpenAI, Google, xAI, Mistral
    3. Cloud: Azure, Bedrock, Vertex
    4. Fast: Groq, Cerebras
    5. Gateways: OpenRouter, Together AI, Vercel
    6. Specialized: Perplexity, Cohere, GitLab, GitHub
    7. Local: Ollama, LM Studio, vLLM
    
    Platforms with full support:
    - Qwen Code (native OAuth)
    - OpenClaw, NanoBot, PicoClaw, ZeroClaw (import OAuth)
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • feat: Add Qwen OAuth cross-platform import for ALL Claw platforms
    Key Feature: Use FREE Qwen tier (2,000 req/day) with ANY platform!
    
    How it works:
    1. Get Qwen OAuth: qwen && /auth (FREE)
    2. Extract token from ~/.qwen/
    3. Configure any platform with token
    
    Supported platforms:
    - OpenClaw
    - NanoBot
    - PicoClaw
    - ZeroClaw
    - NanoClaw
    
    Configuration:
      export OPENAI_API_KEY="$QWEN_TOKEN"
      export OPENAI_BASE_URL="https://api.qwen.ai/v1"
      export OPENAI_MODEL="qwen3-coder-plus"
    
    Added:
    - import-qwen-oauth.sh script for automation
    - Cross-platform configuration examples
    - Qwen API endpoints reference
    - Troubleshooting guide
    
    Free tier: 2,000 requests/day, 60 requests/minute
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • feat: Add Qwen Code with FREE OAuth tier (2,000 requests/day)
    New platform option with no API key required:
    
    Qwen Code Features:
    - FREE OAuth tier: 2,000 requests/day
    - Model: Qwen3-Coder (coder-model)
    - Auth: Browser OAuth via qwen.ai
    - GitHub: https://github.com/QwenLM/qwen-code
    
    Installation:
      npm install -g @qwen-code/qwen-code@latest
      qwen
      /auth  # Select Qwen OAuth
    
    Platform comparison updated:
    - Qwen Code: FREE, ~200MB, coding-optimized
    - OpenClaw: Full-featured, 1700+ plugins
    - NanoBot: Python, research
    - PicoClaw: Go, <10MB
    - ZeroClaw: Rust, <5MB
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • feat: Add all 25+ OpenCode-compatible AI providers to Claw Setup
    Updated provider support to match OpenCode's full provider list:
    
    Built-in Providers (18):
    - Anthropic, OpenAI, Azure OpenAI
    - Google AI, Google Vertex AI
    - Amazon Bedrock
    - OpenRouter, xAI, Mistral
    - Groq, Cerebras, DeepInfra
    - Cohere, Together AI, Perplexity
    - Vercel AI, GitLab, GitHub Copilot
    
    Custom Loader Providers:
    - GitHub Copilot Enterprise
    - Google Vertex Anthropic
    - Azure Cognitive Services
    - Cloudflare AI Gateway
    - SAP AI Core
    
    Local/Self-Hosted:
    - Ollama, LM Studio, vLLM
    
    Features:
    - Model fetching from provider APIs
    - Custom model input support
    - Multi-provider configuration
    - Environment variable security
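    The "environment variable security" point can be sketched like this.
    Paths and variable names are illustrative, not taken from the skill
    itself:

```shell
# Sketch: keep secrets in a permission-restricted file and source it into
# the environment instead of hardcoding keys in configs.
SECRETS="$(mktemp)"           # mktemp creates the file with mode 600
printf 'ANTHROPIC_API_KEY=%s\n' "example-key" > "$SECRETS"
set -a; . "$SECRETS"; set +a  # export every variable defined in the file
rm -f "$SECRETS"              # remove the on-disk copy once loaded
```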
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
  • feat: Add Claw Setup skill for AI Agent deployment
    End-to-end professional setup of AI Agent platforms:
    - OpenClaw (full-featured, 215K stars)
    - NanoBot (Python, lightweight)
    - PicoClaw (Go, ultra-light)
    - ZeroClaw (Rust, minimal)
    - NanoClaw (WhatsApp focused)
    
    Features:
    - Platform selection with comparison
    - Security hardening (secrets, network, systemd)
    - Interactive brainstorming for customization
    - AI provider configuration with 12+ providers
    - Model fetching from provider APIs
    - Custom model input support
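    The "model fetching from provider APIs" feature above can be sketched
    as follows. The base URL is a placeholder and the /models path follows
    the OpenAI-compatible convention; neither is taken from the skill
    itself:

```shell
# Sketch: compose the model-listing URL for an OpenAI-compatible provider.
PROVIDER_BASE_URL="https://api.example.com/v1"
MODELS_URL="$PROVIDER_BASE_URL/models"
# A real fetch would be:
#   curl -s -H "Authorization: Bearer $PROVIDER_API_KEY" "$MODELS_URL"
echo "$MODELS_URL"
```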
    
    Providers supported:
    Anthropic, OpenAI, Google, OpenRouter, Groq,
    Cerebras, Together AI, DeepSeek, Mistral, xAI, Ollama
    
    Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>