Commit Graph

9 Commits

  • feat: Add Antigravity provider integration + fix native mode startup
    - Added Antigravity AI provider with Google OAuth authentication
    - New integration client (antigravity.ts) with automatic endpoint fallback
    - API routes for /api/antigravity/* (models, auth-status, test, chat)
    - AntigravitySettings.tsx for Advanced Settings panel
    - Updated session-api.ts and session-actions.ts for provider routing
    - Updated opencode.jsonc with Antigravity plugin and 11 models:
      - Gemini 3 Pro Low/High, Gemini 3 Flash
      - Claude Sonnet 4.5 (+ thinking variants)
      - Claude Opus 4.5 (+ thinking variants)
      - GPT-OSS 120B Medium
    
    - Fixed native mode startup error (the app was trying to launch __nomadarch_native__ as a binary)
    - Native mode workspaces now skip the binary launch and are immediately ready (sketch below)
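    A minimal sketch of the startup guard this fix describes, assuming a hypothetical
    startWorkspace helper; only the __nomadarch_native__ sentinel comes from the commit,
    and every other name is illustrative:

    ```typescript
    // Hypothetical startup guard: native-mode workspaces skip the binary launch entirely.
    const NATIVE_SENTINEL = "__nomadarch_native__";

    interface Workspace {
      id: string;
      binaryPath: string; // a real OpenCode binary path, or the native sentinel
    }

    async function startWorkspace(ws: Workspace): Promise<void> {
      if (ws.binaryPath === NATIVE_SENTINEL) {
        // Native mode: nothing to spawn, so the workspace is ready immediately.
        markReady(ws.id);
        return;
      }
      // Binary mode: spawn the OpenCode binary, then report readiness.
      await spawnBinary(ws.binaryPath);
      markReady(ws.id);
    }

    // Placeholders standing in for the real server-side implementations.
    declare function markReady(workspaceId: string): void;
    declare function spawnBinary(path: string): Promise<void>;
    ```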
  • v0.5.0: Binary-Free Mode - No OpenCode binary required
    Major Features:
    - Native session management without OpenCode binary
    - Provider routing: OpenCode Zen (free), Qwen OAuth, Z.AI
    - Streaming chat with tool execution loop
    - Mode detection API (/api/meta/mode; client sketch below)
    - MCP integration fix (resolved infinite loading)
    - NomadArch Native option in UI with comparison info
    
    🆓 Free Models (No API Key):
    - GPT-5 Nano (400K context)
    - Grok Code Fast 1 (256K context)
    - GLM-4.7 (205K context)
    - Doubao Seed Code (256K context)
    - Big Pickle (200K context)
    
    📦 New Files:
    - session-store.ts: Native session persistence
    - native-sessions.ts: REST API for sessions
    - lite-mode.ts: UI mode detection client
    - native-sessions.ts (UI): SolidJS store
    
    🔧 Updated:
    - All installers: Optional binary download
    - All launchers: Mode detection display
    - Binary selector: Added NomadArch Native option
    - README: Binary-Free Mode documentation
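    A sketch of what the lite-mode.ts mode-detection client might look like; the
    /api/meta/mode route is named above, while the response shape and the detectMode
    name are assumptions:

    ```typescript
    // Assumed mode-detection client (lite-mode.ts); the response shape is a guess.
    export type AppMode = "native" | "binary";

    export async function detectMode(baseUrl = ""): Promise<AppMode> {
      try {
        const res = await fetch(`${baseUrl}/api/meta/mode`);
        if (!res.ok) throw new Error(`mode check failed: ${res.status}`);
        const body = (await res.json()) as { mode?: string };
        return body.mode === "native" ? "native" : "binary";
      } catch {
        // If the endpoint is unreachable, fall back to the binary-backed mode.
        return "binary";
      }
    }
    ```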
  • Fix UI freeze by adding a yield to the SSE streaming loop
    Added a yield to the event loop (setTimeout 0) after processing each batch of SSE
    lines. This lets the main thread process UI updates and user interaction between
    streaming batches, preventing the UI from becoming completely unresponsive during
    rapid streaming (see the sketch below).
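    A simplified SSE read loop showing the kind of yield described above; only the
    setTimeout(0) yield per batch is taken from the commit, and the surrounding reader
    logic is illustrative:

    ```typescript
    async function consumeSse(res: Response, onLine: (line: string) => void): Promise<void> {
      const reader = res.body!.getReader();
      const decoder = new TextDecoder();
      let buffer = "";

      while (true) {
        const { value, done } = await reader.read();
        if (done) break;

        buffer += decoder.decode(value, { stream: true });
        const lines = buffer.split("\n");
        buffer = lines.pop() ?? ""; // keep the trailing partial line for the next chunk

        for (const line of lines) onLine(line);

        // Yield to the event loop after each batch so UI rendering and user input
        // are handled between streaming updates instead of being starved.
        await new Promise<void>((resolve) => setTimeout(resolve, 0));
      }
    }
    ```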
  • restore: bring back all custom UI enhancements from checkpoint
    Restored from commit 52be710 (checkpoint taken before Qwen OAuth and the todo roller):
    
    Enhanced UI Features:
    - SMART FIX button with AI code analysis
    - APEX (Autonomous Programming EXecution) mode
    - SHIELD (Auto-approval) mode
    - MULTIX MODE multi-task pipeline interface
    - Live streaming token counter
    - Thinking indicator with bouncing dots animation
    
    Components restored:
    - packages/ui/src/components/chat/multi-task-chat.tsx
    - packages/ui/src/components/instance/instance-shell2.tsx
    - packages/ui/src/components/settings/OllamaCloudSettings.tsx
    - packages/ui/src/components/settings/QwenCodeSettings.tsx
    - packages/ui/src/stores/solo-store.ts
    - packages/ui/src/stores/task-actions.ts
    - packages/ui/src/stores/session-events.ts (autonomous mode)
    - packages/server/src/integrations/ollama-cloud.ts
    - packages/server/src/server/routes/ollama.ts
    - packages/server/src/server/routes/qwen.ts
    
    This ensures all custom features are preserved in source control.
  • fix: restore complete source code and fix launchers
    - Copy complete source code packages from original CodeNomad project
    - Add root package.json with npm workspace configuration
    - Include electron-app, server, ui, tauri-app, and opencode-config packages
    - Fix Launch-Windows.bat and Launch-Dev-Windows.bat to work with correct npm scripts
    - Fix Launch-Unix.sh to work with correct npm scripts
    - Launchers now correctly call npm run dev:electron, which launches the Electron app