26 Commits

  • fix: resolve Antigravity token definition error and add validation
    Changes:
    1. Exported getStoredAntigravityToken and isAntigravityTokenValid from session-api.ts
    2. Imported token helpers into session-actions.ts
    3. Added token validation and user notifications to streamAntigravityChat (see the sketch below)
    4. Fixed TypeScript implicit any error in fetchAntigravityProvider
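    A minimal sketch of that guard, assuming the helpers exported from session-api.ts take the shapes shown here; the notify callback and the chat route payload are placeholders:

    ```ts
    import { getStoredAntigravityToken, isAntigravityTokenValid } from "./session-api";

    // Hypothetical notification hook; the real UI wiring may differ.
    type Notify = (message: string) => void;

    export async function streamAntigravityChat(prompt: string, notify: Notify) {
      const token = getStoredAntigravityToken();

      // Guard: never start a stream without a valid token, and tell the user why.
      if (!token || !isAntigravityTokenValid(token)) {
        notify("Antigravity is not authenticated. Sign in from Advanced Settings first.");
        return null;
      }

      const response = await fetch("/api/antigravity/chat", {
        method: "POST",
        headers: { "Content-Type": "application/json", Authorization: `Bearer ${token}` },
        body: JSON.stringify({ prompt }),
      });

      if (!response.ok || !response.body) {
        notify(`Antigravity chat failed with status ${response.status}`);
        return null;
      }

      return response.body; // the caller consumes the stream
    }
    ```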
  • fix: prevent duplicate AI models in selector and fix TypeScript errors
    Changes:
    1. Enhanced removeDuplicateProviders() to drop duplicate SDK providers when the same
       provider already exists in extras (qwen-oauth, zai, ollama-cloud, antigravity); see the sketch below
    2. Added logic to remove any Qwen-related SDK providers when qwen-oauth is authenticated
    3. Fixed missing setActiveParentSession import in instance-shell2.tsx
    
    These changes ensure:
    - No duplicate models appear in the model selector
    - Qwen OAuth models no longer duplicate any SDK Qwen providers
    - TypeScript compilation passes successfully
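    A sketch of the dedup pass, assuming providers are plain records with an id; the real removeDuplicateProviders() signature may differ:

    ```ts
    interface Provider {
      id: string; // e.g. "qwen-oauth", "zai", "ollama-cloud", "antigravity"
      name: string;
    }

    export function removeDuplicateProviders(
      sdkProviders: Provider[],
      extraProviders: Provider[],
      qwenOauthAuthenticated: boolean,
    ): Provider[] {
      const extraIds = new Set(extraProviders.map((p) => p.id));

      const dedupedSdk = sdkProviders.filter((provider) => {
        // Drop SDK entries that the extras already supply.
        if (extraIds.has(provider.id)) return false;
        // Drop any Qwen-related SDK provider once Qwen OAuth is authenticated.
        if (qwenOauthAuthenticated && provider.id.toLowerCase().includes("qwen")) return false;
        return true;
      });

      // Extras win; SDK providers only fill the gaps.
      return [...extraProviders, ...dedupedSdk];
    }
    ```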
  • fix: complete session persistence overhaul (Codex 5.2)
    1. Implemented auto-selection of tasks in MultiXV2 to prevent empty initial state.
    2. Added force-loading logic for task session messages, with debouncing (see the sketch below).
    3. Updated session-actions to return full assistant text and immediately persist native messages.
    4. Fixed caching logic in instance-shell2 to retain active task sessions in memory.
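    A sketch of the debounced force-loading, with a hypothetical loader callback standing in for the real store action:

    ```ts
    type LoadMessages = (sessionId: string) => Promise<void>;

    // Coalesces rapid task switches into one message load per session.
    export function createDebouncedSessionLoader(load: LoadMessages, delayMs = 300) {
      const timers = new Map<string, ReturnType<typeof setTimeout>>();

      return (sessionId: string) => {
        const pending = timers.get(sessionId);
        if (pending) clearTimeout(pending);

        timers.set(
          sessionId,
          setTimeout(() => {
            timers.delete(sessionId);
            void load(sessionId);
          }, delayMs),
        );
      };
    }
    ```

    The auto-selection path would call the returned function on every task change; only the last call inside the debounce window actually fetches messages.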
  • feat: Add Google Device Authorization Flow for Antigravity native mode
    - Implemented proper OAuth device flow using gcloud CLI client ID
    - Added /api/antigravity/device-auth/start endpoint
    - Added /api/antigravity/device-auth/poll endpoint to check authorization status
    - Added /api/antigravity/device-auth/refresh for token renewal
    - Updated AntigravitySettings UI with user code display
    - Auto-opens the Google sign-in page and polls until authorization completes (see the flow sketch below)
    - Seamless authentication experience matching SDK mode
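    A sketch of the client-side flow against the three routes; the response shapes here are assumptions, not the actual payloads:

    ```ts
    interface DeviceAuthStart {
      userCode: string;
      verificationUrl: string;
      interval: number; // seconds between polls (RFC 8628)
    }

    interface DeviceAuthPoll {
      status: "pending" | "complete" | "error";
    }

    export async function signInWithDeviceFlow(): Promise<void> {
      const start: DeviceAuthStart = await fetch("/api/antigravity/device-auth/start", {
        method: "POST",
      }).then((r) => r.json());

      // Show the user code and open Google's sign-in page.
      window.open(start.verificationUrl, "_blank");
      console.info(`Enter code ${start.userCode} to authorize Antigravity`);

      // Poll until Google reports the authorization as complete.
      for (;;) {
        await new Promise((resolve) => setTimeout(resolve, start.interval * 1000));
        const poll: DeviceAuthPoll = await fetch("/api/antigravity/device-auth/poll", {
          method: "POST",
        }).then((r) => r.json());

        if (poll.status === "complete") return;
        if (poll.status === "error") throw new Error("Device authorization failed");
      }
    }
    ```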
  • feat: Add Antigravity provider integration + fix native mode startup
    - Added Antigravity AI provider with Google OAuth authentication
    - New integration client (antigravity.ts) with automatic endpoint fallback (see the sketch below)
    - API routes for /api/antigravity/* (models, auth-status, test, chat)
    - AntigravitySettings.tsx for Advanced Settings panel
    - Updated session-api.ts and session-actions.ts for provider routing
    - Updated opencode.jsonc with Antigravity plugin and 11 models:
      - Gemini 3 Pro Low/High, Gemini 3 Flash
      - Claude Sonnet 4.5 (+ thinking variants)
      - Claude Opus 4.5 (+ thinking variants)
      - GPT-OSS 120B Medium
    
    - Fixed native mode startup error (was trying to launch __nomadarch_native__ as binary)
    - Native mode workspaces now skip binary launch and are immediately ready
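    A sketch of the automatic endpoint fallback in the integration client; the endpoint URLs are placeholders, not the addresses the client actually uses:

    ```ts
    // Placeholder endpoints; the real list lives in antigravity.ts.
    const CANDIDATE_ENDPOINTS = [
      "https://primary.example.invalid/v1",
      "https://backup.example.invalid/v1",
    ];

    export async function requestWithFallback(path: string, init: RequestInit): Promise<Response> {
      let lastError: unknown;

      for (const base of CANDIDATE_ENDPOINTS) {
        try {
          const response = await fetch(`${base}${path}`, init);
          // Only fall through to the next endpoint on server-side failures.
          if (response.status < 500) return response;
          lastError = new Error(`${base} returned ${response.status}`);
        } catch (error) {
          lastError = error; // network error: try the next endpoint
        }
      }

      throw lastError instanceof Error ? lastError : new Error("All Antigravity endpoints failed");
    }
    ```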
  • v0.5.0: Binary-Free Mode - No OpenCode binary required
    Major Features:
    - Native session management without OpenCode binary
    - Provider routing: OpenCode Zen (free), Qwen OAuth, Z.AI
    - Streaming chat with tool execution loop
    - Mode detection API (/api/meta/mode)
    - MCP integration fix (resolved infinite loading)
    - NomadArch Native option in UI with comparison info
    
    🆓 Free Models (No API Key):
    - GPT-5 Nano (400K context)
    - Grok Code Fast 1 (256K context)
    - GLM-4.7 (205K context)
    - Doubao Seed Code (256K context)
    - Big Pickle (200K context)
    
    📦 New Files:
    - session-store.ts: Native session persistence
    - native-sessions.ts: REST API for sessions
    - lite-mode.ts: UI mode detection client (see the sketch below)
    - native-sessions.ts (UI): SolidJS store
    
    🔧 Updated:
    - All installers: Optional binary download
    - All launchers: Mode detection display
    - Binary selector: Added NomadArch Native option
    - README: Binary-Free Mode documentation
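    A sketch of the UI-side mode detection against /api/meta/mode; the response shape and the fallback value are assumptions:

    ```ts
    export type RuntimeMode = "binary" | "native";

    let cachedMode: RuntimeMode | null = null;

    // Ask the server once whether the OpenCode binary is in use or the instance
    // is running binary-free, then cache the answer for the rest of the session.
    export async function detectRuntimeMode(): Promise<RuntimeMode> {
      if (cachedMode) return cachedMode;

      try {
        const response = await fetch("/api/meta/mode");
        const payload = (await response.json()) as { mode: RuntimeMode };
        cachedMode = payload.mode;
      } catch {
        cachedMode = "native"; // assumed default when the endpoint is unreachable
      }

      return cachedMode;
    }
    ```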
  • Integrate Context-Engine RAG service for enhanced LLM responses
    Backend:
    - Created context-engine/client.ts - HTTP client for Context-Engine API
    - Created context-engine/service.ts - Lifecycle management of Context-Engine sidecar
    - Created context-engine/index.ts - Module exports
    - Created server/routes/context-engine.ts - API endpoints for status/health/query
    
    Integration:
    - workspaces/manager.ts: Trigger indexing when workspace becomes ready (non-blocking)
    - index.ts: Initialize ContextEngineService on server start (lazy mode)
    - ollama-cloud.ts: Inject RAG context into chat requests when available (see the sketch below)
    
    Frontend:
    - model-selector.tsx: Added Context-Engine status indicator
      - Green dot = Ready (RAG enabled)
      - Blue pulsing dot = Indexing
      - Red dot = Error
      - Hidden when Context-Engine not running
    
    All operations are non-blocking with graceful fallback when Context-Engine is unavailable.
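    A sketch of the non-blocking RAG injection in ollama-cloud.ts, with a hypothetical client interface standing in for context-engine/client.ts:

    ```ts
    interface ContextEngineClient {
      isReady(): Promise<boolean>;
      query(prompt: string, topK: number): Promise<string[]>;
    }

    // Prepend retrieved snippets when the Context-Engine is ready; otherwise
    // return the prompt untouched so chat never blocks on RAG failures.
    export async function withRagContext(client: ContextEngineClient, prompt: string): Promise<string> {
      try {
        if (!(await client.isReady())) return prompt;

        const snippets = await client.query(prompt, 5);
        if (snippets.length === 0) return prompt;

        return [
          "Relevant workspace context:",
          ...snippets.map((snippet, i) => `[${i + 1}] ${snippet}`),
          "",
          prompt,
        ].join("\n");
      } catch {
        return prompt; // graceful fallback
      }
    }
    ```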
  • Add server-side timeout handling to Ollama Cloud streaming
    - Added a 60-second per-chunk timeout in parseStreamingResponse
    - Added a 120-second timeout to makeRequest via AbortController (see the sketch below)
    - This prevents the server from hanging indefinitely on a slow or unresponsive API
    
    This should fix the UI freeze when sending messages to Ollama Cloud models.
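    A sketch of both timeouts, assuming a plain fetch-based makeRequest and an async-iterator parseStreamingResponse; the real implementations may be shaped differently:

    ```ts
    const REQUEST_TIMEOUT_MS = 120_000; // whole-request budget
    const CHUNK_TIMEOUT_MS = 60_000;    // per-chunk budget

    export async function makeRequest(url: string, init: RequestInit): Promise<Response> {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), REQUEST_TIMEOUT_MS);
      try {
        return await fetch(url, { ...init, signal: controller.signal });
      } finally {
        clearTimeout(timer);
      }
    }

    export async function* parseStreamingResponse(body: ReadableStream<Uint8Array>) {
      const reader = body.getReader();
      const decoder = new TextDecoder();

      for (;;) {
        let timer: ReturnType<typeof setTimeout> | undefined;
        const timeout = new Promise<never>((_, reject) => {
          timer = setTimeout(() => reject(new Error("Ollama Cloud chunk timeout")), CHUNK_TIMEOUT_MS);
        });

        try {
          // Give up if the next chunk does not arrive within the per-chunk window.
          const { value, done } = await Promise.race([reader.read(), timeout]);
          if (done) return;
          yield decoder.decode(value, { stream: true });
        } finally {
          clearTimeout(timer);
        }
      }
    }
    ```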
  • feat: integrate Z.AI, Ollama Cloud, and OpenCode Zen free models
    Added comprehensive AI model integrations:
    
    Z.AI Integration:
    - Client with Anthropic-compatible API (GLM Coding Plan); see the request sketch below
    - Routes for config, testing, and streaming chat
    - Settings UI component with API key management
    
    OpenCode Zen Integration:
    - Free models client using 'public' API key
    - Dynamic model fetching from models.dev
    - Supports GPT-5 Nano, Big Pickle, Grok Code Fast 1, MiniMax M2.1
    - No API key required for free tier!
    
    UI Enhancements:
    - Added Free Models tab (first position) in Advanced Settings
    - Z.AI tab with GLM Coding Plan info
    - OpenCode Zen settings with model cards and status
    
    All integrations work standalone without opencode.exe dependency.
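    A sketch of the Z.AI client's Anthropic-compatible request; the base URL is a placeholder and the header and body details are assumptions based on the Anthropic Messages format:

    ```ts
    // Placeholder base URL; the real GLM Coding Plan endpoint is configured in the Z.AI client.
    const ZAI_BASE_URL = "https://zai.example.invalid/anthropic";

    interface ChatMessage {
      role: "user" | "assistant";
      content: string;
    }

    export async function zaiChat(apiKey: string, model: string, messages: ChatMessage[]) {
      const response = await fetch(`${ZAI_BASE_URL}/v1/messages`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          "x-api-key": apiKey,               // Anthropic-style auth header
          "anthropic-version": "2023-06-01", // assumed version pin
        },
        body: JSON.stringify({ model, max_tokens: 4096, messages, stream: false }),
      });

      if (!response.ok) throw new Error(`Z.AI request failed: ${response.status}`);
      return response.json();
    }
    ```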
  • restore: bring back all custom UI enhancements from checkpoint
    Restored from commit 52be710 (checkpoint before qwen oauth + todo roller):
    
    Enhanced UI Features:
    - SMART FIX button with AI code analysis
    - APEX (Autonomous Programming EXecution) mode
    - SHIELD (Auto-approval) mode
    - MULTIX MODE multi-task pipeline interface
    - Live streaming token counter
    - Thinking indicator with bouncing dots animation
    
    Components restored:
    - packages/ui/src/components/chat/multi-task-chat.tsx
    - packages/ui/src/components/instance/instance-shell2.tsx
    - packages/ui/src/components/settings/OllamaCloudSettings.tsx
    - packages/ui/src/components/settings/QwenCodeSettings.tsx
    - packages/ui/src/stores/solo-store.ts
    - packages/ui/src/stores/task-actions.ts
    - packages/ui/src/stores/session-events.ts (autonomous mode)
    - packages/server/src/integrations/ollama-cloud.ts
    - packages/server/src/server/routes/ollama.ts
    - packages/server/src/server/routes/qwen.ts
    
    This ensures all custom features are preserved in source control.
  • restore: recover deleted documentation, CI/CD, and infrastructure files
    Restored from origin/main (b4663fb):
    - .github/ workflows and issue templates
    - .gitignore (proper exclusions)
    - .opencode/agent/web_developer.md
    - AGENTS.md, BUILD.md, PROGRESS.md
    - dev-docs/ (9 architecture/implementation docs)
    - docs/screenshots/ (4 UI screenshots)
    - images/ (CodeNomad icons)
    - package-lock.json (dependency lockfile)
    - tasks/ (25+ project task files)
    
    Also restored original source files that were modified:
    - packages/ui/src/App.tsx
    - packages/ui/src/lib/logger.ts
    - packages/ui/src/stores/instances.ts
    - packages/server/src/server/routes/workspaces.ts
    - packages/server/src/workspaces/manager.ts
    - packages/server/src/workspaces/runtime.ts
    - packages/server/package.json
    
    Kept new additions:
    - Install-*.bat/.sh (enhanced installers)
    - Launch-*.bat/.sh (new launchers)
    - README.md (SEO optimized with GLM 4.7)
  • fix: restore complete source code and fix launchers
    - Copy complete source code packages from original CodeNomad project
    - Add root package.json with npm workspace configuration
    - Include electron-app, server, ui, tauri-app, and opencode-config packages
    - Fix Launch-Windows.bat and Launch-Dev-Windows.bat to work with correct npm scripts
    - Fix Launch-Unix.sh to work with correct npm scripts
    - Launchers now correctly call npm run dev:electron, which launches the Electron app