Commit Graph

8 Commits

  • Integrate Context-Engine RAG service for enhanced LLM responses
    Backend:
    - Created context-engine/client.ts - HTTP client for Context-Engine API
    - Created context-engine/service.ts - Lifecycle management of Context-Engine sidecar
    - Created context-engine/index.ts - Module exports
    - Created server/routes/context-engine.ts - API endpoints for status/health/query
    
    Integration:
    - workspaces/manager.ts: Trigger indexing when workspace becomes ready (non-blocking)
    - index.ts: Initialize ContextEngineService on server start (lazy mode)
    - ollama-cloud.ts: Inject RAG context into chat requests when available
    
    Frontend:
    - model-selector.tsx: Added Context-Engine status indicator
      - Green dot = Ready (RAG enabled)
      - Blue pulsing dot = Indexing
      - Red dot = Error
      - Hidden when Context-Engine not running
    
    All operations are non-blocking with graceful fallback when Context-Engine is unavailable.
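    A minimal sketch of the non-blocking RAG injection and graceful fallback described in this commit, assuming a hypothetical query endpoint and response shape (the real context-engine/client.ts and ollama-cloud.ts may differ):

    ```typescript
    // Hypothetical sketch: query the Context-Engine sidecar and inject the results
    // into the chat prompt, falling back to the plain prompt if anything fails.
    interface RagChunk {
      path: string; // source file the snippet came from
      text: string; // retrieved snippet
    }

    // Assumed endpoint and payload; returns [] instead of throwing so chat never blocks on RAG.
    async function queryContextEngine(baseUrl: string, prompt: string): Promise<RagChunk[]> {
      try {
        const res = await fetch(`${baseUrl}/query`, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify({ query: prompt, topK: 5 }),
        });
        if (!res.ok) return []; // non-200 -> treat as "no context available"
        return (await res.json()).chunks ?? [];
      } catch {
        return []; // Context-Engine not running -> fall back to a plain chat request
      }
    }

    // Prepend retrieved chunks to the user prompt only when any were returned.
    async function buildPrompt(baseUrl: string, userPrompt: string): Promise<string> {
      const chunks = await queryContextEngine(baseUrl, userPrompt);
      if (chunks.length === 0) return userPrompt;
      const context = chunks.map((c) => `// ${c.path}\n${c.text}`).join("\n\n");
      return `Relevant workspace context:\n${context}\n\n${userPrompt}`;
    }
    ```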
  • Add server-side timeout handling to Ollama Cloud streaming
    - Added a 60-second per-chunk timeout in parseStreamingResponse
    - Added a 120-second timeout to makeRequest via AbortController
    - This prevents the server from hanging indefinitely on a slow or unresponsive API
    
    This should fix the UI freeze when sending messages to Ollama Cloud models.
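    A minimal sketch of the two timeouts described above, assuming hypothetical function shapes (the actual makeRequest and parseStreamingResponse in ollama-cloud.ts may be organized differently):

    ```typescript
    const REQUEST_TIMEOUT_MS = 120_000; // whole-request timeout
    const CHUNK_TIMEOUT_MS = 60_000;    // per-chunk timeout while streaming

    // Abort the request if the API does not answer within the overall timeout.
    async function makeRequest(url: string, body: unknown): Promise<Response> {
      const controller = new AbortController();
      const timer = setTimeout(() => controller.abort(), REQUEST_TIMEOUT_MS);
      try {
        return await fetch(url, {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify(body),
          signal: controller.signal,
        });
      } finally {
        clearTimeout(timer);
      }
    }

    // Race a single read against a timer so one stalled chunk cannot hang the server.
    async function readWithTimeout<T>(read: Promise<T>, ms: number): Promise<T> {
      let timer: ReturnType<typeof setTimeout> | undefined;
      const timeout = new Promise<never>((_, reject) => {
        timer = setTimeout(() => reject(new Error("stream chunk timed out")), ms);
      });
      try {
        return await Promise.race([read, timeout]);
      } finally {
        clearTimeout(timer); // avoid a stray rejection after the chunk arrives
      }
    }

    async function* parseStreamingResponse(res: Response): AsyncGenerator<string> {
      const reader = res.body!.getReader();
      const decoder = new TextDecoder();
      while (true) {
        const { value, done } = await readWithTimeout(reader.read(), CHUNK_TIMEOUT_MS);
        if (done) return;
        yield decoder.decode(value, { stream: true });
      }
    }
    ```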
  • feat: integrate Z.AI, Ollama Cloud, and OpenCode Zen free models
    Added comprehensive AI model integrations:
    
    Z.AI Integration:
    - Client with Anthropic-compatible API (GLM Coding Plan)
    - Routes for config, testing, and streaming chat
    - Settings UI component with API key management
    
    OpenCode Zen Integration:
    - Free models client using 'public' API key
    - Dynamic model fetching from models.dev
    - Supports GPT-5 Nano, Big Pickle, Grok Code Fast 1, MiniMax M2.1
    - No API key required for the free tier!
    
    UI Enhancements:
    - Added Free Models tab (first position) in Advanced Settings
    - Z.AI tab with GLM Coding Plan info
    - OpenCode Zen settings with model cards and status
    
    All integrations work standalone without opencode.exe dependency.
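    A minimal sketch of the free-tier flow described above; the models.dev catalogue URL and response shape, the chat endpoint path, and the 'public' key convention are assumptions for illustration, not the confirmed OpenCode Zen API:

    ```typescript
    // Hypothetical sketch of the free-models client; adapt names and shapes to the real API.
    interface ZenModel {
      id: string;
      name: string;
    }

    // Assumed: models.dev exposes a JSON catalogue we can map into a model list.
    async function fetchFreeModels(catalogUrl = "https://models.dev/api.json"): Promise<ZenModel[]> {
      const res = await fetch(catalogUrl);
      if (!res.ok) throw new Error(`models.dev request failed: ${res.status}`);
      const catalog = await res.json();
      // The catalogue shape is an assumption: treat it as a flat list of model entries.
      const entries: any[] = Array.isArray(catalog) ? catalog : Object.values(catalog);
      return entries.map((m: any) => ({ id: m.id ?? String(m), name: m.name ?? m.id ?? String(m) }));
    }

    // Free tier: the literal key "public" stands in for a paid API key.
    async function chatWithFreeModel(baseUrl: string, model: string, prompt: string) {
      const res = await fetch(`${baseUrl}/v1/chat/completions`, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: "Bearer public", // no real key required for free models
        },
        body: JSON.stringify({ model, messages: [{ role: "user", content: prompt }] }),
      });
      return res.json();
    }
    ```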
  • restore: bring back all custom UI enhancements from checkpoint
    Restored from commit 52be710 (checkpoint before qwen oauth + todo roller):
    
    Enhanced UI Features:
    - SMART FIX button with AI code analysis
    - APEX (Autonomous Programming EXecution) mode
    - SHIELD (Auto-approval) mode
    - MULTIX MODE multi-task pipeline interface
    - Live streaming token counter
    - Thinking indicator with bouncing dots animation
    
    Components restored:
    - packages/ui/src/components/chat/multi-task-chat.tsx
    - packages/ui/src/components/instance/instance-shell2.tsx
    - packages/ui/src/components/settings/OllamaCloudSettings.tsx
    - packages/ui/src/components/settings/QwenCodeSettings.tsx
    - packages/ui/src/stores/solo-store.ts
    - packages/ui/src/stores/task-actions.ts
    - packages/ui/src/stores/session-events.ts (autonomous mode)
    - packages/server/src/integrations/ollama-cloud.ts
    - packages/server/src/server/routes/ollama.ts
    - packages/server/src/server/routes/qwen.ts
    
    This ensures all custom features are preserved in source control.
  • restore: recover deleted documentation, CI/CD, and infrastructure files
    Restored from origin/main (b4663fb):
    - .github/ workflows and issue templates
    - .gitignore (proper exclusions)
    - .opencode/agent/web_developer.md
    - AGENTS.md, BUILD.md, PROGRESS.md
    - dev-docs/ (9 architecture/implementation docs)
    - docs/screenshots/ (4 UI screenshots)
    - images/ (CodeNomad icons)
    - package-lock.json (dependency lockfile)
    - tasks/ (25+ project task files)
    
    Also restored original source files that were modified:
    - packages/ui/src/App.tsx
    - packages/ui/src/lib/logger.ts
    - packages/ui/src/stores/instances.ts
    - packages/server/src/server/routes/workspaces.ts
    - packages/server/src/workspaces/manager.ts
    - packages/server/src/workspaces/runtime.ts
    - packages/server/package.json
    
    Kept new additions:
    - Install-*.bat/.sh (enhanced installers)
    - Launch-*.bat/.sh (new launchers)
    - README.md (SEO optimized with GLM 4.7)
  • fix: restore complete source code and fix launchers
    - Copy complete source code packages from original CodeNomad project
    - Add root package.json with npm workspace configuration
    - Include electron-app, server, ui, tauri-app, and opencode-config packages
    - Fix Launch-Windows.bat and Launch-Dev-Windows.bat to call the correct npm scripts
    - Fix Launch-Unix.sh to call the correct npm scripts
    - Launchers now correctly call npm run dev:electron, which launches the Electron app