Integrate Context-Engine RAG service for enhanced LLM responses
Backend:
- Created context-engine/client.ts - HTTP client for Context-Engine API (see the sketch after this list)
- Created context-engine/service.ts - Lifecycle management of Context-Engine sidecar
- Created context-engine/index.ts - Module exports
- Created server/routes/context-engine.ts - API endpoints for status/health/query
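
A minimal sketch of what the HTTP client in context-engine/client.ts could look like, assuming the sidecar is reached over localhost HTTP. The port, endpoint paths, and response shapes below are illustrative assumptions, not the actual Context-Engine API:

```ts
// Hypothetical sketch of context-engine/client.ts.
// Base URL, endpoint paths, and response shapes are assumptions.

export interface ContextEngineStatus {
  state: "ready" | "indexing" | "error";
  indexedFiles?: number;
}

export interface QueryResult {
  chunks: { path: string; content: string; score: number }[];
}

export class ContextEngineClient {
  constructor(private baseUrl = "http://127.0.0.1:8765") {}

  // Returns null instead of throwing so callers can fall back gracefully.
  async health(): Promise<ContextEngineStatus | null> {
    try {
      const res = await fetch(`${this.baseUrl}/health`);
      return res.ok ? ((await res.json()) as ContextEngineStatus) : null;
    } catch {
      return null;
    }
  }

  // Retrieves the chunks most relevant to a prompt, for RAG injection.
  async query(prompt: string, topK = 5): Promise<QueryResult | null> {
    try {
      const res = await fetch(`${this.baseUrl}/query`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ query: prompt, top_k: topK }),
      });
      return res.ok ? ((await res.json()) as QueryResult) : null;
    } catch {
      return null;
    }
  }
}
```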

Integration:
- workspaces/manager.ts: Trigger indexing when workspace becomes ready (non-blocking)
- index.ts: Initialize ContextEngineService on server start (lazy mode)
- ollama-cloud.ts: Inject RAG context into chat requests when available
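
A rough sketch of the RAG injection step in ollama-cloud.ts, reusing the client sketched above. The withRagContext name, the import path, and the chat-message shape are hypothetical, for illustration only:

```ts
// Hypothetical sketch of RAG context injection (ollama-cloud.ts).
import { ContextEngineClient } from "./context-engine/client";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

const engine = new ContextEngineClient();

// Prepends retrieved workspace context as a system message when the
// Context-Engine is available; otherwise returns the messages unchanged.
export async function withRagContext(messages: ChatMessage[]): Promise<ChatMessage[]> {
  const lastUser = [...messages].reverse().find((m) => m.role === "user");
  if (!lastUser) return messages;

  const result = await engine.query(lastUser.content);
  if (!result || result.chunks.length === 0) return messages; // graceful fallback

  const context = result.chunks
    .map((c) => `// ${c.path}\n${c.content}`)
    .join("\n\n");

  return [
    { role: "system", content: `Relevant workspace context:\n${context}` },
    ...messages,
  ];
}
```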

Frontend:
- model-selector.tsx: Added Context-Engine status indicator (see the sketch after this list)
  - Green dot = Ready (RAG enabled)
  - Blue pulsing dot = Indexing
  - Red dot = Error
  - Hidden when Context-Engine not running
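
A rough sketch of how the indicator in model-selector.tsx might map status to the dot states above, assuming Tailwind-style utility classes. The component name, props, and state values are illustrative assumptions:

```tsx
// Hypothetical sketch of the Context-Engine status dot (model-selector.tsx).
import * as React from "react";

type EngineState = "ready" | "indexing" | "error" | "stopped";

// Tailwind-style classes are an assumption about the project's styling.
const DOT_CLASS: Record<Exclude<EngineState, "stopped">, string> = {
  ready: "h-2 w-2 rounded-full bg-green-500",              // green = RAG enabled
  indexing: "h-2 w-2 rounded-full bg-blue-500 animate-pulse", // blue pulsing = indexing
  error: "h-2 w-2 rounded-full bg-red-500",                 // red = error
};

export function ContextEngineDot({ state }: { state: EngineState }) {
  // Hidden entirely when the Context-Engine is not running.
  if (state === "stopped") return null;
  return <span className={DOT_CLASS[state]} title={`Context-Engine: ${state}`} />;
}
```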

All operations are non-blocking with graceful fallback when Context-Engine is unavailable.
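
One way the non-blocking trigger in workspaces/manager.ts could look, as a fire-and-forget sketch. The triggerIndexing helper and the sidecar's /index endpoint are assumptions; the point is that workspace readiness never waits on the Context-Engine and failures are only logged:

```ts
// Hypothetical sketch of the non-blocking indexing trigger (workspaces/manager.ts).
import { ContextEngineClient } from "./context-engine/client";

const engine = new ContextEngineClient();

// Fire-and-forget: the caller is never blocked, and any Context-Engine
// failure is logged rather than propagated to the workspace flow.
export function triggerIndexing(workspacePath: string): void {
  void (async () => {
    const status = await engine.health();
    if (!status) return; // engine not running -> silently skip

    try {
      await fetch("http://127.0.0.1:8765/index", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ path: workspacePath }),
      });
    } catch (err) {
      console.warn("Context-Engine indexing failed, continuing without RAG:", err);
    }
  })();
}
```
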
743d0367e2 · 2025-12-24 22:20:13 +04:00