# Run and deploy your AI Studio app

This repository contains everything you need to run your app locally.

View your app in AI Studio: https://ai.studio/apps/drive/12OdXUKxlvepe5h8CMj5H0ih_7lE9H239

## Run Locally

**Prerequisites:** Node.js

1. Install dependencies: `npm install`
2. Set `GEMINI_API_KEY` in `.env.local` to your Gemini API key (see the example after this list)
3. Run the app: `npm run dev`
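
For step 2, `.env.local` lives in the project root and holds the key as a plain `KEY=value` entry. The sketch below shows how such a key is typically passed to the Gemini SDK in an AI Studio template app; the `@google/genai` import, the `gemini-2.0-flash` model name, and the assumption that the build tooling exposes the key as `process.env.GEMINI_API_KEY` are illustrative, not taken from this repository's code.

```ts
// Minimal sketch (assumed setup, not this repo's actual code).
// .env.local in the project root would contain a line like:
//   GEMINI_API_KEY=your-key-here
// and the build tooling is assumed to expose it as process.env.GEMINI_API_KEY.

import { GoogleGenAI } from "@google/genai";

// Create a client with the key loaded from the environment.
const ai = new GoogleGenAI({ apiKey: process.env.GEMINI_API_KEY });

// Send a simple prompt; the model name here is a placeholder assumption.
async function demo() {
  const response = await ai.models.generateContent({
    model: "gemini-2.0-flash",
    contents: "Say hello in one short sentence.",
  });
  console.log(response.text);
}

demo().catch(console.error);
```

Keep `.env.local` out of version control (it is normally listed in `.gitignore`) so the key is never committed.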