# describes the MCP servers to use
mcpServers:
  filesystem:
    type: stdio
    command: npx
    args:
      - -y
      - "@modelcontextprotocol/server-filesystem"
      - .
  playwright:
    type: stdio
    command: npx
    args:
      - -y
      - "@playwright/mcp@latest"
  # hf:
  #   type: stdio
  #   command: npx
  #   args:
  #     - -y
  #     - "@llmindset/mcp-hfspace"
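
# Each server above is launched as a stdio subprocess via npx. As a sketch
# (assuming npx is on your PATH), you can verify a server starts outside of
# dexto by running the same command from the config manually, e.g.:
#   npx -y "@modelcontextprotocol/server-filesystem" .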
# System prompt configuration - defines the agent's behavior and instructions
systemPrompt:
  contributors:
    - id: primary
      type: static
      priority: 0
      content: |
        You are a helpful AI assistant with access to tools.
        Use these tools when appropriate to answer user queries.
        You can use multiple tools in sequence to solve complex problems.
        After each tool result, determine if you need more information or can provide a final answer.
    - id: date
      type: dynamic
      priority: 10
      source: date
      enabled: true
# First, start the Ollama server (this also pulls the model on first run):
#   ollama run gemma3n:e2b
# Then run the following command to start the agent:
#   dexto --agent <path_to_ollama.yml>          # CLI
#   dexto --agent <path_to_ollama.yml>          # web UI
llm:
  provider: openai-compatible
  model: gemma3n:e2b
  baseURL: http://localhost:11434/v1
  apiKey: $OPENAI_API_KEY
  maxInputTokens: 32768
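
# Sketch: before starting the agent, you can confirm the OpenAI-compatible
# endpoint configured above is reachable (assumes Ollama's default port
# 11434 from baseURL):
#   curl -s http://localhost:11434/v1/models
# Ollama's OpenAI-compatible endpoint does not validate the key, so
# $OPENAI_API_KEY can typically be any placeholder value here.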
# Storage configuration - uses a two-tier architecture:
# cache (fast, ephemeral) and database (persistent, reliable).
# Memory cache with file-based database (good for development with persistence):
# storage:
#   cache:
#     type: in-memory
#   database:
#     type: sqlite
#     path: ./data/dexto.db
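
# Note (sketch): SQLite creates the database file but not its parent
# directory, so if dexto does not create it for you, make sure the
# target directory exists before enabling the storage block above:
#   mkdir -p ./data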
## To use Google Gemini, replace the llm section above with the configuration below.
## The same approach works for anthropic/groq/etc.
# llm:
#   provider: google
#   model: gemini-2.0-flash
#   apiKey: $GOOGLE_GENERATIVE_AI_API_KEY