- ConversationStore: per-chat JSON files in data/; history survives restarts
- 6,000-token budget per chat context (fits roughly 20-30 exchanges)
- Auto-trims old messages, always includes most recent
- Wired into message handler: loads history before AI call, saves after
- /reset command to clear chat history per chat
- History persists across sessions and model switches; each chat's history is isolated
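The trimming behavior above can be sketched roughly as follows. This is a hypothetical illustration, not the actual ConversationStore code: the names `estimateTokens` and `trimToBudget`, and the ~4-characters-per-token heuristic, are assumptions.

```javascript
// Hypothetical sketch of budget-based history trimming.
// Assumption: tokens are estimated at ~4 characters each.
const TOKEN_BUDGET = 6000;

function estimateTokens(text) {
  return Math.ceil(text.length / 4);
}

function trimToBudget(messages, budget = TOKEN_BUDGET) {
  const kept = [];
  let used = 0;
  // Walk newest-first so the most recent messages always survive;
  // the most recent message is kept even if it alone exceeds the budget.
  for (let i = messages.length - 1; i >= 0; i--) {
    const cost = estimateTokens(messages[i].content);
    if (used + cost > budget && kept.length > 0) break;
    kept.unshift(messages[i]);
    used += cost;
  }
  return kept;
}
```

Walking newest-first makes "auto-trims old messages, always includes most recent" fall out naturally: old messages are simply never reached once the budget is spent.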
- New memory.js: JSON-backed MemoryStore with 5 categories (lesson, pattern, preference, discovery, gotcha)
- Memory injected into the system prompt, so the bot sees past learnings every session
- Curiosity engine: auto-detects errors/fixes, corrections, successful patterns, new tool discoveries
- New commands: /memory (stats), /remember (save), /recall (search), /forget (delete)
- Curiosity engine runs AFTER response delivery, so it adds zero latency
- 500 memory cap with smart eviction (keeps gotchas/lessons, evicts old discoveries)
- data/ directory gitignored (memory is local to each deployment)
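The cap-and-evict behavior can be sketched like this. Again a hypothetical illustration of the described policy, not the real memory.js: the class shape, `PROTECTED` set, and `evict` method are assumptions based on the bullet points above.

```javascript
// Hypothetical sketch of the 500-entry cap with category-aware eviction:
// gotchas and lessons are protected; the oldest non-protected entry
// (e.g. an old discovery) is evicted first.
const MEMORY_CAP = 500;
const PROTECTED = new Set(['gotcha', 'lesson']);

class MemoryStore {
  constructor(cap = MEMORY_CAP) {
    this.cap = cap;
    this.entries = []; // appended in insertion order: { category, text, createdAt }
  }

  add(category, text) {
    this.entries.push({ category, text, createdAt: Date.now() });
    if (this.entries.length > this.cap) this.evict();
  }

  evict() {
    // Entries are in insertion order, so findIndex returns the oldest
    // non-protected entry; fall back to the oldest entry overall.
    let victim = this.entries.findIndex(e => !PROTECTED.has(e.category));
    if (victim === -1) victim = 0;
    this.entries.splice(victim, 1);
  }
}
```

Protecting gotchas and lessons biases the store toward hard-won knowledge, while transient discoveries age out first.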