fix: use updated ollama ai package

This commit is contained in:
Joe Fleming
2026-03-13 21:09:09 -06:00
parent 74a5e70322
commit c65a7160ba
12 changed files with 39 additions and 41 deletions


@@ -26,14 +26,15 @@ Docs directory created with 4 files (PRD.md, Architecture.md, API.md, Discoverie
5. [x] Provider resolution uses explicit provider from config (no model prefix parsing)
6. [x] Typecheck and lint pass (0 errors)
7. [x] Test onboard and agent commands work correctly
-8. [ ] Test with a real Mattermost config (optional — user can do this)
+8. [x] Updated Ollama provider from `ollama-ai-provider` to `ai-sdk-ollama`
+9. [ ] Test with a real Mattermost config (optional — user can do this)
## Key Decisions Made
- Mattermost channel uses raw WebSocket + fetch (no mattermostdriver, no SSL hack)
- No MCP support (use shell tools / CLI instead)
- No reasoning/thinking token handling (can add later)
- Config is fresh Zod schema (no migration from Python config needed)
-- `ollama-ai-provider` package (not `@ai-sdk/ollama` which 404s on npm)
+- `ai-sdk-ollama` package for Ollama provider (replaced old `ollama-ai-provider`)
- `strArg(args, key, fallback?)` helper exported from `agent/tools/base.ts` for safe unknown→string extraction
- Agent config requires explicit `provider` field (no more model prefix like "anthropic/claude-...")
- Model names are now just the raw model ID (e.g., "claude-sonnet-4-5" not "anthropic/claude-sonnet-4-5")
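The `strArg(args, key, fallback?)` helper mentioned above could look roughly like the sketch below. This is a hypothetical reconstruction from the stated signature and purpose (safe unknown→string extraction for tool arguments); the actual implementation in `agent/tools/base.ts` may differ.

```typescript
// Hypothetical sketch of strArg — the real helper lives in
// agent/tools/base.ts and may behave differently.
export function strArg(
  args: Record<string, unknown>,
  key: string,
  fallback?: string,
): string {
  const value = args[key];
  // Only pass through actual strings; everything else is rejected.
  if (typeof value === "string") return value;
  // Fall back when the caller provided a default.
  if (fallback !== undefined) return fallback;
  throw new Error(`Expected string argument "${key}"`);
}
```

A caller handling untyped tool-call arguments would then write `strArg(args, "path", ".")` instead of casting `args.path as string`, so non-string or missing values either get a default or fail loudly.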