# Active Context

## Current Focus

Docs directory created with 4 files (`PRD.md`, `Architecture.md`, `API.md`, `Discoveries.md`). All source files previously written and verified — typecheck and lint are both clean.
## Session State (as of this writing)

- All source files complete and passing `tsc --noEmit` (0 errors) and `oxlint` (0 errors, 0 warnings)
- `package.json` scripts added: `start`, `dev`, `typecheck`
- Ready for runtime / integration testing
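The scripts section of `package.json` might look like the following. This is a plausible shape assuming a Bun project; the exact commands and entry point (`src/index.ts`) are illustrative, not confirmed by these notes:

```json
{
  "scripts": {
    "start": "bun run src/index.ts",
    "dev": "bun --watch src/index.ts",
    "typecheck": "tsc --noEmit"
  }
}
```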
## Key Fixes Applied This Session

- Zod v4 `.default()`: nested object schemas need factory functions returning full output types (e.g. `.default(() => ({ field: value, ... }))`)
- Zod v4 `z.record()`: requires two args: `z.record(z.string(), z.unknown())`
- AI SDK v6: `LanguageModelV2` → `LanguageModel`; `maxTokens` → `maxOutputTokens`; `maxSteps` → `stopWhen: stepCountIs(1)`; usage fields: `inputTokens`/`outputTokens` (not `promptTokens`/`completionTokens`)
- ollama-ai-provider v1.2.0: returns `LanguageModelV1` — cast with `as unknown as LanguageModel`
- js-tiktoken: `get_encoding` → `getEncoding`
- `web.ts`: `Document` global not available in Bun/Node — return `Record<string, unknown>` from `makePseudoDocument`, cast at call site
- `loop.ts`: syntax error `ContextBuilder(workspace: ...)` → `new ContextBuilder(opts.workspace)`
- lint: all `${err}` in template literals → `${String(err)}`; `String(args['key'] ?? '')` → `strArg(args, 'key')` helper; unused `onProgress` param → `_onProgress`; WebSocket `onerror` err type is `Event` → use `err.type`
## Work Queue (next steps)

- Create workspace helper module (`src/cli/utils.ts`) with `ensureWorkspace()` and `syncTemplates()`
- Create onboard command (`src/cli/onboard.ts`) with path argument and directory-not-empty guard
- Agent/gateway commands check workspace exists (throw if not found)
- Added required `provider` field to agent config (values: anthropic, openai, google, openrouter, ollama)
- Provider resolution uses explicit provider from config (no model prefix parsing)
- Typecheck and lint pass (0 errors)
- Test onboard and agent commands work correctly
- Updated Ollama provider from `ollama-ai-provider` to `ai-sdk-ollama`
- Test with a real Mattermost config (optional — user can do this)
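Provider resolution from the explicit `provider` field might look like the following dependency-free sketch. The function name `resolveProvider` and the config shape are illustrative; the real code would return a `LanguageModel` from the matching provider package rather than a label:

```typescript
// Hypothetical sketch: map the explicit `provider` field to a model.
// Actual factory calls (createAnthropic, etc.) are stubbed as labels here.
type Provider = "anthropic" | "openai" | "google" | "openrouter" | "ollama";

interface AgentConfig {
  provider: Provider;
  model: string; // raw model ID, e.g. "claude-sonnet-4-5" (no prefix)
}

function resolveProvider(config: AgentConfig): string {
  switch (config.provider) {
    case "anthropic":
    case "openai":
    case "google":
    case "openrouter":
    case "ollama":
      // No model-prefix parsing: the provider is taken verbatim from config.
      return `${config.provider}:${config.model}`;
    default: {
      // Exhaustiveness check: unreachable if every union member is handled.
      const unreachable: never = config.provider;
      throw new Error(`Unknown provider: ${String(unreachable)}`);
    }
  }
}
```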
## Key Decisions Made

- Mattermost channel uses raw WebSocket + fetch (no `mattermostdriver`, no SSL hack)
- No MCP support (use shell tools / CLI instead)
- No reasoning/thinking token handling (can add later)
- Config is fresh Zod schema (no migration from Python config needed)
- `ai-sdk-ollama` package for Ollama provider (replaced old `ollama-ai-provider`)
- `strArg(args, key, fallback?)` helper exported from `agent/tools/base.ts` for safe unknown→string extraction
- Agent config requires explicit `provider` field (no more model prefix like `"anthropic/claude-..."`)
- Model names are now just the raw model ID (e.g., `"claude-sonnet-4-5"`, not `"anthropic/claude-sonnet-4-5"`)
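A minimal sketch of the `strArg` helper described above, assuming it simply coerces an unknown record field to a string with an optional fallback; the real implementation in `agent/tools/base.ts` may differ:

```typescript
// Hypothetical sketch of strArg from agent/tools/base.ts: safely extract a
// string from an unknown-typed tool-arguments record.
function strArg(
  args: Record<string, unknown>,
  key: string,
  fallback = "",
): string {
  const value = args[key];
  if (typeof value === "string") return value;
  if (value === undefined || value === null) return fallback;
  // Coerce non-string primitives (numbers, booleans) explicitly,
  // mirroring the `${String(err)}` lint fix applied elsewhere.
  return String(value);
}
```

The point of the helper is to centralize the `String(args['key'] ?? '')` pattern the lint pass flagged, so tool implementations never interpolate a raw `unknown` value.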