nanobot-ts/memory-bank/activeContext.md
2026-03-13 21:16:00 -06:00


Active Context

Current Focus

Docs directory created with 4 files (PRD.md, Architecture.md, API.md, Discoveries.md). All source files previously written and verified — typecheck and lint are both clean.

Session State (as of this writing)

  • All source files complete and passing tsc --noEmit (0 errors) and oxlint (0 errors, 0 warnings)
  • package.json scripts added: start, dev, typecheck
  • Ready for runtime / integration testing

Key Fixes Applied This Session

  • Zod v4 .default(): nested object schemas need factory functions returning full output types (e.g. .default(() => ({ field: value, ... })))
  • Zod v4 z.record(): requires two args: z.record(z.string(), z.unknown())
  • AI SDK v6: LanguageModelV2 → LanguageModel; maxTokens → maxOutputTokens; maxSteps → stopWhen: stepCountIs(1); usage fields: inputTokens / outputTokens (not promptTokens / completionTokens)
  • ollama-ai-provider v1.2.0: returns LanguageModelV1 — cast with as unknown as LanguageModel
  • js-tiktoken: get_encoding → getEncoding
  • web.ts: Document global not available in Bun/Node — return Record<string, unknown> from makePseudoDocument, cast at call site
  • loop.ts: syntax error ContextBuilder(workspace: ...) → new ContextBuilder(opts.workspace)
  • lint: all ${err} in template literals → ${String(err)}; String(args['key'] ?? '') → strArg(args, 'key') helper; unused onProgress param → _onProgress; WebSocket onerror err type is Event → use err.type

Work Queue (next steps)

  1. Create workspace helper module (src/cli/utils.ts) with ensureWorkspace() and syncTemplates()
  2. Create onboard command (src/cli/onboard.ts) with path argument and directory-not-empty guard
  3. Make agent/gateway commands check that the workspace exists (throw if not found)
  4. Add the required provider field to agent config (values: anthropic, openai, google, openrouter, ollama)
  5. Resolve the provider from the explicit provider field in config (no model prefix parsing)
  6. Verify typecheck and lint pass (0 errors)
  7. Test that the onboard and agent commands work correctly
  8. Update the Ollama provider from ollama-ai-provider to ai-sdk-ollama
  9. Test with a real Mattermost config (optional — user can do this)
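
Queue items 1–2 could look roughly like this. Only the names ensureWorkspace and syncTemplates and the directory-not-empty guard come from the queue above; the function bodies and template handling are assumptions:

```typescript
// Sketch of src/cli/utils.ts (assumed implementation).
import { existsSync, mkdirSync, readdirSync } from "node:fs";
import { join } from "node:path";

export function ensureWorkspace(path: string): string {
  if (!existsSync(path)) mkdirSync(path, { recursive: true });
  return path;
}

// Placeholder: the real syncTemplates would copy bundled template files
// into the workspace; here it only ensures a templates directory exists.
export function syncTemplates(workspace: string): void {
  mkdirSync(join(workspace, "templates"), { recursive: true });
}

// Sketch of the onboard guard: refuse to onboard into a non-empty directory.
export function onboard(path: string): void {
  if (existsSync(path) && readdirSync(path).length > 0) {
    throw new Error(`directory not empty: ${path}`);
  }
  ensureWorkspace(path);
  syncTemplates(path);
}
```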

Key Decisions Made

  • Mattermost channel uses raw WebSocket + fetch (no mattermostdriver, no SSL hack)
  • No MCP support (use shell tools / CLI instead)
  • No reasoning/thinking token handling (can add later)
  • Config is fresh Zod schema (no migration from Python config needed)
  • ai-sdk-ollama package for Ollama provider (replaced old ollama-ai-provider)
  • strArg(args, key, fallback?) helper exported from agent/tools/base.ts for safe unknown→string extraction
  • Agent config requires explicit provider field (no more model prefix like "anthropic/claude-...")
  • Model names are now just the raw model ID (e.g., "claude-sonnet-4-5" not "anthropic/claude-sonnet-4-5")
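
The explicit-provider decision can be sketched as a lookup keyed on the config field rather than a prefix parse. The registry values here are stand-in strings; in the real code each entry would construct an AI SDK LanguageModel for the given model ID:

```typescript
// Sketch: resolve a provider from the explicit config field rather than
// parsing a "provider/model" prefix. Provider list comes from the notes above.
type Provider = "anthropic" | "openai" | "google" | "openrouter" | "ollama";

interface AgentConfig {
  provider: Provider;
  model: string; // raw model ID, e.g. "claude-sonnet-4-5" (no prefix)
}

// Hypothetical registry; real entries would build AI SDK models.
const registry: Record<Provider, (model: string) => string> = {
  anthropic: (m) => `anthropic:${m}`,
  openai: (m) => `openai:${m}`,
  google: (m) => `google:${m}`,
  openrouter: (m) => `openrouter:${m}`,
  ollama: (m) => `ollama:${m}`,
};

export function resolveModel(config: AgentConfig): string {
  const make = registry[config.provider];
  if (!make) throw new Error(`unknown provider: ${config.provider}`);
  return make(config.model);
}

console.log(resolveModel({ provider: "anthropic", model: "claude-sonnet-4-5" }));
// → "anthropic:claude-sonnet-4-5"
```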