feat: create onboard script
@@ -19,9 +19,13 @@ Docs directory created with 4 files (PRD.md, Architecture.md, API.md, Discoverie
- **lint**: replaced every `${err}` in template literals with `${String(err)}`; replaced `String(args['key'] ?? '')` with a `strArg(args, 'key')` helper; renamed the unused `onProgress` param to `_onProgress`; the WebSocket `onerror` argument is typed `Event`, not `Error`, so log `err.type` instead
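The `strArg` helper mentioned in the lint note might look roughly like this. This is a hypothetical reconstruction from the note alone; the real signature in the repo may differ:

```typescript
// Hypothetical sketch of the strArg helper from the lint note; the actual
// implementation in the repo may differ.
type Args = Record<string, unknown>;

// Coerce a possibly-missing argument value to a string, defaulting to "".
export function strArg(args: Args, key: string): string {
  return String(args[key] ?? "");
}
```

Centralizing the `String(... ?? '')` pattern keeps each call site lint-clean.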
## Work Queue (next steps)
1. [ ] Runtime smoke test: `bun run start --help`
2. [ ] Test with a real Mattermost config (optional — user can do this)
3. [ ] Write sample `~/.nanobot/config.json` in README or docs
1. [x] Create workspace helper module (src/workspace.ts) with ensureWorkspace() and syncTemplates()
2. [x] Create onboard command (src/cli/onboard.ts) with path argument and directory-not-empty guard
3. [x] Update src/cli/commands.ts to use ensureWorkspace() instead of inline mkdirSync
4. [x] Typecheck and lint pass (0 errors)
5. [x] Runtime smoke test: `bun run nanobot --help`
6. [x] Test onboard command: `bun run nanobot onboard [path]`
7. [ ] Test with a real Mattermost config (optional — user can do this)
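The workspace helper from steps 1 and 3 might look roughly like this. The function names come from the list above, but the bodies are assumptions, not the repo's actual code:

```typescript
// Hypothetical sketch of src/workspace.ts; only the function names appear in
// the work queue above, the implementation here is assumed.
import { existsSync, mkdirSync, readdirSync, copyFileSync } from "node:fs";
import { join } from "node:path";

// Create the workspace directory if it does not exist and return its path.
export function ensureWorkspace(dir: string): string {
  if (!existsSync(dir)) mkdirSync(dir, { recursive: true });
  return dir;
}

// Copy template files into the workspace, skipping files that already exist
// so user edits are never overwritten.
export function syncTemplates(templateDir: string, workspaceDir: string): void {
  for (const name of readdirSync(templateDir)) {
    const dest = join(workspaceDir, name);
    if (!existsSync(dest)) copyFileSync(join(templateDir, name), dest);
  }
}
```

Moving this into one module is what lets `src/cli/commands.ts` drop its inline `mkdirSync` (step 3).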
## Key Decisions Made
- Mattermost channel uses raw WebSocket + fetch (no mattermostdriver, no SSL hack)
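That decision can be illustrated with a minimal sketch. The `/api/v4/websocket` path and the `authentication_challenge` first frame are standard Mattermost API details; everything else (helper names, error handling) is assumed, not taken from the repo:

```typescript
// Minimal sketch of a raw-WebSocket Mattermost connection; helper names are
// hypothetical.
export function wsUrl(baseUrl: string): string {
  // http://host -> ws://host, https://host -> wss://host
  return baseUrl.replace(/^http/, "ws") + "/api/v4/websocket";
}

export function connectMattermost(baseUrl: string, token: string): WebSocket {
  const ws = new WebSocket(wsUrl(baseUrl));
  ws.onopen = () => {
    // Mattermost expects an authentication_challenge as the first message.
    ws.send(
      JSON.stringify({ seq: 1, action: "authentication_challenge", data: { token } }),
    );
  };
  // onerror receives an Event, not an Error, so only err.type is loggable.
  ws.onerror = (err: Event) => console.error(`websocket error: ${err.type}`);
  return ws;
}
```

REST calls (posting messages, fetching channels) go through plain `fetch` against `/api/v4/...` with a `Authorization: Bearer` header, which is why no driver dependency is needed.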
@@ -39,10 +39,19 @@
### 🔄 In Progress
- Nothing
### ✅ Done
- Created src/workspace.ts with ensureWorkspace(), syncTemplates(), checkWorkspaceEmpty()
- Created src/cli/onboard.ts command with path argument
- Updated src/cli/commands.ts to use ensureWorkspace() helper
- Typecheck: 0 errors
- Lint: 0 warnings
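The directory-not-empty guard in the onboard command might be sketched like this. Only the name `checkWorkspaceEmpty` appears in the notes; its behavior here is an assumption:

```typescript
// Sketch of the onboard guard; the real src/cli/onboard.ts may differ.
import { existsSync, readdirSync } from "node:fs";

// A path is safe to onboard into when it is missing or has no entries.
export function checkWorkspaceEmpty(dir: string): boolean {
  return !existsSync(dir) || readdirSync(dir).length === 0;
}

export function onboard(path: string): void {
  if (!checkWorkspaceEmpty(path)) {
    console.error(`onboard: ${path} is not empty, refusing to overwrite`);
    process.exit(1);
  }
  // ...create the workspace and copy templates here
}
```

Refusing to write into a non-empty directory keeps `nanobot onboard [path]` safe to re-run.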
### 🔄 In Progress
- Testing onboard command
### ⏳ Pending
- Runtime smoke test: `bun run start --help`
- Runtime smoke test: `bun run nanobot --help`
- Integration test with a real Mattermost server
- Sample `~/.nanobot/config.json` documentation
## Known Issues / Risks
- `ollama-ai-provider` v1.2.0 returns a `LanguageModelV1` (not the V2/V3 that AI SDK v6 expects); a cast is applied at the call site, and it works at runtime.
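The cast can be illustrated with placeholder interfaces. The real `LanguageModelV1`/`LanguageModelV2` types live in the AI SDK and carry far more members than shown here:

```typescript
// Placeholder interfaces standing in for the AI SDK's model types; the real
// ones are much richer.
interface LanguageModelV1 { readonly specificationVersion: "v1" }
interface LanguageModelV2 { readonly specificationVersion: "v2" }

function useModel(model: LanguageModelV2): string {
  return model.specificationVersion;
}

// The provider hands back a V1 model; a double cast bridges the gap at the
// call site. It is not type-safe, but it works at runtime.
const v1Model: LanguageModelV1 = { specificationVersion: "v1" };
export const result = useModel(v1Model as unknown as LanguageModelV2);
```

The `as unknown as` double cast is the standard TypeScript escape hatch when two interfaces are known-compatible at runtime but structurally incompatible at the type level.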