fix: require user to specify the provider
README.md (46 lines changed)
@@ -18,13 +18,17 @@ bun install # or use `mise install`
 
 ## Quick start
 
-**1. Create a config file**
+**1. Initialize workspace**
 
 ```bash
-mkdir -p ~/.config/nanobot
+bun run nanobot onboard
 ```
 
-`~/.config/nanobot/config.json`:
+This creates `~/.config/nanobot/` with a config file and templates.
+
+**2. Edit config**
+
+Add your API key and set provider/model:
 
 ```json
 {
@@ -34,12 +38,13 @@ mkdir -p ~/.config/nanobot
     }
   },
   "agent": {
-    "model": "openrouter/anthropic/claude-sonnet-4-5"
+    "provider": "openrouter",
+    "model": "anthropic/claude-sonnet-4-5"
   }
 }
 ```
 
-**2. Chat**
+**3. Chat**
 
 ```bash
 bun run nanobot agent
@@ -115,7 +120,8 @@ Environment variable overrides:
 ```json
 {
   "agent": {
-    "model": "openrouter/anthropic/claude-sonnet-4-5",
+    "provider": "openrouter",
+    "model": "anthropic/claude-sonnet-4-5",
     "workspacePath": "~/.config/nanobot",
     "maxTokens": 4096,
     "contextWindowTokens": 65536,
@@ -164,17 +170,27 @@ Environment variable overrides:
 }
 ```
 
-### Providers
+### Provider
 
-Model names use a `provider/model` prefix scheme:
+The `agent.provider` field is **required** and must be one of:
 
-| Prefix | Provider | Example |
-|--------|----------|---------|
-| `anthropic/` | Anthropic direct | `anthropic/claude-opus-4-5` |
-| `openai/` | OpenAI direct | `openai/gpt-4o` |
-| `google/` | Google direct | `google/gemini-2.5-pro` |
-| `openrouter/` | OpenRouter (any model) | `openrouter/anthropic/claude-sonnet-4-5` |
-| `ollama/` | Local Ollama | `ollama/llama3.2` |
+| Provider | Description |
+|----------|-------------|
+| `anthropic` | Anthropic direct (Claude models) |
+| `openai` | OpenAI direct (GPT models) |
+| `google` | Google direct (Gemini models) |
+| `openrouter` | OpenRouter (access to many models) |
+| `ollama` | Local Ollama instance |
+
+The `agent.model` field is also **required** and should be the model ID without any provider prefix:
+
+| Provider | Example Model |
+|----------|---------------|
+| `anthropic` | `claude-sonnet-4-5`, `claude-opus-4-5` |
+| `openai` | `gpt-4o`, `gpt-4o-mini` |
+| `google` | `gemini-2.5-pro`, `gemini-2.0-flash` |
+| `openrouter` | `anthropic/claude-sonnet-4-5` (OpenRouter uses its own model IDs) |
+| `ollama` | `llama3.2`, `qwen2.5` |
 
 For Ollama, set `providers.ollama.apiBase` (default: `http://localhost:11434/api`).
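The diff above makes `agent.provider` a required, enumerated field and strips the provider prefix from `agent.model`. A minimal sketch of the validation this implies (hypothetical function and type names, not the actual nanobot source):

```typescript
// Hypothetical sketch of validating the now-required provider field.
const PROVIDERS = ["anthropic", "openai", "google", "openrouter", "ollama"] as const;
type Provider = (typeof PROVIDERS)[number];

interface AgentConfig {
  provider: Provider;
  model: string; // model ID without a provider prefix, e.g. "claude-sonnet-4-5"
}

function parseAgentConfig(raw: unknown): AgentConfig {
  const obj = raw as { provider?: string; model?: string };
  if (!obj?.provider || !PROVIDERS.includes(obj.provider as Provider)) {
    // Rejecting a missing/unknown provider is the point of this commit.
    throw new Error(`agent.provider is required and must be one of: ${PROVIDERS.join(", ")}`);
  }
  if (!obj.model) {
    throw new Error("agent.model is required");
  }
  return { provider: obj.provider as Provider, model: obj.model };
}
```

Under this scheme the old combined form `openrouter/anthropic/claude-sonnet-4-5` splits into `"provider": "openrouter"` plus `"model": "anthropic/claude-sonnet-4-5"`, matching the README examples above.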