# Models and Configuration
## Switch models in-session

```sh
$ dreadnode
dreadnode> /model anthropic/claude-sonnet-4-20250514
Active model set to anthropic/claude-sonnet-4-20250514
dreadnode> /model openai/gpt-4.1-mini
Active model set to openai/gpt-4.1-mini
```

## Configure a custom endpoint
Use `-e` to point the CLI at an OpenAI-compatible endpoint such as Ollama.

```sh
$ dreadnode -e http://localhost:11434/v1 -m openai/gpt-4.1-mini
dreadnode> /config
api_endpoint: http://localhost:11434/v1
```

## Supported model providers
| Provider | Example model |
|---|---|
| Anthropic | anthropic/claude-sonnet-4-20250514 |
| OpenAI | openai/gpt-4.1-mini |
| Google | google/gemini-2.0-flash |
| Mistral | mistral/mistral-small |
| OpenRouter | openrouter/anthropic/claude-3.5-sonnet |
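Every identifier in the table follows a `provider/model` pattern, where the provider is the text before the first slash (OpenRouter identifiers keep a nested provider segment in the model part). A minimal sketch of splitting an identifier with POSIX parameter expansion, using an example value from the table:

```sh
# Split a model identifier into its provider prefix and model name.
model="openrouter/anthropic/claude-3.5-sonnet"
provider="${model%%/*}"   # strip the longest suffix starting at the first slash
name="${model#*/}"        # strip the shortest prefix up to the first slash
echo "$provider"          # openrouter
echo "$name"              # anthropic/claude-3.5-sonnet
```

Note that `%%/*` and `#*/` split at the *first* slash, so nested OpenRouter names stay intact in the model part.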
## Command reference

| Command | Arguments | Description |
|---|---|---|
| `/model` | `[provider/model]` | Switch the active model |
| `/config` | — | Show or edit CLI configuration |
| `-m` | `<provider/model>` | Select a model when starting the CLI |
| `-e` | `<endpoint>` | Set a custom API endpoint when starting |
## Configuration workflow

### Pick a default model

Set a default model at startup with `-m`, then switch models in-session with `/model`.
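Putting the two together, a session might look like the following transcript (model identifiers taken from the provider table above):

```sh
$ dreadnode -m anthropic/claude-sonnet-4-20250514
dreadnode> /model openai/gpt-4.1-mini
Active model set to openai/gpt-4.1-mini
```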