Models and Configuration

Switch the active model at any time with the `/model` command:

```shell
$ dreadnode
dreadnode> /model anthropic/claude-sonnet-4-20250514
Active model set to anthropic/claude-sonnet-4-20250514
dreadnode> /model openai/gpt-4.1-mini
Active model set to openai/gpt-4.1-mini
```

Use `-e` to point the CLI at an OpenAI-compatible endpoint such as Ollama.

```shell
$ dreadnode -e http://localhost:11434/v1 -m openai/gpt-4.1-mini
dreadnode> /config
api_endpoint: http://localhost:11434/v1
```
Supported providers and example model identifiers:

| Provider | Example model |
| --- | --- |
| Anthropic | `anthropic/claude-sonnet-4-20250514` |
| OpenAI | `openai/gpt-4.1-mini` |
| Google | `google/gemini-2.0-flash` |
| Mistral | `mistral/mistral-small` |
| OpenRouter | `openrouter/anthropic/claude-3.5-sonnet` |
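Model IDs in the table above follow a `provider/model` pattern: the segment before the first slash names the provider, and everything after it names the model (OpenRouter nests a second provider segment, as in `openrouter/anthropic/claude-3.5-sonnet`). A quick shell sketch of that split, using standard parameter expansion rather than any dreadnode feature:

```shell
# Split a provider/model ID on the first slash (illustrative only).
model_id="openrouter/anthropic/claude-3.5-sonnet"
provider="${model_id%%/*}"   # strip longest suffix matching /* -> text before the first slash
model="${model_id#*/}"       # strip shortest prefix matching */ -> text after the first slash
echo "$provider"  # openrouter
echo "$model"     # anthropic/claude-3.5-sonnet
```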
Command and flag reference:

| Command | Arguments | Description |
| --- | --- | --- |
| `/model` | `[provider/model]` | Switch the active model |
| `/config` | | Show or edit CLI configuration |
| `-m` | `<provider/model>` | Select a model when starting the CLI |
| `-e` | `<endpoint>` | Set a custom API endpoint when starting |

Set a default model at startup with `-m`, then switch models in-session with `/model`.