Generators

Generators wrap the AI SDK and expose consistent `generate` and `stream` APIs. They accept either a `LanguageModel` instance or a `provider/model` connection string.

```typescript
class Generator {
  constructor(config: GeneratorConfig);
  get modelId(): string;
  generate(messages: Message[], options?: GenerateOptions): Promise<GenerateResult>;
  stream(messages: Message[], options?: GenerateOptions): AsyncGenerator<StreamChunk>;
}

function createGenerator(model: LanguageModel, defaults?: GenerateOptions): Generator;
function createGenerator(connectionString: string, defaults?: GenerateOptions): Promise<Generator>;

function registerProvider(name: string, provider: unknown): void;
```
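The `defaults` parameter lets you attach options that apply to every call on a generator. As a self-contained illustration of the usual defaults-plus-overrides pattern (the `Options` shape and the field names `temperature` and `maxTokens` are assumptions for illustration, not taken from `GenerateOptions`):

```typescript
// Hypothetical options shape standing in for GenerateOptions.
interface Options {
  temperature?: number;
  maxTokens?: number;
}

// Per-call options spread after the defaults, so they win on conflict --
// the common convention for a defaults parameter like this one.
function mergeOptions(defaults: Options, perCall?: Options): Options {
  return { ...defaults, ...perCall };
}

console.log(mergeOptions({ temperature: 0.2 }, { maxTokens: 256 }));
// → { temperature: 0.2, maxTokens: 256 }
```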
Generate a response from a `LanguageModel` instance:

```typescript
import { anthropic } from '@ai-sdk/anthropic';
import { createGenerator, Message } from '@dreadnode/agents';

const generator = createGenerator(anthropic('claude-sonnet-4-20250514'));

const result = await generator.generate([
  Message.system('You are a concise assistant.'),
  Message.user('Give me a one-sentence summary of Dreadnode.'),
]);

console.log(result.message.text);
```
Stream a response chunk by chunk:

```typescript
import { anthropic } from '@ai-sdk/anthropic';
import { createGenerator, Message } from '@dreadnode/agents';

const generator = createGenerator(anthropic('claude-sonnet-4-20250514'));

for await (const chunk of generator.stream([
  Message.user('Stream a short response about agent evaluations.'),
])) {
  if (chunk.type === 'text-delta') {
    process.stdout.write(chunk.textDelta ?? '');
  }
}
```
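The loop above prints deltas as they arrive; often you also want the full text at the end. A minimal self-contained sketch of the accumulation pattern, with a mocked chunk stream standing in for `generator.stream` (the chunk shape mirrors the example above):

```typescript
// Chunk shape assumed from the streaming example above.
interface StreamChunk {
  type: string;
  textDelta?: string;
}

// Stand-in for generator.stream(...) so this sketch runs on its own.
async function* mockStream(): AsyncGenerator<StreamChunk> {
  yield { type: 'text-delta', textDelta: 'Agent ' };
  yield { type: 'text-delta', textDelta: 'evaluations.' };
  yield { type: 'finish' };
}

// Concatenate text deltas while ignoring non-text chunks.
async function collectText(stream: AsyncGenerator<StreamChunk>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    if (chunk.type === 'text-delta') {
      text += chunk.textDelta ?? '';
    }
  }
  return text;
}

collectText(mockStream()).then((text) => console.log(text)); // "Agent evaluations."
```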

Use `registerProvider` to add your own provider instance (for example, an OpenAI-compatible endpoint via the AI SDK). Once registered, you can use it in connection strings.

```typescript
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { createGenerator, registerProvider } from '@dreadnode/agents';

const vllm = createOpenAICompatible({
  name: 'vllm',
  baseURL: 'http://localhost:8000/v1',
});

registerProvider('vllm', vllm);

const generator = await createGenerator('vllm/llama-3-70b');
```
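The examples suggest a `provider/model` convention for connection strings. A self-contained sketch of how such a string could be split into its parts (an illustration of the convention only, not the library's actual parsing):

```typescript
// Split 'provider/model' at the first slash, so model IDs that
// themselves contain slashes stay intact.
function parseConnectionString(conn: string): { provider: string; modelId: string } {
  const idx = conn.indexOf('/');
  if (idx === -1) {
    throw new Error(`Invalid connection string: ${conn}`);
  }
  return { provider: conn.slice(0, idx), modelId: conn.slice(idx + 1) };
}

console.log(parseConnectionString('vllm/llama-3-70b'));
// → { provider: 'vllm', modelId: 'llama-3-70b' }
```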