# createAgent

The core factory function that creates an AI agent backed by a Cloudflare Durable Object.
## Signature

```typescript
function createAgent(config: AgentConfig): DurableObject
```
## AgentConfig

| Field | Type | Required | Description |
|---|---|---|---|
| `name` | `string` | Yes | Unique identifier for this agent type |
| `model` | `string` | Yes | LLM model identifier (see supported models below) |
| `system` | `string` | No | System prompt / instructions for the agent |
| `tools` | `Record<string, Tool>` | No | Named tools the agent can invoke |
| `memory` | `MemoryConfig \| 'tiered'` | No | Memory tier configuration |
| `observability` | `ObservabilityConfig` | No | Logging and tracing configuration |
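As a minimal sketch: only `name` and `model` are required, so the smallest possible agent looks like this (the agent name and model ID here are illustrative):

```typescript
import { createAgent } from 'honidev'

// Minimal agent: only the two required fields.
// Everything else (system, tools, memory, observability) is optional.
export const agent = createAgent({
  name: 'echo-bot',
  model: 'claude-sonnet-4-5',
})
```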
## Supported Models

Honi routes to the right provider automatically based on the model ID prefix. Non-core providers are optional peer dependencies, so they add zero bundle cost unless installed.
| Provider | Prefix | Example model | Env var |
|---|---|---|---|
| Anthropic | `claude-*` | `claude-sonnet-4-5` | `ANTHROPIC_API_KEY` |
| OpenAI | `gpt-*`, `o1`, `o3-*` | `gpt-4o`, `o3-mini` | `OPENAI_API_KEY` |
| Google | `gemini-*` | `gemini-2.5-flash-preview` | `GOOGLE_AI_API_KEY` |
| Groq | `groq/*` | `groq/llama-3.3-70b-versatile` | `GROQ_API_KEY` |
| DeepSeek | `deepseek-*` | `deepseek-chat`, `deepseek-reasoner` | `DEEPSEEK_API_KEY` |
| Mistral | `mistral-*`, `codestral-*` | `mistral-large-latest` | `MISTRAL_API_KEY` |
| xAI | `grok-*` | `grok-3`, `grok-3-mini` | `XAI_API_KEY` |
| Perplexity | `sonar*` | `sonar-pro`, `sonar-reasoning` | `PERPLEXITY_API_KEY` |
| Together AI | `together/*` | `together/meta-llama/Llama-3.3-70B-Instruct-Turbo` | `TOGETHER_API_KEY` |
| Cohere | `command-*` | `command-r-plus`, `command-a-03-2025` | `COHERE_API_KEY` |
| Azure OpenAI | `azure/*` | `azure/gpt-4o` | `AZURE_OPENAI_API_KEY` + `AZURE_OPENAI_ENDPOINT` |
| Workers AI | `@cf/*` | `@cf/meta/llama-3.1-8b-instruct` | `AI` binding (`wrangler.toml`) |
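The prefix routing in the table above can be sketched as a plain lookup function. This is an illustrative reimplementation, not Honi's actual source; `resolveProvider` is a hypothetical name:

```typescript
// Illustrative sketch of prefix-based provider routing, mirroring the table.
// Slash-prefixed providers (groq/, together/, azure/) match on "provider/".
function resolveProvider(model: string): string {
  if (model.startsWith('claude-')) return 'anthropic'
  if (model.startsWith('gpt-') || model === 'o1' || model.startsWith('o3-')) return 'openai'
  if (model.startsWith('gemini-')) return 'google'
  if (model.startsWith('groq/')) return 'groq'
  if (model.startsWith('deepseek-')) return 'deepseek'
  if (model.startsWith('mistral-') || model.startsWith('codestral-')) return 'mistral'
  if (model.startsWith('grok-')) return 'xai'
  if (model.startsWith('sonar')) return 'perplexity'
  if (model.startsWith('together/')) return 'together-ai'
  if (model.startsWith('command-')) return 'cohere'
  if (model.startsWith('azure/')) return 'azure-openai'
  if (model.startsWith('@cf/')) return 'workers-ai'
  throw new Error(`Unrecognized model ID prefix: ${model}`)
}
```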
Non-core providers require their AI SDK package:

```shell
npm install @ai-sdk/google      # Google Gemini
npm install @ai-sdk/groq        # Groq
npm install @ai-sdk/deepseek    # DeepSeek
npm install @ai-sdk/mistral     # Mistral
npm install @ai-sdk/xai         # xAI
npm install @ai-sdk/perplexity  # Perplexity
npm install @ai-sdk/togetherai  # Together AI
npm install @ai-sdk/cohere      # Cohere
npm install @ai-sdk/azure       # Azure OpenAI
npm install @ai-sdk/cloudflare  # Workers AI
```
## HTTP Endpoints

A Honi agent automatically exposes the following HTTP endpoints:
### POST /chat

Send a message to the agent and receive a response. The `sessionId` field is optional.

Request:

```json
{
  "message": "What's the weather today?",
  "sessionId": "user-123"
}
```

Response:

```json
{
  "response": "I don't have access to weather data...",
  "sessionId": "user-123",
  "toolCalls": []
}
```
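A minimal client sketch for this endpoint, assuming the request/response shapes above. The `buildChatBody` and `sendChat` helpers and the `ChatResponse` type are illustrative names, not part of the honidev package:

```typescript
// Shape of the POST /chat response body, per the schema above.
type ChatResponse = { response: string; sessionId: string; toolCalls: unknown[] }

// Build the request body; sessionId is optional and omitted when absent.
function buildChatBody(message: string, sessionId?: string): string {
  return JSON.stringify(sessionId ? { message, sessionId } : { message })
}

// Hypothetical wrapper around POST /chat (requires a fetch-capable runtime).
async function sendChat(baseUrl: string, message: string, sessionId?: string): Promise<ChatResponse> {
  const res = await fetch(`${baseUrl}/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: buildChatBody(message, sessionId),
  })
  if (!res.ok) throw new Error(`POST /chat failed: ${res.status}`)
  return res.json() as Promise<ChatResponse>
}
```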
### DELETE /chat

Reset the agent's conversation history for a given session.

```json
{ "sessionId": "user-123" }
```
### Streaming (SSE)

Add an `Accept: text/event-stream` header to your POST /chat request to receive Server-Sent Events. Tokens are streamed as they are generated by the LLM:

```shell
curl -N -X POST http://localhost:8787/chat \
  -H "Content-Type: application/json" \
  -H "Accept: text/event-stream" \
  -d '{"message": "Hello"}'
```

Each SSE event has a `type` field: `token`, `tool_call`, `tool_result`, or `done`.
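A client can split the stream on blank lines and parse each `data:` payload by its `type` field. This is a sketch under two assumptions not stated above: each event arrives as a single `data:` line, and the payload is JSON; `parseSSE` and `AgentEvent` are illustrative names:

```typescript
// One parsed event; only `type` is documented, so other fields stay loose.
type AgentEvent = { type: 'token' | 'tool_call' | 'tool_result' | 'done'; [k: string]: unknown }

// Parse a raw SSE buffer into events. SSE separates events with a blank
// line; each event's JSON payload follows the "data:" prefix.
function parseSSE(buffer: string): AgentEvent[] {
  return buffer
    .split('\n\n')
    .map((block) => block.trim())
    .filter((block) => block.startsWith('data:'))
    .map((block) => JSON.parse(block.slice('data:'.length).trim()) as AgentEvent)
}
```

In a real client you would feed `parseSSE` from a `ReadableStream` reader chunk by chunk, retaining any trailing partial event between reads.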
## Full Example

```typescript
import { createAgent, tool } from 'honidev'
import { z } from 'zod'

export const agent = createAgent({
  name: 'support-bot',
  model: 'claude-sonnet-4-20250514',
  system: 'You are a customer support agent.',
  memory: 'tiered',

  tools: {
    lookupOrder: tool({
      description: 'Look up a customer order',
      input: z.object({ orderId: z.string() }),
      async run({ orderId }, ctx) {
        return ctx.env.DB.prepare('SELECT * FROM orders WHERE id = ?')
          .bind(orderId).first()
      }
    }),
  },

  observability: {
    logLevel: 'info',
    aiGateway: { accountId: '...', gatewayId: 'my-gw' }
  }
})
```