This example demonstrates the Convex Durable Agents component with a chat interface.

## Features

- Thread-based conversations with AI
- Tool execution (weather lookup)
- Real-time streaming responses
- Status indicators and controls (stop, retry)
- Thread management (create, delete, list)
## Setup

- Install dependencies:

  ```sh
  npm install
  ```

- Set up your Convex project:

  ```sh
  npx convex dev
  ```

- Configure your AI model:
  Edit `convex/chat.ts` and replace the mock model with your actual AI model:

  ```ts
  // Example with OpenAI:
  import { openai } from "@ai-sdk/openai";
  const model = openai("gpt-4o");

  // Example with Anthropic:
  import { anthropic } from "@ai-sdk/anthropic";
  const model = anthropic("claude-sonnet-4-20250514");
  ```

- Set your API keys in the Convex dashboard or `.env.local`:

  ```sh
  OPENAI_API_KEY=your-key-here
  # or
  ANTHROPIC_API_KEY=your-key-here
  ```
- Run the development server:

  ```sh
  npm run dev
  ```

## Project Structure

```
example/
├── convex/
│   ├── chat.ts             # Agent definition and API
│   ├── tools/
│   │   └── weather.ts      # Tool implementations
│   ├── schema.ts           # App schema (empty - component manages tables)
│   └── convex.config.ts    # Component registration
└── src/
    ├── App.tsx             # Chat UI
    └── main.tsx            # Entry point
```
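To give a rough idea of what `convex/tools/weather.ts` might return, here is a self-contained mock in the same spirit. This is a hypothetical sketch: the function name, return shape, and mock values are illustrative, not the example's actual code.

```typescript
// Hypothetical mock weather lookup, deterministic so results are stable per city.
type Weather = { city: string; temperatureF: number; conditions: string };

// Simple string hash used to derive stable fake values from the city name.
function hashCity(city: string): number {
  let h = 0;
  for (const ch of city) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
  return h;
}

export async function getWeather(city: string): Promise<Weather> {
  const conditions = ["Sunny", "Cloudy", "Rainy", "Windy"];
  const h = hashCity(city);
  return {
    city,
    temperatureF: 50 + (h % 40), // mock range: 50-89 °F
    conditions: conditions[h % conditions.length],
  };
}
```

Keeping the mock deterministic makes streaming and retry behavior easier to eyeball while developing: asking about the same city twice should render the same answer.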
## Usage

- Click "New Chat" to start a conversation
- Ask about the weather: "What's the weather in San Francisco?"
- The agent will use the weather tool to fetch data
- Watch the streaming response and tool execution status
- Use Stop/Retry buttons to control the conversation
## Available Tools

- `get_weather` - Returns weather conditions and temperature for a city
- `get_temperature` - Async tool demonstrating delayed results
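A minimal sketch of the delayed-result pattern that `get_temperature` demonstrates (hypothetical function names and mock values; the example's real tool implementations live under `convex/tools/`):

```typescript
// Hypothetical sketch: an async tool whose result arrives after a delay,
// mimicking a slow upstream service.
function delay(ms: number): Promise<void> {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

export async function getTemperature(city: string): Promise<number> {
  await delay(50); // simulate upstream latency
  return 60 + (city.length % 20); // mock value, stable per city name
}
```

The artificial delay is what makes the in-flight tool status easy to observe in the chat UI while the response streams.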
## Notes

- The weather tool returns mock data for demonstration
- In production, replace with actual API calls
- The component handles all durability concerns automatically
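For the "replace with actual API calls" step, one option is to inject the HTTP client so the mock and the real service share a shape. This is a hedged sketch: the endpoint URL and response field below are placeholders, not a real weather API.

```typescript
// Hypothetical production variant: the fetcher is injected so tests can stub it
// and production code can pass the global fetch.
type Fetcher = (url: string) => Promise<{ json(): Promise<any> }>;

export async function getWeatherLive(
  city: string,
  fetcher: Fetcher,
  // Placeholder endpoint; substitute your real weather provider.
  baseUrl = "https://api.example.com/weather"
): Promise<{ city: string; temperatureF: number }> {
  const res = await fetcher(`${baseUrl}?city=${encodeURIComponent(city)}`);
  const data = await res.json();
  return { city, temperatureF: data.temperatureF };
}
```

Injecting the fetcher keeps the tool testable without network access, which matters once the component is retrying durable calls on your behalf.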