This example project demonstrates how to integrate ATXP's pay-per-use MCP (Model Context Protocol) tools with the Vercel AI SDK for building AI applications. It accesses LLMs through the ATXP LLM Gateway, which lets you access multiple LLM providers through a single, unified API without managing multiple accounts or API keys, giving you instant access to models from Qwen, Claude, DeepSeek, Gemini, Llama, GPT, Grok, and more.
- LLM Gateway Integration: Access multiple LLM providers through ATXP's unified OpenAI-compatible API
- Image Generation: Create images using ATXP's image generation MCP server
- Web Search: Search for information using ATXP's search MCP server
- Streaming Responses: Leverage Vercel AI SDK for real-time AI interactions
- Pay-per-use: Only pay for what you use with ATXP's usage-based pricing
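Because the gateway is OpenAI-compatible, switching providers comes down to changing the model name in a standard chat-completions payload. A minimal sketch of that payload shape (the model names shown are illustrative, not a list of gateway identifiers):

```typescript
// Standard OpenAI-style chat-completions payload; only `model` changes
// between providers behind a unified gateway.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    model,        // e.g. 'gpt-4o' or 'claude-3-5-sonnet' (names illustrative)
    messages,
    stream: true, // the demo streams responses via the Vercel AI SDK
  };
}

const request = buildChatRequest('gpt-4o', [
  { role: 'user', content: 'provide me with the latest news about AI' },
]);
console.log(request.model); // 'gpt-4o'
```

The same request body works against any OpenAI-compatible endpoint, which is what makes a single-gateway setup possible.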
- Node.js 18.0.0 or higher
- An ATXP account with a connection string
- An OpenAI API key
- Create your own copy of this repo using the template.
- Clone your newly created repo:

  ```bash
  git clone git@github.com:your-github-user/your-new-repo
  cd your-new-repo
  ```

- Install the needed dependencies:

  ```bash
  npm install
  ```
- Copy the example environment file:

  ```bash
  cp env.example .env
  ```

  Edit `.env` and add your credentials:

  ```bash
  # Required for ATXP MCP tools (get from your ATXP dashboard)
  # Create an ATXP account at https://accounts.atxp.ai
  ATXP_CONNECTION=https://accounts.atxp.ai?connection_token=<your_token>&account_id=<your_account_id>
  ```

  If you aren't using the ATXP LLM Gateway, you'll also need to specify your OpenAI API key:

  ```bash
  # Required for OpenAI integration
  OPENAI_API_KEY=your_openai_api_key_here
  ```

  Important: Never commit your `.env` file to version control. It's already added to `.gitignore`.
- Build the project:

  ```bash
  npm run build
  ```

- Run the compiled project from the root of your repo.

  Use the ATXP Search MCP server to search the web:

  ```bash
  node dist/index.js "provide me with the latest news about AI"
  ```

  Use the ATXP Image Generation MCP server to generate an image from a prompt:

  ```bash
  node dist/index.js "create an image of a tree."
  ```
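Since `ATXP_CONNECTION` is a URL, its parts can be validated at startup with Node's built-in `URL` class before any paid calls are made. A quick validation sketch (the exact checks are illustrative, not part of the demo):

```typescript
// Validate an ATXP connection string of the form
// https://accounts.atxp.ai?connection_token=...&account_id=...
function parseConnection(conn: string): { token: string; accountId: string } {
  const url = new URL(conn);
  const token = url.searchParams.get('connection_token');
  const accountId = url.searchParams.get('account_id');
  if (!token || !accountId) {
    throw new Error('ATXP_CONNECTION is missing connection_token or account_id');
  }
  return { token, accountId };
}

// Example with placeholder values:
const { accountId } = parseConnection(
  'https://accounts.atxp.ai?connection_token=abc123&account_id=acct_1'
);
console.log(accountId); // 'acct_1'
```

Failing fast on a malformed connection string gives a clearer error than letting a downstream MCP call reject it.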
This demo integrates ATXP's MCP tools with Vercel AI SDK through the following process:
- ATXP Account Initialization: Creates an ATXP account using your connection string
- MCP Transport Setup: Builds streamable transports for ATXP's MCP servers (image generation and search)
- Tool Integration: Connects ATXP's MCP tools with Vercel AI SDK's experimental MCP client
- AI Processing: Uses OpenAI's GPT models through the ATXP LLM Gateway with the integrated tools to process user requests
- Response Generation: Returns structured responses with tool results
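The key step above is that each MCP client exposes its tools as a plain name-to-tool map, so the maps from the image generation and search servers can be merged and handed to a single model call. A simplified sketch of that merge (the tool names and shapes here are illustrative; the real demo obtains them from the Vercel AI SDK's experimental MCP client):

```typescript
// Each MCP client yields a record keyed by tool name.
type ToolSet = Record<string, { description: string }>;

const searchTools: ToolSet = {
  search: { description: 'Search the web via the ATXP search MCP server' },
};
const imageTools: ToolSet = {
  create_image: { description: 'Generate an image via the ATXP image MCP server' },
};

// Merging the maps gives the model access to every server's tools at once.
const tools: ToolSet = { ...searchTools, ...imageTools };
console.log(Object.keys(tools)); // ['search', 'create_image']
```

Because tool names are the record keys, each server's tools must use distinct names or a later spread will silently overwrite an earlier one.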
```
atxp-vercel-demo/
├── src/
│   └── index.ts       # Main application logic
├── env.example        # Environment variables template
├── package.json       # Dependencies and scripts
├── tsconfig.json      # TypeScript configuration
└── README.md          # This file
```
- `@atxp/client`: ATXP SDK for MCP tool integration
- `ai`: Vercel AI SDK for streaming AI applications
- `@ai-sdk/openai`: OpenAI integration for the Vercel AI SDK
- `dotenv`: Environment variable management
To add more ATXP MCP services:

- Add a new service configuration to the `SERVICES` object in `src/index.ts`
- Include the new service in the `services` array
- Update the validation and help text as needed
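Assuming `SERVICES` is a record keyed by service name (the field names and URLs below are guesses for illustration, not the demo's actual shape; check `src/index.ts` for the real one), adding a service can look like:

```typescript
// Hypothetical shape for a service entry.
interface ServiceConfig {
  name: string;
  mcpServerUrl: string; // placeholder URLs below, not real endpoints
  description: string;
}

const SERVICES: Record<string, ServiceConfig> = {
  search: {
    name: 'search',
    mcpServerUrl: 'https://search.example-mcp.invalid',
    description: 'Web search via ATXP',
  },
  // New entry added following the steps above:
  image: {
    name: 'image',
    mcpServerUrl: 'https://image.example-mcp.invalid',
    description: 'Image generation via ATXP',
  },
};

// Include every configured service in the services array used at startup.
const services = Object.values(SERVICES);
console.log(services.length); // 2
```

Deriving `services` from `SERVICES` keeps the two in sync automatically, so each new entry only needs to be added in one place.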
- ATXP LLM Gateway - Access multiple LLM providers through a unified API
- ATXP Vercel AI SDK Integration Guide
- ATXP MCP Servers Documentation
- Vercel AI SDK Documentation
- Join the ATXP Community on Discord
- Check out the ATXP Documentation
This project is licensed under the MIT License - see the LICENSE.md file for details.