Documentry is an AI-powered TypeScript library that uses LLMs to understand your Next.js API routes and automatically generate detailed OpenAPI documentation in multiple formats: `json`, `yaml`, and interactive `html`.

With a single terminal command, Documentry scans every API route in your Next.js project, understands the actual code of your `route.ts` files, and generates a valid OpenAPI Specification (OAS) file that describes your endpoints.
- 🚀 Automatically scans your project and detects your Next.js API routes
- 🧠 Uses AI to understand the actual code of your routes
- 📝 Creates OpenAPI 3.0 specifications in `json`, `yaml`, or interactive `html` format
- 🔄 Currently supports OpenAI and Anthropic models
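For context, the following hand-written sketch shows the general shape of an OpenAPI 3.0 document — the kind of file a generator like this produces. It is an illustration only, not actual Documentry output:

```typescript
// Minimal hand-written OpenAPI 3.0 document, illustrating the target format.
// (Illustration only; not actual Documentry output.)
const spec = {
  openapi: '3.0.0',
  info: { title: 'My API', version: '1.0.0' },
  servers: [{ url: 'http://localhost:3000/api', description: 'Local server' }],
  paths: {
    '/user': {
      get: {
        summary: 'Returns the current user',
        responses: {
          '200': { description: 'The user object' }
        }
      }
    }
  }
}

console.log(JSON.stringify(spec, null, 2))
```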
## Installation

```bash
npm install documentry --save-dev
```

## Usage

Run Documentry from the command line:

```bash
npx documentry
```

Or use it programmatically:

```typescript
import { Documentry } from 'documentry'

// Create a new Documentry instance
const documentry = new Documentry()

// Generate OpenAPI specs
await documentry.generate()
```

### Full usage example

```typescript
import { Documentry } from 'documentry'

const documentry = new Documentry({
  provider: 'anthropic',
  model: 'claude-3-5-sonnet-latest',
  apiKey: process.env.ANTHROPIC_API_KEY,
  dir: './app/api',
  routes: ['/user', '/products/*'],
  outputFile: './docs/openapi',
  format: 'html', // 'yaml', 'json', or 'html'
  info: {
    title: 'My API',
    version: '1.0.0',
    description: 'My API description'
  },
  servers: [
    {
      url: 'http://localhost:3000/api',
      description: 'Local server'
    },
    {
      url: 'https://api.example.com',
      description: 'Production server'
    }
  ]
})

await documentry.generate()
```

An example of the generated OpenAPI documentation in HTML format:
More examples can be found in the examples directory.
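The `servers` entries in the configuration correspond to what the CLI's `--servers` flag encodes as comma-separated `url|description` pairs. A sketch of how such a string could be parsed into OpenAPI server objects (illustrative helper only, not part of Documentry's API):

```typescript
// Illustrative parser (not Documentry's actual code): turn a CLI-style
// "url|description, url|description" string into OpenAPI server objects.
interface Server {
  url: string
  description?: string
}

function parseServers(value: string): Server[] {
  return value.split(',').map(entry => {
    const [url, description] = entry.trim().split('|')
    return description ? { url, description } : { url }
  })
}

const servers = parseServers(
  'http://localhost:3000/api|Local server, https://api.example.com|Production server'
)
console.log(servers)
```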
You can configure the LLM settings with an `.env` file:

```bash
LLM_PROVIDER=your-llm-provider # openai or anthropic; defaults to anthropic
LLM_MODEL=your-llm-model # defaults to claude-3-5-sonnet-latest
ANTHROPIC_API_KEY=your-anthropic-key
OPENAI_API_KEY=your-openai-key
```

The CLI supports the following options:
| Flag | Description | Default |
|---|---|---|
| `--dir <directory>` | Root directory for your Next.js API routes (`./app/api`, `./src/app/api`, etc.) | `./app/api` |
| `--routes <routes>` | List of routes to process (e.g. `"/user,/products/*"`) | All routes are considered |
| `--servers <servers>` | List of server URLs (e.g. `"url1\|description1, url2..."`) | `http://localhost:3000/api` |
| `-o, --output-file <file>` | Output folder/file for the generated OpenAPI specs | `./docs/openapi` |
| `-f, --format` | Format for the generated OpenAPI file (`yaml`, `json`, or `html`) | `yaml` |
| `-t, --title <title>` | Title for the OpenAPI spec | `Next.js API` |
| `-d, --description <description>` | Description for the OpenAPI spec | `API documentation for Next.js routes` |
| `-v, --version <version>` | Version for the OpenAPI spec | `1.0.0` |
| `-p, --provider <provider>` | LLM provider (`anthropic` or `openai`) | Env variable `LLM_PROVIDER` |
| `-m, --model <model>` | LLM model to use | Env variable `LLM_MODEL` |
| `-k, --api-key <key>` | LLM provider API key | Env variable `ANTHROPIC_API_KEY` or `OPENAI_API_KEY`, according to the provider |
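Patterns passed via `--routes` mix exact paths with trailing `*` wildcards (e.g. `/products/*`). A minimal sketch of that kind of filtering (illustrative only; Documentry's actual matching rules may differ):

```typescript
// Illustrative route filter (Documentry's real matching may differ):
// a pattern ending in "/*" matches any sub-path; otherwise match exactly.
function matchesRoute(route: string, pattern: string): boolean {
  if (pattern.endsWith('/*')) {
    return route.startsWith(pattern.slice(0, -1)) // keep the trailing "/"
  }
  return route === pattern
}

const patterns = '/user,/products/*'.split(',')
const routes = ['/user', '/products/123', '/orders']
const selected = routes.filter(r => patterns.some(p => matchesRoute(r, p)))
console.log(selected) // → [ '/user', '/products/123' ]
```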
## Requirements

- Node.js >= 14.0.0
- npm >= 6.0.0
## Development

- Clone the repository:

  ```bash
  git clone https://github.com/thiagobarbosa/documentry
  cd documentry
  ```

- Install dependencies:

  ```bash
  npm install
  ```

- Build the project:

  ```bash
  npm run build
  ```

- Run in development mode:

  ```bash
  npm run dev
  ```
This project is licensed under the MIT License.
