A high-performance HTTP/HTTPS proxy server built with Node.js and TypeScript. This proxy forwards requests to an origin server and caches GET responses in memory for improved performance.
- ✅ HTTP/HTTPS Support — Handles both HTTP and HTTPS protocols
- ✅ Request Forwarding — Proxies all HTTP methods (GET, POST, PUT, DELETE, etc.)
- ✅ In-Memory Caching — Caches GET requests with TTL expiration
- ✅ Streaming — Efficient memory usage with request/response streams
- ✅ Error Handling — Graceful error handling and reporting
- ✅ TypeScript — Fully typed for safety and IDE support
```sh
npm install
npm run build
npm start -- --port <PORT> --origin <ORIGIN_URL>
```

For development with auto-reload:

```sh
npm run dev -- --port <PORT> --origin <ORIGIN_URL>
```

Forward to an HTTP API:
```sh
npm start -- --port 3000 --origin http://api.example.com
```

Forward to an HTTPS API:
```sh
npm start -- --port 3000 --origin https://api.example.com
```

Forward with a path:
```sh
npm start -- --port 3000 --origin https://api.example.com/v1
```

Then make requests to the proxy:
```sh
curl http://localhost:3000/products
curl http://localhost:3000/users -X POST -d '{"name":"John"}'
```

- Client Request → Proxy receives request
- Cache Check → If GET request is cached and not expired, return cached response
- Cache Miss → Forward request to origin server
- Buffer Response → Collect response body from origin
- Cache Store → Store response in memory (GET requests only)
- Send Response → Send the response back to the client with an `x-cache: HIT` or `x-cache: MISS` header
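The cache-check step above can be sketched as a small lookup helper. This is a minimal sketch, not the actual code in `src/index.ts`; the `CacheEntry` shape and the `lookup` name are illustrative:

```typescript
import * as http from "node:http";

// Hypothetical cache entry shape; the real implementation in
// src/index.ts may differ.
interface CacheEntry {
  body: Buffer;
  headers: http.IncomingHttpHeaders;
  expiresAt: number;
}

const cache = new Map<string, CacheEntry>();

// Steps 1–2 of the flow: serve from cache only when the method is GET
// and the entry has not expired; anything else is a miss.
function lookup(method: string, url: string, now = Date.now()): CacheEntry | undefined {
  if (method !== "GET") return undefined;
  const entry = cache.get(url);
  if (!entry) return undefined;
  if (entry.expiresAt <= now) {
    cache.delete(url); // expired: evict and treat as a miss
    return undefined;
  }
  return entry; // hit: caller responds with x-cache: HIT
}
```

On a miss, the request falls through to the forwarding path and the buffered response is stored before being sent with `x-cache: MISS`.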
- Upload Stream (`req.pipe(forwardReq)`) — Client request body → Origin server
- Download Stream (`originRes.pipe(res)`) — Origin response body → Client
This ensures memory-efficient handling of large files and responses.
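The piping pattern can be illustrated with Node's `stream.pipeline`, which wires streams together while propagating errors and backpressure automatically. The `relay` and `proxyHop` names below are illustrative stand-ins for the client request and the forwarded origin request:

```typescript
import { PassThrough, Readable, Writable } from "node:stream";
import { pipeline } from "node:stream/promises";

// Sketch only: in the real proxy, `source` would be the incoming client
// request and `proxyHop` the forwarded request to the origin.
async function relay(source: Readable, sink: Writable): Promise<void> {
  const proxyHop = new PassThrough(); // models req.pipe(forwardReq)
  // pipeline handles backpressure and error propagation, so large
  // bodies flow through in chunks rather than sitting fully in memory.
  await pipeline(source, proxyHop, sink);
}
```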
- What's Cached — Only GET requests
- TTL — Cached entries expire after 5 minutes by default
- Memory Limit — 50MB max cache size
- Headers — Response headers are preserved; an `x-cache` header is added
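The caching policy above (5-minute TTL, 50MB cap) could look roughly like the following. This is a minimal sketch, assuming lazy expiry and oldest-first eviction; the class and field names are illustrative, not the actual code in `src/index.ts`:

```typescript
const DEFAULT_TTL_MS = 5 * 60 * 1000;     // 5-minute TTL
const MAX_CACHE_BYTES = 50 * 1024 * 1024; // 50MB cap

class ResponseCache {
  private entries = new Map<string, { body: Buffer; expiresAt: number }>();
  private totalBytes = 0;

  set(url: string, body: Buffer, now = Date.now()): void {
    if (body.length > MAX_CACHE_BYTES) return; // oversized bodies are never cached
    const prev = this.entries.get(url);
    if (prev) {
      this.entries.delete(url);
      this.totalBytes -= prev.body.length;
    }
    // Map preserves insertion order, so this evicts the oldest entries first.
    for (const [key, entry] of this.entries) {
      if (this.totalBytes + body.length <= MAX_CACHE_BYTES) break;
      this.entries.delete(key);
      this.totalBytes -= entry.body.length;
    }
    this.entries.set(url, { body, expiresAt: now + DEFAULT_TTL_MS });
    this.totalBytes += body.length;
  }

  get(url: string, now = Date.now()): Buffer | undefined {
    const entry = this.entries.get(url);
    if (!entry || entry.expiresAt <= now) {
      if (entry) { // lazy expiry: drop stale entries on access
        this.entries.delete(url);
        this.totalBytes -= entry.body.length;
      }
      return undefined;
    }
    return entry.body;
  }
}
```

The caller enforces the GET-only rule by consulting the cache only for GET requests.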
| Option | Description | Example |
|---|---|---|
| `--port` | Port to listen on | `--port 3000` |
| `--origin` | Origin server URL to proxy to | `--origin http://api.example.com` |
Both options are required.
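Parsing these two required flags might look like the sketch below. The flag names match the table above, but `parseArgs` and its error messages are illustrative, not the actual implementation:

```typescript
// Hypothetical CLI parsing sketch; src/index.ts may do this differently.
function parseArgs(argv: string[]): { port: number; origin: string } {
  const get = (flag: string): string => {
    const i = argv.indexOf(flag);
    if (i === -1 || i + 1 >= argv.length) {
      // Both flags are required; missing arguments trigger usage output.
      throw new Error("Usage: npm start -- --port <PORT> --origin <ORIGIN_URL>");
    }
    return argv[i + 1];
  };
  const port = Number(get("--port"));
  const origin = get("--origin");
  new URL(origin); // throws early if the origin is not a valid URL
  if (!Number.isInteger(port) || port <= 0) throw new Error("Invalid --port");
  return { port, origin };
}
```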
| Header | Value | Meaning |
|---|---|---|
| `x-cache` | `HIT` | Response served from cache |
| `x-cache` | `MISS` | Response from origin server |
```
proxy-server/
├── src/
│   └── index.ts     # Main proxy server implementation
├── dist/            # Compiled JavaScript (after build)
├── package.json     # Node.js dependencies and scripts
├── tsconfig.json    # TypeScript configuration
└── README.md        # This file
```
```sh
npm run build   # Compile TypeScript to JavaScript
npm start       # Run the compiled proxy server
npm run dev     # Run with TypeScript directly (auto-reload)
```

- Node.js 18+
- npm or yarn
The proxy handles common errors gracefully:
- Port Already In Use — Exits with error message
- Invalid Origin — Exits with error message
- Connection Errors — Returns 500 error to client
- Missing Arguments — Exits with usage instructions
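Mapping startup failures to messages could be sketched as below. The `startupError` helper and its exact wording are illustrative assumptions, not the real code; `EADDRINUSE` and `ERR_INVALID_URL` are the error codes Node.js actually reports for a busy port and a bad URL:

```typescript
// Illustrative sketch: translate startup failures into the messages the
// proxy prints before exiting. Real messages in src/index.ts may differ.
function startupError(err: NodeJS.ErrnoException, port: number): string {
  if (err.code === "EADDRINUSE") return `Port ${port} is already in use`;
  if (err.code === "ERR_INVALID_URL") return "Invalid origin URL";
  return err.message; // anything else surfaces as-is
}
```

Errors during forwarding (origin unreachable, connection reset) do not exit the process; the client receives a 500 instead.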
- Streaming — Uses Node.js streams to avoid loading entire responses into memory
- Caching — Reduces latency for repeated GET requests
- TTL Expiration — Prevents stale data accumulation
- Backpressure Handling — Automatic flow control via streams
- API Gateway — Proxy to multiple backend services
- Rate Limiting — Combine with rate limiting middleware
- Request Logging — Add logging for all requests
- CDN Alternative — Cache and serve frequently accessed resources
- Development — Proxy to remote APIs during local development
ISC
Rohit Singh