End-to-end demo with a Next.js frontend and an Express/Drizzle backend that runs realtime OpenAI audio interviews and queues evaluations.
- Node.js 20+
- npm (bundled with Node)
- Docker + Docker Compose (for Postgres + Redis)
- OpenAI API key with realtime access
- S3-compatible storage (AWS S3 or MinIO) for recordings
- Start infrastructure

  ```
  cd backend
  docker compose up -d
  ```

- Configure environment
  - Copy `backend/.env.example` to `backend/.env` and fill in secrets.
  - Copy `frontend/.env.local.example` to `frontend/.env.local` and set the API base.
- Install deps

  ```
  cd backend && npm install
  cd ../frontend && npm install
  ```
- Push schema (run from `backend` after Postgres is up)

  ```
  npm run db:push
  ```
- Run services
  - Backend (port 4000): `cd backend && npm run dev`
  - Frontend (port 3000): `cd frontend && npm run dev`
- Use the app
- Open http://localhost:3000
- Sign up or sign in, create a Job template, then start an Interview.
Backend `backend/.env`:

```
DATABASE_URL=postgres://user:password@localhost:5432/interviewer_db
PORT=4000
FRONTEND_URL=http://localhost:3000
OPENAI_API_KEY=sk-...
REDIS_URL=redis://localhost:6379
S3_REGION=us-east-1
S3_BUCKET_NAME=ai-interviewer-recordings
S3_ACCESS_KEY_ID=your-key
S3_SECRET_ACCESS_KEY=your-secret
# Optional for MinIO/local stacks
S3_ENDPOINT=http://localhost:9000
EMAIL_ENABLED=false
SMTP_HOST=smtp.example.com
SMTP_PORT=587
SMTP_SECURE=false
SMTP_USER=your-smtp-user
SMTP_PASS=your-smtp-pass
EMAIL_FROM_NAME=AI Interviewer
EMAIL_FROM=no-reply@example.com
```

Frontend `frontend/.env.local`:

```
NEXT_PUBLIC_API_URL=http://localhost:4000/api
```
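The backend fails at runtime if any of these variables is missing, so a fail-fast check at startup saves debugging. A minimal sketch (the `missingEnv` helper is illustrative, not code from this repo; the variable list mirrors `backend/.env` above):

```typescript
// Sketch: return the names of required variables that are missing or empty.
function missingEnv(
  env: Record<string, string | undefined>,
  required: string[],
): string[] {
  return required.filter((name) => !env[name]);
}

// Assumed minimum set, taken from the backend .env listing above.
const REQUIRED = [
  "DATABASE_URL",
  "OPENAI_API_KEY",
  "REDIS_URL",
  "S3_REGION",
  "S3_BUCKET_NAME",
  "S3_ACCESS_KEY_ID",
  "S3_SECRET_ACCESS_KEY",
];

const missing = missingEnv(process.env, REQUIRED);
if (missing.length > 0) {
  console.error(`Missing required env vars: ${missing.join(", ")}`);
}
```

Running this before the server boots turns a cryptic mid-request failure into a clear startup error.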
- Uses better-auth with the Drizzle adapter and cookie sessions.
- The frontend talks to the backend auth endpoints at `${NEXT_PUBLIC_API_URL}/auth`.
- CORS is restricted to `http://localhost:3000`; keep the frontend origin aligned.
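Since browsers send the `Origin` header without a path, the CORS restriction reduces to an exact string comparison against the configured frontend URL. A sketch, assuming the backend derives its allow-list from `FRONTEND_URL` (the helper itself is hypothetical):

```typescript
// Sketch: exact-match origin check. `frontendUrl` is assumed to come from
// the FRONTEND_URL variable in backend/.env shown above.
function isAllowedOrigin(
  origin: string | undefined,
  frontendUrl: string,
): boolean {
  return origin !== undefined && origin === frontendUrl;
}
```

If you serve the frontend on a different port, update `FRONTEND_URL` (and the CORS configuration) to match, or requests from the browser will be rejected.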
- Starts with a Job template; hitting Start Interview calls `/api/interviews/{id}/session` to create an OpenAI realtime session and an S3 presigned upload URL.
- The frontend captures mic/camera, opens a WebRTC connection to OpenAI, and streams audio both ways. Feedback updates live via tool calls.
- The recording is uploaded to S3-compatible storage after the interview finishes.
- The finalize call posts the transcript and sets status to `PROCESSING`, then enqueues a BullMQ job on Redis.
- `evaluation.worker` runs in-process with the API server, calls OpenAI for a JSON evaluation, updates the interview to `COMPLETED`, and emails the candidate when enabled.
- Results poll `/interview/{id}/result` until the evaluation is ready; video is fetched via a presigned URL when available.
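The result-polling loop described above can be sketched as follows. This is an assumption about the client logic, not the repo's actual code; the fetcher is injected so the loop stays transport-agnostic (in the app it would `GET /interview/{id}/result`):

```typescript
// Sketch: poll until the evaluation status leaves PROCESSING, then return it.
// Status names mirror the PROCESSING/COMPLETED states from the flow above;
// FAILED is an assumed terminal state.
type InterviewResult = { status: "PROCESSING" | "COMPLETED" | "FAILED" };

async function pollUntilDone(
  fetchResult: () => Promise<InterviewResult>,
  intervalMs = 2000,
  maxAttempts = 150,
): Promise<InterviewResult> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const result = await fetchResult();
    if (result.status !== "PROCESSING") return result;
    // Wait between polls instead of hammering the API.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("Evaluation did not finish in time");
}
```

Capping `maxAttempts` keeps a stuck evaluation from polling forever; the UI can surface the timeout instead.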
- Ensure `NEXT_PUBLIC_API_URL` matches the backend (e.g., `http://localhost:4000/api`).
- Postgres and Redis from `docker compose up -d` must be running before starting the backend.
- The S3 bucket must exist and allow the provided credentials; for MinIO, set `S3_ENDPOINT` and enable path-style access.
- The OpenAI key must have realtime + chat access for the configured models.