Content Forge ⚒️

Automated AI Video Repurposing Platform

Content Forge is a production-grade, event-driven platform that automatically transforms raw video footage into polished social media content (LinkedIn posts, Twitter threads, and blog posts).

Built with a focus on scalability and resilience, it features a microservices architecture that handles large file uploads (1GB+) and heavy AI processing without blocking the main application.


🏗️ System Architecture

The system uses a Microservices Architecture with an Event Bus to decouple the API Gateway from the heavy AI processing worker.

Key Workflows

  1. Ingestion (Presigned URLs):
  • The API generates a secure, temporary S3 signature.
  • The Frontend uploads directly to MinIO/S3, bypassing the API server to prevent Node.js event loop blocking.
  2. Event Bus (RabbitMQ):
  • Once a file is uploaded, the API pushes a job to RabbitMQ.
  • This keeps the API responsive even if 1,000 users upload simultaneously.
  3. Processing (Worker Service):
  • A dedicated NestJS microservice consumes jobs one at a time (prefetch: 1); see the configuration sketch after this list.
  • It handles downloading, audio extraction (ffmpeg), transcription (Deepgram), and content generation (DeepSeek/OpenAI).
  4. Resilience (Self-Healing):
  • The worker includes a "Zombie Job Sweeper" that detects and recovers jobs that crashed mid-process.
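
The `prefetch: 1` / manual-ack behaviour above comes from how the worker's RabbitMQ transport is configured. A minimal sketch, assuming a queue named `video_jobs` and an event pattern `video.uploaded` (illustrative names, not necessarily the repo's actual identifiers):

```typescript
// apps/worker/src/main.ts - microservice bootstrap (sketch)
import { NestFactory } from '@nestjs/core';
import { MicroserviceOptions, Transport } from '@nestjs/microservices';
import { AppModule } from './app.module';

async function bootstrap() {
  const app = await NestFactory.createMicroservice<MicroserviceOptions>(AppModule, {
    transport: Transport.RMQ,
    options: {
      urls: [process.env.RABBITMQ_URL ?? 'amqp://localhost:5672'],
      queue: 'video_jobs',              // assumed queue name
      noAck: false,                     // manual acknowledgement
      prefetchCount: 1,                 // one job per worker at a time
      queueOptions: { durable: true },
    },
  });
  await app.listen();
}
bootstrap();
```

The handler acknowledges the message only after the whole pipeline succeeds, so a crash mid-job does not silently lose it:

```typescript
// apps/worker/src/jobs.controller.ts - ack only after the pipeline succeeds (sketch)
import { Controller } from '@nestjs/common';
import { Ctx, EventPattern, Payload, RmqContext } from '@nestjs/microservices';

@Controller()
export class JobsController {
  @EventPattern('video.uploaded')
  async handle(@Payload() job: { videoId: string }, @Ctx() ctx: RmqContext) {
    const channel = ctx.getChannelRef();
    const message = ctx.getMessage();
    try {
      // download -> extract audio (ffmpeg) -> transcribe (Deepgram) -> generate (DeepSeek)
      channel.ack(message);                // job is removed from the queue only now
    } catch (err) {
      channel.nack(message, false, false); // failure: dead-letter / mark the job as FAILED
    }
  }
}
```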

🚀 Tech Stack

Frontend (apps/frontend)

  • Framework: Next.js 15 (App Router)
  • Styling: Tailwind CSS + Lucide Icons
  • State: React Server Actions + Optimistic Updates
  • Security: HTTP-Only Cookies (No LocalStorage tokens)

Backend (apps/api)

  • Framework: NestJS (Monolith Gateway)
  • Database: PostgreSQL (via Prisma ORM)
  • Queue: RabbitMQ (Message Broker)
  • Auth: JWT + Passport Strategy
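
Because the frontend keeps tokens in HTTP-only cookies, the API's Passport strategy reads the JWT from the cookie rather than an `Authorization` header. A minimal sketch, assuming a cookie named `access_token` and the `cookie-parser` middleware (both assumptions; the repo's names may differ):

```typescript
// apps/api/src/auth/jwt.strategy.ts - cookie-based JWT extraction (sketch)
import { Injectable } from '@nestjs/common';
import { PassportStrategy } from '@nestjs/passport';
import { ExtractJwt, Strategy } from 'passport-jwt';
import { Request } from 'express';

@Injectable()
export class JwtStrategy extends PassportStrategy(Strategy) {
  constructor() {
    super({
      // Pull the token out of the HTTP-only cookie (requires cookie-parser)
      jwtFromRequest: ExtractJwt.fromExtractors([
        (req: Request) => req?.cookies?.['access_token'] ?? null,
      ]),
      ignoreExpiration: false,
      secretOrKey: process.env.JWT_SECRET!,
    });
  }

  // The returned object becomes req.user on guarded routes
  async validate(payload: { sub: string; email: string }) {
    return { userId: payload.sub, email: payload.email };
  }
}
```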

Worker (apps/worker)

  • Type: NestJS Microservice
  • AI Audio: Deepgram Nova-2 (Streaming)
  • AI Text: DeepSeek V3 / OpenAI GPT-4
  • Processing: Fluent-FFmpeg

Infrastructure

  • Containerization: Docker & Docker Compose
  • Storage: MinIO (S3 Compatible Object Storage)

✨ Key Features

  • 🔒 Enterprise-Grade Auth: Secure HTTP-Only cookie management with automatic token refreshing via Middleware.

  • ⚡ Direct S3 Uploads: Capable of handling 1GB+ video files without stressing the backend server.

  • 🛡️ Resilient Workers:
    • Prefetch Count: Prevents worker overload by processing one job at a time.
    • Ack Strategy: Manual acknowledgement ensures no data loss on crashes.
    • Timeout Protection: Uses Promise.race to time out hung AI calls (see the sketch after this list).

  • 🧹 Auto-Cleanup: Automatically removes temp files and raw S3 uploads after processing to save storage costs.

  • 📊 Real-Time Status: Polling mechanism with back-off strategy to update the UI from Uploading -> Processing -> Completed.
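
The timeout protection above is essentially a `Promise.race` between the AI call and a timer. A minimal sketch (the helper name and the example limit are illustrative):

```typescript
// Rejects if the wrapped promise has not settled within `ms` milliseconds
function withTimeout<T>(work: Promise<T>, ms: number, label: string): Promise<T> {
  let timer: ReturnType<typeof setTimeout> | undefined;
  const timeout = new Promise<never>((_, reject) => {
    timer = setTimeout(() => reject(new Error(`${label} timed out after ${ms} ms`)), ms);
  });
  // Whichever settles first wins; the pending timer is always cleared
  return Promise.race([work, timeout]).finally(() => clearTimeout(timer));
}

// Example: fail the job instead of hanging forever on a slow transcription call
// const transcript = await withTimeout(transcribe(audioUrl), 5 * 60_000, 'Deepgram');
```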


🛠️ Getting Started

Prerequisites

  • Docker & Docker Compose
  • Node.js 18+ (for local dev)
  • pnpm (recommended)

1. Environment Setup

Create a .env file in the root directory (or separate .env files for each app if preferred):

```bash
# Database
DATABASE_URL="postgresql://postgres:postgres@localhost:5432/content-forge"

# RabbitMQ
RABBITMQ_URL="amqp://admin:admin@localhost:5672"

# JWT
JWT_SECRET="super_secret_key"

# AWS / MinIO
AWS_REGION="us-east-1"
AWS_ENDPOINT="http://localhost:9000"
AWS_ACCESS_KEY_ID="minioadmin"
AWS_SECRET_ACCESS_KEY="minioadmin"
AWS_BUCKET_NAME="content-forge-bucket"

# AI Keys
DEEPGRAM_API_KEY="your_deepgram_key"
OPENAI_API_KEY="your_openai_or_deepseek_key"
```

2. Start Infrastructure

Spin up RabbitMQ and MinIO.

```bash
docker-compose up -d
```

3. Initialize Database

Run migrations to set up the Prisma schema.

```bash
cd apps/api
npx prisma migrate dev --name init
```

4. Run the Stack

You can run all services via TurboRepo (if configured) or in separate terminals:

Terminal 1 (API Gateway):

```bash
cd apps/api
npm run start:dev
```

Terminal 2 (Worker Service):

```bash
cd apps/worker
npm run start:dev
```

Terminal 3 (Frontend):

```bash
cd apps/frontend
npm run dev
```


🧪 Architecture Decisions (Why I built it this way)

Why not just upload to the API? Funnelling a 1GB file through a Node.js server consumes large amounts of memory and bandwidth and ties up the single-threaded event loop. Using Presigned URLs offloads that network strain directly to the storage service (MinIO/S3), allowing the API to remain lightweight.
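
A sketch of how such a presigned upload URL can be produced with the AWS SDK v3 (the key naming and 15-minute expiry are illustrative; MinIO needs `forcePathStyle`):

```typescript
// apps/api - presigned PUT URL: the browser uploads straight to MinIO/S3 (sketch)
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3 = new S3Client({
  region: process.env.AWS_REGION,
  endpoint: process.env.AWS_ENDPOINT,        // MinIO endpoint in local dev
  forcePathStyle: true,                      // required for MinIO
  credentials: {
    accessKeyId: process.env.AWS_ACCESS_KEY_ID!,
    secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY!,
  },
});

export async function createUploadUrl(key: string, contentType: string) {
  const command = new PutObjectCommand({
    Bucket: process.env.AWS_BUCKET_NAME,
    Key: key,                                // e.g. `uploads/${userId}/${uuid}.mp4`
    ContentType: contentType,
  });
  // The frontend PUTs the raw file to this URL; it expires after 15 minutes
  return getSignedUrl(s3, command, { expiresIn: 15 * 60 });
}
```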

Why RabbitMQ? HTTP requests are synchronous. If the AI takes 60 seconds to process a video, an HTTP request would time out. A message queue allows the user to get an immediate "Received" response, while the heavy lifting happens asynchronously in the background.
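
On the API side this boils down to an `emit` on the RabbitMQ client once the upload is confirmed. A sketch, reusing the hypothetical `video.uploaded` pattern from the worker sketch above:

```typescript
// apps/api - enqueue the job and return immediately (sketch)
import { Inject, Injectable } from '@nestjs/common';
import { ClientProxy } from '@nestjs/microservices';

@Injectable()
export class VideosService {
  constructor(@Inject('JOBS_SERVICE') private readonly client: ClientProxy) {}

  async markUploaded(videoId: string) {
    // Fire-and-forget: the HTTP response returns right away with "QUEUED",
    // while the worker picks the job up whenever it is free.
    this.client.emit('video.uploaded', { videoId });
    return { status: 'QUEUED', videoId };
  }
}
```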

Why Deepgram? Deepgram offers streaming transcription, which allows us to pipe the S3 audio stream directly to their API without loading the entire audio file into the Worker's memory.
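
With the Deepgram Node SDK (v3) this can be as small as handing Deepgram a (presigned) URL, so the audio never has to sit in worker memory; the repo may instead pipe the S3 read stream into `transcribeFile`. A hedged sketch of the URL variant:

```typescript
// apps/worker - transcription via Deepgram Nova-2 (sketch)
import { createClient } from '@deepgram/sdk';

const deepgram = createClient(process.env.DEEPGRAM_API_KEY!);

export async function transcribe(audioUrl: string): Promise<string> {
  const { result, error } = await deepgram.listen.prerecorded.transcribeUrl(
    { url: audioUrl },                        // e.g. a presigned S3/MinIO URL
    { model: 'nova-2', smart_format: true },
  );
  if (error || !result) throw error ?? new Error('empty Deepgram response');
  // Transcript of the first channel's best alternative
  return result.results.channels[0].alternatives[0].transcript;
}
```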
