theapprenticeproject/ai_videos

SaaS Video Generator

This project is a Next.js web application that uses AI to generate educational videos. It leverages various APIs for script generation, audio synthesis, image/video generation, and rendering.

Configuration

To successfully run this project, you need to configure environment variables and authentication files.

1. Environment Variables (.env)

Create a .env file in the root directory and add the following keys:

# LLM & AI Services
NEXT_PUBLIC_LLM_API_KEY=     # Key for your primary LLM provider
GEMINI_API_KEY=              # Google Gemini API Key

# Audio Services
NEXT_PUBLIC_AUDIO_API_KEY=   # Generic Audio API Key (if used)
NEXT_PUBLIC_TRANSCRIPT_API_KEY= # Transcription Service API Key
ELEVENLABS_API_KEY=          # ElevenLabs API Key for high-quality TTS

# Google Services
GOOGLE_API_KEY=              # General Google API Key
GOOGLE_SEARCH_API_KEY=       # Custom Search JSON API Key
GOOGLE_SEARCH_CXID=          # Custom Search Engine ID (CX)

# Media Generation
FREEPIK_API_KEY=             # Freepik API Key for image/video generation
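As a sketch of how these keys are consumed (the helper name below is ours, not the project's): in Next.js, only variables prefixed with NEXT_PUBLIC_ are bundled into browser code, while the rest are available exclusively in server-side code. A small guard helper makes missing keys fail loudly at startup instead of deep inside an API call:

```typescript
// Minimal sketch: server-side helper for reading the keys above.
// Only NEXT_PUBLIC_* variables are exposed to the browser by Next.js;
// the others can only be read in server code (API routes, server actions).
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example usage (server side only):
// const elevenLabsKey = requireEnv("ELEVENLABS_API_KEY");
```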

2. dynamication.json

This file (dynamication.json) controls the "personality" and logic of the content generation.

  • scriptFewShot: Contains "few-shot" prompting examples (low and high complexity) used by the LLM to generate scripts. You can edit this to change the writing style of the generated videos.
  • avatars: A list of available AI avatars/voices. Each entry has:
    • value: The Voice ID (e.g., ElevenLabs ID).
    • label: Display name.
    • langType: Supported language/accent.
    • gender: MALE/FEMALE.
    • languageCode: BCP-47 language tag (e.g., hi-IN).
  • avatarAudioMap: Maps voice IDs to a sample audio URL for previewing voices on the frontend.
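The fields above can be sketched as TypeScript types with a hypothetical example entry. The interface names, the voice ID, and the scriptFewShot shape below are illustrative assumptions, not the project's actual definitions:

```typescript
// Hypothetical types mirroring the dynamication.json fields described above.
interface Avatar {
  value: string;        // voice ID (e.g. an ElevenLabs ID)
  label: string;        // display name
  langType: string;     // supported language/accent
  gender: "MALE" | "FEMALE";
  languageCode: string; // BCP-47 tag, e.g. "hi-IN"
}

interface Dynamication {
  scriptFewShot: unknown;                 // few-shot prompt examples; real shape
                                          // depends on the project's prompt format
  avatars: Avatar[];
  avatarAudioMap: Record<string, string>; // voice ID -> preview audio URL
}

const example: Dynamication = {
  scriptFewShot: { lowComplexity: "...", highComplexity: "..." },
  avatars: [
    {
      value: "voice-id-123", // hypothetical ID
      label: "Asha",
      langType: "Hindi",
      gender: "FEMALE",
      languageCode: "hi-IN",
    },
  ],
  avatarAudioMap: {
    "voice-id-123": "https://example.com/previews/asha.mp3",
  },
};
```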

3. Google Cloud Authentication (app/mediaApis)

The directory app/mediaApis contains a Google Service Account JSON key file (e.g., axiomatic-treat-xxxx.json).

  • This credentials file is required for Google Cloud services (such as Vertex AI or Cloud Storage) to authenticate.
  • Ensure this file is present and correctly referenced by the code (e.g., in vertex.ts or google-cloud initialization).
  • Do not commit real credentials to public repositories.
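One common way to wire this up (the helper name is ours; whether the project does it this way depends on vertex.ts) is to set GOOGLE_APPLICATION_CREDENTIALS, the standard environment variable the official Google Cloud SDKs read, before constructing any client:

```typescript
import * as path from "path";

// Sketch: point the Google Cloud client libraries at the service-account
// key before any Vertex AI / Cloud Storage client is constructed.
// GOOGLE_APPLICATION_CREDENTIALS is the standard variable the SDKs check.
function configureGoogleCredentials(keyFile: string): string {
  const keyPath = path.resolve("app/mediaApis", keyFile);
  process.env.GOOGLE_APPLICATION_CREDENTIALS = keyPath;
  return keyPath;
}

// e.g. configureGoogleCredentials("axiomatic-treat-xxxx.json");
```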

Usage

Installation

npm install

Running the Development Server

npm run dev

The application will be available at http://localhost:3000.

Video Editor / Preview

This project uses Revideo. You can start the editor to refine templates:

npm run revideo:editor

Debugging

Debug Render (debug_render_data.json)

To debug the video rendering process without regenerating all assets (which costs money and time):

  1. Save Data: The system saves generation data to data/debug_render_data.json during a successful run.
  2. Static Generation: You can trigger a "Static Gen" mode.
    • When the staticGen flag is set to true (see app/videoGenerator.ts), the system skips LLM/Media generation.
    • It directly reads parameters from data/debug_render_data.json.
    • It passes these parameters to renderPersonalizedVideo.

This allows you to tweak the rendering logic (React components, animations) and instantly see the result using previously generated assets.
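The static-generation branch described above can be sketched as follows. The function name and the RenderParams shape are our assumptions; the actual logic lives in app/videoGenerator.ts:

```typescript
import * as fs from "fs";

interface RenderParams {
  [key: string]: unknown; // shape is whatever a successful run saved
}

// Sketch: if the debug file exists, reuse its parameters instead of calling
// the LLM and media-generation APIs; otherwise fall back to a full run.
function loadDebugRenderData(
  file = "data/debug_render_data.json",
): RenderParams | null {
  if (!fs.existsSync(file)) return null;
  return JSON.parse(fs.readFileSync(file, "utf8")) as RenderParams;
}

const staticGen = true; // mirrors the flag in app/videoGenerator.ts
const params = staticGen ? loadDebugRenderData() : null;
// `params`, when non-null, would then be passed to renderPersonalizedVideo.
```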
