This project is a Next.js web application that uses AI to generate educational videos. It leverages various APIs for script generation, audio synthesis, image/video generation, and rendering.
To successfully run this project, you need to configure environment variables and authentication files.
Create a `.env` file in the root directory and add the following keys:

```env
# LLM & AI Services
NEXT_PUBLIC_LLM_API_KEY=        # Key for your primary LLM provider
GEMINI_API_KEY=                 # Google Gemini API key

# Audio Services
NEXT_PUBLIC_AUDIO_API_KEY=      # Generic audio API key (if used)
NEXT_PUBLIC_TRANSCRIPT_API_KEY= # Transcription service API key
ELEVENLABS_API_KEY=             # ElevenLabs API key for high-quality TTS

# Google Services
GOOGLE_API_KEY=                 # General Google API key
GOOGLE_SEARCH_API_KEY=          # Custom Search JSON API key
GOOGLE_SEARCH_CXID=             # Custom Search Engine ID (CX)

# Media Generation
FREEPIK_API_KEY=                # Freepik API key for image/video generation
```

The file `dynamication.json` controls the "personality" and logic of the content generation.
- `scriptFewShot`: Contains "few-shot" prompting examples (low and high complexity) used by the LLM to generate scripts. You can edit this to change the writing style of the generated videos.
- `avatars`: A list of available AI avatars/voices. Each entry has:
  - `value`: The voice ID (e.g., an ElevenLabs ID).
  - `label`: Display name.
  - `langType`: Supported language/accent.
  - `gender`: MALE/FEMALE.
  - `languageCode`: BCP-47 language tag (e.g., `hi-IN`).
- `avatarAudioMap`: Maps voice IDs to a sample audio URL for previewing voices on the frontend.
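Putting those fields together, a minimal `dynamication.json` might look like the sketch below. The field names come from the descriptions above; every value is an illustrative placeholder, not taken from the real config:

```json
{
  "scriptFewShot": {
    "low": "Example low-complexity script prompt...",
    "high": "Example high-complexity script prompt..."
  },
  "avatars": [
    {
      "value": "VOICE_ID_PLACEHOLDER",
      "label": "Asha",
      "langType": "Hindi (India)",
      "gender": "FEMALE",
      "languageCode": "hi-IN"
    }
  ],
  "avatarAudioMap": {
    "VOICE_ID_PLACEHOLDER": "https://example.com/samples/asha.mp3"
  }
}
```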
The directory `app/mediaApis` contains a Google Service Account JSON key file (e.g., `axiomatic-treat-xxxx.json`).

- This file is essential for Google Cloud services (such as Vertex AI or Cloud Storage) to function.
- Ensure the file is present and correctly referenced by the code (e.g., in `vertex.ts` or the `google-cloud` initialization).
- Do not commit real credentials to public repositories.
Install dependencies and start the development server:

```bash
npm install
npm run dev
```

The application will be available at http://localhost:3000.
This project uses Revideo. You can start the editor to refine templates:
```bash
npm run revideo:editor
```

To debug the video rendering process without regenerating all assets (which costs money and time):
- Save Data: The system saves generation data to `data/debug_render_data.json` during a successful run.
- Static Generation: You can trigger a "Static Gen" mode.
  - When the `staticGen` flag is set to `true` (see `app/videoGenerator.ts`), the system skips LLM/media generation.
  - It reads parameters directly from `data/debug_render_data.json`.
  - It passes these parameters to `renderPersonalizedVideo`.
This allows you to tweak the rendering logic (React components, animations) and instantly see the result using previously generated assets.
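The flow above can be sketched roughly as follows. This is an illustrative assumption of how the branch might look, not the actual contents of `app/videoGenerator.ts`; the `generateAssets` callback and the parameter shape are hypothetical stand-ins:

```typescript
// staticGen sketch — illustrative only. Only the staticGen flag and the
// debug data path come from the README; everything else is assumed.
import * as fs from "fs";

const DEBUG_DATA_PATH = "data/debug_render_data.json";

type RenderParams = Record<string, unknown>;

export async function getRenderParams(
  staticGen: boolean,
  generateAssets: () => Promise<RenderParams>,
): Promise<RenderParams> {
  if (staticGen) {
    // Skip LLM/media generation; reuse the last successful run's data.
    return JSON.parse(fs.readFileSync(DEBUG_DATA_PATH, "utf8"));
  }
  const params = await generateAssets();
  // Persist the fresh parameters for future static-gen debugging runs.
  fs.mkdirSync("data", { recursive: true });
  fs.writeFileSync(DEBUG_DATA_PATH, JSON.stringify(params, null, 2));
  return params;
}
```

The returned parameters would then be handed to `renderPersonalizedVideo`, so rendering-side changes can be iterated on without paying for new LLM or media calls.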