An AI-powered digital twin application deployed on AWS, featuring a conversational interface powered by Amazon Bedrock and a modern Next.js frontend.
This project creates a personalized AI digital twin that can answer questions and engage in conversations based on custom training data. The application is fully deployed on AWS infrastructure using Infrastructure as Code (Terraform) with automated CI/CD via GitHub Actions.
The application uses a serverless architecture on AWS:
- Frontend: Next.js static site hosted on S3 and served via CloudFront CDN
- Backend: Python FastAPI application running on AWS Lambda
- AI Model: Amazon Bedrock (Nova Micro) for conversational AI
- Memory: S3 bucket for persistent conversation history
- API: API Gateway for HTTP endpoints
- Infrastructure: Terraform for infrastructure management
- CI/CD: GitHub Actions for automated deployment
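The request flow (CloudFront → API Gateway → Lambda → Bedrock, with history in S3) can be sketched as a minimal Lambda-style handler. The endpoint payload shape and function name below are illustrative assumptions, not the project's actual API contract:

```python
import json

# Illustrative request flow: API Gateway -> Lambda -> Bedrock -> S3 memory.
# The payload shape and reply logic are assumptions for demonstration only.

def handle_chat(event: dict) -> dict:
    """Simplified stand-in for the Lambda entry point."""
    body = json.loads(event.get("body", "{}"))
    session_id = body.get("session_id", "default")
    message = body.get("message", "")

    # In the real backend, the message would be sent to Amazon Bedrock
    # (Nova Micro) via LangChain and the reply persisted to S3.
    reply = f"echo[{session_id}]: {message}"

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"reply": reply}),
    }
```

In the deployed app, Mangum adapts the FastAPI application to this Lambda event format.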
- Real-time chat interface with conversational AI
- Modern, responsive UI built with Next.js and Tailwind CSS
- Persistent conversation memory using S3
- Session management for multi-turn conversations
- Fully automated deployment pipeline
- CloudFront CDN for global low-latency access
- Secure API communication
- Multiple environment support (dev, test, prod)
- Next.js 15 - React framework with static export
- TypeScript - Type-safe development
- Tailwind CSS - Utility-first styling
- Lucide React - Icon library
- Python 3.12 - Runtime
- FastAPI - Modern web framework
- LangChain - LLM orchestration
- Amazon Bedrock - AI foundation models
- uv - Fast Python package manager
- Mangum - ASGI adapter for AWS Lambda
- Terraform - Infrastructure as Code
- AWS S3 - Static hosting & storage
- AWS CloudFront - CDN
- AWS Lambda - Serverless compute
- AWS API Gateway - HTTP API
- Amazon Bedrock - AI services
- GitHub Actions - CI/CD
- AWS Account with appropriate permissions
- GitHub account
- AWS CLI configured
- Terraform >= 1.0
- Node.js >= 20
- Python >= 3.12
- uv (Python package manager)
```bash
git clone <your-repo-url>
cd twin
```

Configure AWS credentials for GitHub Actions:
- Create an OIDC provider in AWS for GitHub Actions
- Create an IAM role with necessary permissions
- Add these secrets to your GitHub repository:
  - `AWS_ROLE_ARN`
  - `AWS_ACCOUNT_ID`
  - `DEFAULT_AWS_REGION`
```bash
cd terraform
terraform init
terraform apply -target="aws_s3_bucket.terraform_state" \
  -target="aws_s3_bucket_versioning.terraform_state" \
  -target="aws_dynamodb_table.terraform_locks"
```

Push to the main branch to trigger automatic deployment:

```bash
git push origin main
```

Or manually trigger deployment via the GitHub Actions workflow.
```
twin/
├── backend/              # Python FastAPI backend
│   ├── api.py            # Main API application
│   ├── agent.py          # LangChain agent logic
│   ├── deploy.py         # Lambda packaging script
│   ├── data/             # Training data (not in git)
│   └── pyproject.toml    # Python dependencies
├── frontend/             # Next.js frontend
│   ├── app/              # App router pages
│   ├── components/       # React components
│   └── public/           # Static assets
├── terraform/            # Infrastructure as Code
│   ├── main.tf           # Main infrastructure
│   ├── outputs.tf        # Terraform outputs
│   ├── variables.tf      # Input variables
│   └── prod.tfvars       # Production configuration
├── scripts/              # Deployment scripts
│   ├── deploy.sh         # Main deployment script
│   └── destroy.sh        # Teardown script
└── .github/
    └── workflows/        # CI/CD pipelines
        ├── deploy.yml    # Deployment workflow
        └── destroy.yml   # Destruction workflow
```
The backend uses these environment variables (set by Terraform):
- `S3_MEMORY_BUCKET` - S3 bucket for conversation memory
- `BEDROCK_MODEL_ID` - Bedrock model identifier
- `MEMORY_DIR` - Local directory for development
Key variables in `terraform/variables.tf`:

- `project_name` - Resource name prefix
- `environment` - Deployment environment (dev/test/prod)
- `bedrock_model_id` - AI model to use
- `use_custom_domain` - Enable custom domain (prod only)
- `root_domain` - Your custom domain name
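A production configuration might look like the fragment below. All values are examples (including the model ID format), not the project's actual `prod.tfvars`:

```hcl
# Illustrative prod.tfvars -- example values only
project_name      = "twin"
environment       = "prod"
bedrock_model_id  = "amazon.nova-micro-v1:0"   # assumed Bedrock model ID
use_custom_domain = true
root_domain       = "example.com"
```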
The project supports three environments using Terraform workspaces:
- dev - Development environment
- test - Testing environment
- prod - Production with optional custom domain
Each environment maintains separate infrastructure and state.
The automated deployment pipeline:
- Builds Lambda deployment package
- Initializes Terraform with remote S3 backend
- Applies infrastructure changes
- Builds Next.js frontend with API URL
- Syncs frontend to S3
- Invalidates CloudFront cache for immediate updates
To destroy all resources in an environment:
```bash
./scripts/destroy.sh <environment>
```

Or use the GitHub Actions destroy workflow.
- Personal training data in `backend/data/` is excluded from version control
- Conversation memory is stored in private S3 buckets
- API Gateway provides secure endpoint exposure
- CloudFront serves content over HTTPS
- OIDC for GitHub Actions eliminates long-lived credentials
This project is part of an AI in Production course.
This is a learning project. Feel free to fork and adapt for your own use!
If updates don't appear after deployment, CloudFront cache may need invalidation:
```bash
cd terraform
DIST_ID=$(terraform output -raw cloudfront_distribution_id)
aws cloudfront create-invalidation --distribution-id $DIST_ID --paths "/*"
```

The deployment script now handles this automatically.
Rebuild the package manually:
```bash
cd backend
uv run deploy.py
```

Built with ❤️ as part of the AI in Production course