A production-ready product ingestion and synchronization platform with a React frontend and a Python FastAPI backend.
- Overview
- Architecture
- Features
- Tech Stack
- Project Structure
- Installation
- API Documentation
- Frontend Structure
- Database Schema
- Development Guide
- Deployment
Arkhon Integration Platform is a comprehensive system for managing product feeds from external sources (like BigBuy), processing them through an ETL pipeline, and synchronizing product data with outbound systems. The platform provides a complete admin interface for managing products, categories, manufacturers, and monitoring system operations.
```
┌─────────────┐      ┌─────────────┐      ┌─────────────┐
│    React    │─────▶│   FastAPI   │─────▶│ PostgreSQL  │
│  Frontend   │      │   Backend   │      │  Database   │
└─────────────┘      └─────────────┘      └─────────────┘
                            │
                            ├─────▶ Redis (Cache/Queue)
                            │
                            ├─────▶ Celery Workers
                            │
                            └─────▶ FTP Servers
```
Frontend (React + TypeScript)
- Modern SPA with React 18
- TailwindCSS for styling
- React Query for data fetching
- Zustand for state management
- React Router for navigation
Backend (FastAPI + Python)
- Async/await throughout
- JWT authentication
- SQLAlchemy 2.x ORM
- Pydantic validation
- Structured logging
Background Workers (Celery)
- FTP feed fetching
- CSV import processing
- Outbound synchronization
- Scheduled tasks
Database (PostgreSQL/MySQL)
- EAV model for flexible attributes
- Full-text search capabilities
- Optimized indexes
- JWT-based authentication with refresh tokens
- Role-based access control (admin, staff, viewer)
- Token rotation for security
- Session management
- Real-time system metrics
- Product statistics
- Feed processing status
- Activity timeline
- Visual charts and graphs
- Automated FTP feed fetching
- CSV import with progress tracking
- Server-side filtering and pagination
- Feed retry mechanism
- Detailed import logs
- File preview
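A chunked import along these lines keeps memory flat on large feeds and gives natural progress checkpoints. This is a sketch, not the actual `import_service` code; in the real system the chunk size comes from `CSV_CHUNK_SIZE` in `.env`.

```python
import csv
import io

def iter_csv_chunks(fileobj, chunk_size=1000):
    """Yield lists of row dicts so progress can be reported per chunk."""
    reader = csv.DictReader(fileobj)
    chunk = []
    for row in reader:
        chunk.append(row)
        if len(chunk) >= chunk_size:
            yield chunk
            chunk = []
    if chunk:  # flush the final partial chunk
        yield chunk

feed = io.StringIO("sku,price\nA1,10\nA2,12\nA3,9\n")
chunks = list(iter_csv_chunks(feed, chunk_size=2))  # two chunks: 2 rows + 1 row
```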
- Comprehensive product CRUD
- Advanced filtering and search
- Bulk update operations (price/quantity)
- EAV attribute system
- Image gallery
- Category management
- Manufacturer management
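For bulk price updates, the API section later in this document shows a request with `"mode": "percent"` and a numeric `value`; the arithmetic behind that mode presumably looks like this (a sketch of the assumed semantics, rounding to cents):

```python
def apply_percent_update(price: float, value: float) -> float:
    """New unit price after a percent-mode bulk update (value=10 means +10%)."""
    return round(price * (1 + value / 100), 2)

# Apply a 10% increase to a few products (hypothetical SKUs and prices).
updated = {sku: apply_percent_update(p, 10) for sku, p in {"A1": 20.0, "A2": 9.99}.items()}
```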
- Queue-based job processing
- Retry mechanism with exponential backoff
- Detailed job logs
- Status tracking
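The exponential backoff can be sketched as a doubling delay with a cap. The exact formula is an assumption; the base delay and attempt count correspond to `OUTBOUND_RETRY_DELAY` and `OUTBOUND_RETRY_ATTEMPTS` in the production `.env`.

```python
def retry_delay(attempt: int, base: int = 5, factor: int = 2, cap: int = 300) -> int:
    """Seconds to wait before retry N (0-indexed): base * factor**N, capped."""
    return min(base * factor ** attempt, cap)

# First three retries: 5s, 10s, 20s
delays = [retry_delay(n) for n in range(3)]
```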
- System-wide logging
- Real-time log tailing
- Filterable log viewer
- Cron job monitoring
- Manual task triggering
- Framework: FastAPI 0.109+
- Database: PostgreSQL/MySQL with async support
- ORM: SQLAlchemy 2.x (async)
- Migrations: Alembic
- Queue: Celery + Redis
- Authentication: python-jose (JWT)
- Validation: Pydantic
- Testing: pytest + testcontainers
- Framework: React 18 + TypeScript
- Build Tool: Vite
- Styling: TailwindCSS
- State Management: Zustand
- Data Fetching: TanStack Query (React Query)
- Forms: React Hook Form + Zod
- Routing: React Router v6
- Charts: Recharts
- Icons: Lucide React
- Containerization: Docker + Docker Compose
- Web Server: Nginx (production)
- Process Manager: Uvicorn (ASGI)
- Monitoring: Prometheus metrics
```
arkhon/
├── backend/
│   ├── alembic/                  # Database migrations
│   │   ├── versions/
│   │   ├── env.py
│   │   └── script.py.mako
│   ├── app/
│   │   ├── api/
│   │   │   ├── v1/               # API endpoints
│   │   │   │   ├── auth.py
│   │   │   │   ├── dashboard.py
│   │   │   │   ├── feeds.py
│   │   │   │   ├── products.py
│   │   │   │   ├── outbound.py
│   │   │   │   ├── manufacturers.py
│   │   │   │   ├── categories.py
│   │   │   │   ├── attributes.py
│   │   │   │   ├── logs.py
│   │   │   │   ├── settings.py
│   │   │   │   └── cron.py
│   │   │   └── dependencies.py
│   │   ├── core/
│   │   │   ├── config.py         # Settings
│   │   │   ├── database.py       # DB connection
│   │   │   └── security.py       # Auth utils
│   │   ├── models/               # SQLAlchemy models
│   │   │   ├── user.py
│   │   │   ├── feed.py
│   │   │   ├── product.py
│   │   │   ├── eav.py
│   │   │   ├── outbound.py
│   │   │   └── log.py
│   │   ├── schemas/              # Pydantic schemas
│   │   ├── services/
│   │   │   ├── ftp_service.py
│   │   │   └── import_service.py
│   │   ├── workers/
│   │   │   ├── celery_app.py
│   │   │   └── tasks.py
│   │   └── main.py               # FastAPI app
│   ├── requirements.txt
│   ├── Dockerfile
│   └── .env.example
├── frontend/
│   ├── src/
│   │   ├── components/
│   │   │   ├── layout/           # Layout components
│   │   │   ├── shared/           # Reusable UI components
│   │   │   ├── auth/             # Auth components
│   │   │   ├── dashboard/        # Dashboard widgets
│   │   │   ├── feeds/            # Feed components
│   │   │   ├── products/         # Product components
│   │   │   └── ...
│   │   ├── pages/                # Page components
│   │   ├── services/             # API services
│   │   ├── stores/               # Zustand stores
│   │   ├── hooks/                # Custom hooks
│   │   ├── types/                # TypeScript types
│   │   ├── utils/                # Utilities
│   │   ├── App.tsx
│   │   ├── main.tsx
│   │   └── index.css
│   ├── package.json
│   ├── tsconfig.json
│   ├── vite.config.ts
│   ├── tailwind.config.js
│   └── Dockerfile
├── docker-compose.yml
└── README.md
```
For Local Development (Docker):
- Docker Engine 20.10+
- Docker Compose 2.0+
- Git
For Manual Installation:
- Python 3.11+
- Node.js 20+
- PostgreSQL 14+ or MySQL 8+
- Redis 7+
- Git
For Production Server:
- Ubuntu 20.04+ / Debian 11+ / CentOS 8+
- Docker & Docker Compose
- Domain name (optional but recommended)
- SSL certificate (Let's Encrypt recommended)
This is the recommended method for local development and testing.
```bash
git clone https://github.com/your-org/arkhon.git
cd arkhon

# Copy the example environment file
cp backend/.env.example backend/.env
```

Edit `backend/.env` with your preferred text editor:

```bash
nano backend/.env
# or
vim backend/.env
# or
code backend/.env
```

Required Configuration:
```bash
# Application
APP_NAME=Arkhon Integration Platform
DEBUG=True  # Set to False in production

# Security (CHANGE THIS!)
SECRET_KEY=your-super-secret-key-change-this-in-production-use-at-least-32-chars
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7

# Database (Docker defaults)
DATABASE_URL=postgresql+asyncpg://arkhon:arkhon_password@postgres:5432/arkhon

# Redis (Docker defaults)
REDIS_URL=redis://redis:6379/0
CELERY_BROKER_URL=redis://redis:6379/0
CELERY_RESULT_BACKEND=redis://redis:6379/0

# CORS (Allow frontend access)
BACKEND_CORS_ORIGINS=["http://localhost:3000","http://localhost:5173"]

# File Storage
STORAGE_TYPE=local
LOCAL_STORAGE_PATH=./storage

# FTP Settings (Configure for your data source)
BIGBUY_FTP_HOST=ftp.bigbuy.eu
BIGBUY_FTP_PORT=21
BIGBUY_FTP_USER=your-ftp-username
BIGBUY_FTP_PASSWORD=your-ftp-password
BIGBUY_FTP_PATH=/products

# Import Settings
CSV_CHUNK_SIZE=1000
MAX_IMPORT_ERRORS=100

# Logging
LOG_LEVEL=INFO
LOG_FORMAT=json
```

Generate a secure SECRET_KEY:

```bash
python -c "import secrets; print(secrets.token_urlsafe(32))"
```

Start the stack:

```bash
# Start all containers in detached mode
docker-compose up -d

# Verify all services are running
docker-compose ps
```

You should see 6 services running:

- `arkhon-postgres` (Database)
- `arkhon-redis` (Cache/Queue)
- `arkhon-backend` (FastAPI)
- `arkhon-celery-worker` (Background tasks)
- `arkhon-celery-beat` (Scheduler)
- `arkhon-frontend` (React app)
```bash
# Run database migrations
docker-compose exec backend alembic upgrade head

# Verify migration success
docker-compose exec backend alembic current
```

```bash
# Create the admin user
docker-compose exec backend python -c "
from app.core.database import AsyncSessionLocal
from app.models.user import User
from app.core.security import get_password_hash
import asyncio

async def create_admin():
    async with AsyncSessionLocal() as db:
        # Check if admin already exists
        from sqlalchemy import select
        result = await db.execute(select(User).where(User.email == 'admin@arkhon.com'))
        existing = result.scalar_one_or_none()
        if existing:
            print('Admin user already exists')
            return
        admin = User(
            email='admin@arkhon.com',
            username='admin',
            hashed_password=get_password_hash('admin123'),
            full_name='System Administrator',
            role='admin',
            is_active=True,
        )
        db.add(admin)
        await db.commit()
        print('Admin user created successfully!')
        print('Email: admin@arkhon.com')
        print('Password: admin123')
        print('⚠️ IMPORTANT: Change this password after first login!')

asyncio.run(create_admin())
"
```

Open your browser and navigate to:
- Frontend: http://localhost:3000
- Backend API: http://localhost:8000
- API Documentation: http://localhost:8000/docs
- Alternative API docs (ReDoc): http://localhost:8000/redoc
Default Login Credentials:
- Email: `admin@arkhon.com`
- Password: `admin123`
```bash
# Check backend logs
docker-compose logs -f backend

# Check celery worker logs
docker-compose logs -f celery-worker

# Check frontend logs
docker-compose logs -f frontend

# View all logs
docker-compose logs -f
```

```bash
# Stop all services
docker-compose down

# Stop and remove volumes (⚠️ deletes all data)
docker-compose down -v

# Restart a specific service
docker-compose restart backend

# View logs for a specific service
docker-compose logs -f backend

# Execute commands in a container
docker-compose exec backend bash

# Rebuild containers after code changes
docker-compose up -d --build

# View resource usage
docker stats
```

Complete guide for deploying to a production server.
- Ubuntu 20.04+ server with at least 2GB RAM
- Root or sudo access
- Domain name pointed to your server (optional)
- Open ports: 80 (HTTP), 443 (HTTPS), 22 (SSH)
```bash
# Update system packages
sudo apt update && sudo apt upgrade -y

# Install required packages
sudo apt install -y git curl wget nano ufw

# Configure firewall
sudo ufw allow 22/tcp   # SSH
sudo ufw allow 80/tcp   # HTTP
sudo ufw allow 443/tcp  # HTTPS
sudo ufw --force enable

# Verify firewall status
sudo ufw status
```

```bash
# Install Docker
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Add your user to the docker group
sudo usermod -aG docker $USER

# Install Docker Compose
sudo curl -L "https://github.com/docker/compose/releases/latest/download/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose

# Verify installations
docker --version
docker-compose --version

# Log out and back in for the group change to take effect
exit
# SSH back into the server
```

```bash
# Create application directory
sudo mkdir -p /opt/arkhon
sudo chown $USER:$USER /opt/arkhon
cd /opt/arkhon

# Clone repository
git clone https://github.com/your-org/arkhon.git .

# Create production environment file
cp backend/.env.example backend/.env
nano backend/.env
```

Production Environment Configuration:
```bash
# Application
APP_NAME=Arkhon Integration Platform
APP_VERSION=1.0.0
DEBUG=False  # IMPORTANT: Must be False in production

# Security - GENERATE A NEW SECRET KEY!
SECRET_KEY=CHANGE-THIS-TO-A-SECURE-RANDOM-STRING-AT-LEAST-32-CHARACTERS
ALGORITHM=HS256
ACCESS_TOKEN_EXPIRE_MINUTES=30
REFRESH_TOKEN_EXPIRE_DAYS=7

# Database - Use strong passwords!
DATABASE_URL=postgresql+asyncpg://arkhon:STRONG_DB_PASSWORD_HERE@postgres:5432/arkhon

# Redis
REDIS_URL=redis://redis:6379/0
CELERY_BROKER_URL=redis://redis:6379/0
CELERY_RESULT_BACKEND=redis://redis:6379/0

# CORS - Add your domain
BACKEND_CORS_ORIGINS=["https://yourdomain.com","https://www.yourdomain.com"]

# File Storage
STORAGE_TYPE=local
LOCAL_STORAGE_PATH=/app/storage

# FTP Configuration
BIGBUY_FTP_HOST=ftp.bigbuy.eu
BIGBUY_FTP_PORT=21
BIGBUY_FTP_USER=your-production-ftp-user
BIGBUY_FTP_PASSWORD=your-production-ftp-password
BIGBUY_FTP_PATH=/products

# Import & Sync Settings
CSV_CHUNK_SIZE=1000
MAX_IMPORT_ERRORS=100
OUTBOUND_RETRY_ATTEMPTS=3
OUTBOUND_RETRY_DELAY=5

# Logging
LOG_LEVEL=INFO
LOG_FORMAT=json
```

Generate secure credentials:

```bash
# Generate SECRET_KEY
python3 -c "import secrets; print(secrets.token_urlsafe(32))"

# Generate database password
python3 -c "import secrets; print(secrets.token_urlsafe(24))"
```

Create a production docker-compose override:
```bash
nano docker-compose.prod.yml
```

```yaml
version: '3.8'

services:
  postgres:
    restart: always
    environment:
      POSTGRES_PASSWORD: ${DB_PASSWORD}  # Use the strong password from .env
    volumes:
      - postgres_data:/var/lib/postgresql/data

  redis:
    restart: always
    command: redis-server --appendonly yes
    volumes:
      - redis_data:/data

  backend:
    restart: always
    environment:
      - DEBUG=False
    build:
      context: ./backend
      dockerfile: Dockerfile
    volumes:
      - storage_data:/app/storage

  celery-worker:
    restart: always
    environment:
      - DEBUG=False

  celery-beat:
    restart: always
    environment:
      - DEBUG=False

  frontend:
    restart: always
    build:
      context: ./frontend
      dockerfile: Dockerfile
      target: production
    ports:
      - "80:80"  # If running host Nginx (below), change to "3000:80" to avoid a port clash
    environment:
      - VITE_API_URL=https://yourdomain.com/api/v1

volumes:
  postgres_data:
  redis_data:
  storage_data:
```

If using a custom domain with SSL:
```bash
# Install Nginx
sudo apt install -y nginx certbot python3-certbot-nginx

# Create Nginx configuration
sudo nano /etc/nginx/sites-available/arkhon
```

```nginx
# Redirect HTTP to HTTPS
server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;
    return 301 https://$server_name$request_uri;
}

# HTTPS Configuration
server {
    listen 443 ssl http2;
    server_name yourdomain.com www.yourdomain.com;

    # SSL certificates (will be added by certbot)
    ssl_certificate /etc/letsencrypt/live/yourdomain.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/yourdomain.com/privkey.pem;

    # Frontend
    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }

    # Backend API
    location /api {
        proxy_pass http://localhost:8000;
        proxy_http_version 1.1;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_set_header Host $host;

        # Increase timeouts for long-running requests
        proxy_connect_timeout 600;
        proxy_send_timeout 600;
        proxy_read_timeout 600;
        send_timeout 600;
    }

    # API Documentation
    location /docs {
        proxy_pass http://localhost:8000/docs;
    }

    # Security headers
    add_header X-Frame-Options "SAMEORIGIN" always;
    add_header X-Content-Type-Options "nosniff" always;
    add_header X-XSS-Protection "1; mode=block" always;

    # Gzip compression
    gzip on;
    gzip_vary on;
    gzip_min_length 1024;
    gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/xml+rss application/javascript application/json;
}
```

```bash
# Enable the site
sudo ln -s /etc/nginx/sites-available/arkhon /etc/nginx/sites-enabled/
sudo rm /etc/nginx/sites-enabled/default

# Test configuration
sudo nginx -t

# Obtain SSL certificate
sudo certbot --nginx -d yourdomain.com -d www.yourdomain.com

# Restart Nginx
sudo systemctl restart nginx

# Enable auto-renewal
sudo systemctl enable certbot.timer
```

```bash
cd /opt/arkhon

# Build images
docker-compose -f docker-compose.yml -f docker-compose.prod.yml build

# Start services
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d

# Verify all containers are running
docker-compose ps
```
```bash
# Run migrations
docker-compose exec backend alembic upgrade head

# Create admin user
docker-compose exec backend python -c "
from app.core.database import AsyncSessionLocal
from app.models.user import User
from app.core.security import get_password_hash
import asyncio

async def create_admin():
    async with AsyncSessionLocal() as db:
        from sqlalchemy import select
        result = await db.execute(select(User).where(User.email == 'admin@arkhon.com'))
        existing = result.scalar_one_or_none()
        if existing:
            print('Admin user already exists')
            return
        admin = User(
            email='admin@arkhon.com',
            username='admin',
            hashed_password=get_password_hash('CHANGE_THIS_PASSWORD'),
            full_name='System Administrator',
            role='admin',
            is_active=True,
        )
        db.add(admin)
        await db.commit()
        print('Admin user created!')

asyncio.run(create_admin())
"
```

Create a backup script:
```bash
sudo nano /usr/local/bin/backup-arkhon.sh
```

```bash
#!/bin/bash
BACKUP_DIR="/opt/backups/arkhon"
DATE=$(date +%Y%m%d_%H%M%S)

mkdir -p $BACKUP_DIR

# Backup database
docker exec arkhon-postgres pg_dump -U arkhon arkhon | gzip > $BACKUP_DIR/db_$DATE.sql.gz

# Backup storage files
tar -czf $BACKUP_DIR/storage_$DATE.tar.gz /opt/arkhon/storage/

# Keep only the last 7 days of backups
find $BACKUP_DIR -name "*.gz" -mtime +7 -delete

echo "Backup completed: $DATE"
```

```bash
# Make executable
sudo chmod +x /usr/local/bin/backup-arkhon.sh

# Add to crontab (daily at 2 AM)
sudo crontab -e
```

Add this line to the crontab:

```
0 2 * * * /usr/local/bin/backup-arkhon.sh >> /var/log/arkhon-backup.log 2>&1
```
```bash
# Install monitoring tools
sudo apt install -y htop nethogs

# View system resources
htop

# Monitor Docker containers
docker stats

# View application logs
docker-compose logs -f backend
docker-compose logs -f celery-worker
```

✅ Security:
- Changed default SECRET_KEY
- Changed default admin password
- Set DEBUG=False
- Configured firewall (UFW)
- SSL certificate installed
- Strong database passwords
- Updated CORS origins
✅ Services:
- All containers running
- Database migrations applied
- Admin user created
- Celery workers operational
- Celery beat scheduler running
✅ Monitoring:
- Automated backups configured
- Log rotation setup
- Health checks working
- Error notifications configured
✅ Performance:
- Database indexed properly
- Redis persistence enabled
- Nginx caching configured
- Connection pooling enabled
```bash
# View logs
docker-compose logs -f [service-name]

# Restart services
docker-compose restart

# Update application
cd /opt/arkhon
git pull
docker-compose build
docker-compose up -d

# Backup database manually
docker exec arkhon-postgres pg_dump -U arkhon arkhon > backup.sql

# Restore database
cat backup.sql | docker exec -i arkhon-postgres psql -U arkhon arkhon

# Clean up old images
docker system prune -a

# Monitor resource usage
docker stats
```

Services won't start:

```bash
# Check logs
docker-compose logs backend
docker-compose logs postgres

# Verify environment variables
docker-compose config

# Check disk space
df -h
```

Database connection errors:

```bash
# Check PostgreSQL is running
docker-compose ps postgres

# Test database connection
docker-compose exec postgres psql -U arkhon -d arkhon
```

High memory usage:

```bash
# Check resource usage
docker stats

# Restart services
docker-compose restart

# Adjust worker count in docker-compose.yml
```

```bash
# Update a development install
cd arkhon
git pull
docker-compose down
docker-compose up -d --build
docker-compose exec backend alembic upgrade head
```

```bash
# Update a production install
cd /opt/arkhon

# Backup first!
/usr/local/bin/backup-arkhon.sh

# Pull updates
git pull

# Rebuild and restart
docker-compose -f docker-compose.yml -f docker-compose.prod.yml down
docker-compose -f docker-compose.yml -f docker-compose.prod.yml build
docker-compose -f docker-compose.yml -f docker-compose.prod.yml up -d

# Run migrations
docker-compose exec backend alembic upgrade head

# Verify
docker-compose ps
```

Login with credentials:
```json
{
  "access_token": "eyJ...",
  "refresh_token": "eyJ...",
  "token_type": "bearer"
}
```

Refresh access token:

```json
{
  "refresh_token": "eyJ..."
}
```

Logout user:

```json
{
  "refresh_token": "eyJ..."
}
```

Get current user info
Get dashboard statistics
Response:
```json
{
  "total_products": 15420,
  "total_outbound_jobs": 342,
  "pending_inbound_feeds": 2,
  "failed_inbound_feeds": 1,
  "queue_status": {
    "pending": 5
  },
  "recent_activity": {
    "products_last_7_days": 1250,
    "feeds_last_7_days": 7
  }
}
```

Get activity timeline

List inbound feeds (paginated)

Query Parameters:

- `page`: Page number (default: 1)
- `page_size`: Items per page (default: 20)
- `entity_code`: Filter by entity
- `status`: Filter by status
- `search`: Search text
- `start_date`, `end_date`: Filter by date range
Get feed details
Get feed processing logs
Retry failed feed import
List products (paginated)
Query Parameters:
- `page`, `page_size`: Pagination
- `sku`, `name`: Filter by text
- `category_id`, `manufacturer_id`: Filter by relations
- `min_price`, `max_price`: Price range
- `min_stock`, `max_stock`: Stock range
- `search`: Full-text search
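A client call with these filters might be assembled like this (a sketch; the host and `/api/v1` prefix assume the default local setup described earlier):

```python
from urllib.parse import urlencode

API_BASE = "http://localhost:8000/api/v1"  # default local backend

def products_url(**filters) -> str:
    """Build the paginated, filtered product listing URL, skipping unset filters."""
    params = {k: v for k, v in filters.items() if v is not None}
    query = urlencode(params)
    return f"{API_BASE}/products?{query}" if query else f"{API_BASE}/products"

url = products_url(page=1, page_size=20, min_price=10, search="desk lamp")
```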
Get product details
Update product
Bulk update products
Request:
```json
{
  "product_ids": [1, 2, 3],
  "target_field": "price",
  "mode": "percent",
  "value": 10,
  "reason": "10% price increase"
}
```

List outbound jobs
Get job details
Retry failed job
List manufacturers
Create manufacturer
Get manufacturer
Update manufacturer
Delete manufacturer
List categories
Get category
Get products in category
List attributes
Create attribute
Get attribute
Update attribute
Delete attribute
List system logs
Tail logs (real-time)
Get settings
Update settings
Get cron status
Trigger task manually
Available tasks:
- `fetch_feeds`
- `import_feeds`
- `sync_outbound`
```
App
└── SidebarLayout
    ├── VerticalSideNav
    ├── TopBar
    └── PageContainer
        └── [Page Components]
```

- `SidebarLayout`: Main layout with sidebar
- `VerticalSideNav`: Navigation sidebar
- `TopBar`: Top navigation bar
- `PageContainer`: Page wrapper

Shared UI components:

- `DataTable`: Reusable table with pagination
- `FilterBar`: Filter controls
- `Pagination`: Pagination component
- `Modal`: Modal dialog
- `ConfirmDialog`: Confirmation dialog
- `LoadingSpinner`: Loading indicator
- `EmptyState`: Empty state message
- `ErrorBanner`: Error display

Form components:

- `TextInput`, `NumberInput`, `Textarea`
- `Select`, `MultiSelect`
- `DateRangePicker`
- `SearchInput`
- `Toggle`

Auth state (Zustand) and data fetching (React Query):

```typescript
const { user, isAuthenticated, login, logout } = useAuthStore();

const { data, isLoading, error } = useQuery({
  queryKey: ['products', filters],
  queryFn: () => productsApi.list(filters),
});
```

users
- Authentication and user management
- Role-based access control
inbound_feeds
- Track imported feed files
- Processing status and statistics
outbound_jobs
- Outbound synchronization jobs
- Retry mechanism
products
- Core product data
- Unique constraint on (sku, entity_code)
manufacturers
- Manufacturer information
categories
- Hierarchical category structure
- Path-based navigation
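Path-based navigation usually means each category stores its materialized path, so ancestors can be derived without recursive queries. A sketch with an assumed `/`-separated path format (the actual column format is not shown in this document):

```python
def ancestor_paths(path: str) -> list[str]:
    """'electronics/phones/android' -> ['electronics/phones', 'electronics']."""
    parts = path.split("/")
    return ["/".join(parts[:i]) for i in range(len(parts) - 1, 0, -1)]

crumbs = ancestor_paths("electronics/phones/android")  # nearest ancestor first
```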
attributes (EAV)
- Flexible attribute definitions
- Support multiple data types
attribute_values (EAV)
- Product attribute values
- Type-specific columns
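The EAV split can be illustrated with an in-memory SQLite sketch. Table and column names here are assumptions based on this section, not the actual Alembic migration:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE attributes (
    id INTEGER PRIMARY KEY,
    code TEXT,
    data_type TEXT          -- 'text', 'number', ...
);
CREATE TABLE attribute_values (
    product_id INTEGER,
    attribute_id INTEGER,
    value_text TEXT,        -- type-specific columns:
    value_number REAL       -- only one is populated per row
);
""")
db.execute("INSERT INTO attributes VALUES (1, 'color', 'text'), (2, 'weight_kg', 'number')")
db.execute("INSERT INTO attribute_values VALUES (42, 1, 'red', NULL), (42, 2, NULL, 1.5)")

# Pivot one product's attributes back into name -> value pairs.
rows = db.execute("""
    SELECT a.code, COALESCE(av.value_text, av.value_number) AS value
    FROM attribute_values av
    JOIN attributes a ON a.id = av.attribute_id
    WHERE av.product_id = 42
""").fetchall()
```

The payoff of EAV is that new attributes need no schema change; the cost is the join (and type dispatch) on every read.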
system_logs
- Comprehensive system logging
1. Create a schema in `backend/app/schemas/`
2. Add an endpoint in `backend/app/api/v1/`
3. Register it in the router

Example:
```python
# schemas/widget.py
class WidgetResponse(BaseModel):
    id: int
    name: str

# api/v1/widgets.py
@router.get("", response_model=PaginatedResponse[WidgetResponse])
async def list_widgets(db: AsyncSession = Depends(get_db)):
    # Implementation
    pass
```

Run the backend tests:

```bash
cd backend
pytest
pytest --cov=app tests/
```

Manage database migrations:

```bash
# Create migration
alembic revision --autogenerate -m "Description"

# Apply migration
alembic upgrade head

# Rollback
alembic downgrade -1
```

To add a new frontend page:

1. Create the page component in `src/pages/`
2. Add a route in `App.tsx`
3. Add a navigation link in `SidebarLayout`
```typescript
// src/services/index.ts
export const widgetsApi = {
  list: (params: FilterParams) =>
    apiClient.get<PaginatedResponse<Widget>>('/widgets', params),
};

// In a component
const { data } = useQuery({
  queryKey: ['widgets'],
  queryFn: () => widgetsApi.list({}),
});
```

```bash
# Production build
cd frontend
npm run build
```

1. Build images

   ```bash
   docker-compose -f docker-compose.prod.yml build
   ```

2. Configure secrets
   - Use environment variables or secrets management
   - Update `SECRET_KEY`
   - Configure database credentials
   - Set FTP credentials

3. Run migrations

   ```bash
   docker-compose -f docker-compose.prod.yml run backend alembic upgrade head
   ```

4. Start services

   ```bash
   docker-compose -f docker-compose.prod.yml up -d
   ```

See `backend/.env.example` for all configuration options.
- Scale Celery workers: `docker-compose up -d --scale celery-worker=3`
- Use a load balancer for the backend API
- Serve frontend static assets from a CDN
- Use a managed database service
MIT License - see LICENSE file
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/AmazingFeature`)
3. Commit your changes (`git commit -m 'Add AmazingFeature'`)
4. Push to the branch (`git push origin feature/AmazingFeature`)
5. Open a Pull Request
For issues and questions, please open an issue on GitHub.
Built with ❤️ using React, FastAPI, and modern web technologies