DataFlow is a production-ready autonomous data analysis system with a beautiful glassmorphism UI. Upload CSV files and get AI-powered insights, interactive charts, predictions, and comprehensive analytics.
- CSV Upload and Processing - Drag and drop CSV files for instant analysis
- Auto Data Profiling - Automatic data type detection, statistics, and quality assessment
- Data Cleaning - Automatic handling of missing values, outliers, and duplicates
- Feature Engineering - Intelligent feature creation and transformation
- Smart Chart Selection - AI recommends the best visualization for your data
- Automated Insights - Generate business insights using LLM (Hyperbolic API)
- Anti-Hallucination - Validates AI-generated insights against actual data
- Custom Analytics - Tableau and Power BI style free-form workspace
- 30+ Chart Types - Bar, line, area, pie, donut, treemap, scatter, heatmap, and more
- Interactive Charts - Zoom, pan, hover tooltips, real-time updates
- Prediction Charts - Trend projection, moving averages, growth rate forecasting
- Custom Analytics - Build custom dashboards with multiple charts
- My Dashboard - Pin and arrange favorite visualizations
- PDF Export - Export dashboards and reports as PDF
- HTML Reports - Beautiful dark-themed reports with all analytics
- Currency-Aware KPIs - Support for USD, IDR, EUR, GBP, JPY, SGD
- Docker and Docker Compose
- Python 3.12+ (for local development)
- Node.js 20+ (for frontend development)
```bash
# Clone the repository
git clone https://github.com/Adefebrian/DataFlow.git
cd DataFlow
```

```bash
# Start production (nginx static build)
make up
# Or without make:
docker compose --profile prod up -d --build
```

```bash
# Start with Vite hot-reload
make dev
# Or without make:
docker compose --profile dev up -d
```

Access: http://localhost:3000
Login: admin / admin123
```bash
# Create virtual environment
python -m venv venv
source venv/bin/activate    # Linux/Mac
# or: venv\Scripts\activate # Windows

# Install dependencies
pip install -r requirements.txt

# Copy environment file
cp .env.example .env
# Edit .env with your settings

# Run the server
python main.py
```

```bash
cd frontend
npm install
npm run dev
```

Create a `.env` file based on `.env.example`:
| Variable | Description | Default |
|---|---|---|
| DATABASE_URL | PostgreSQL connection string | SQLite (local) |
| REDIS_URL | Redis connection string | None |
| HYPERBOLIC_API_KEY | API key for AI insights | Required |
| USE_SQLITE | Use SQLite instead of PostgreSQL | true |
| R2_* | Cloudflare R2 storage (optional) | None |
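A minimal `.env` reflecting the variables above (all values are placeholders, not real credentials — copy `.env.example` and adjust):

```env
# Use local SQLite instead of PostgreSQL
USE_SQLITE=true
# DATABASE_URL=postgresql://user:pass@localhost:5432/dataflow
# REDIS_URL=redis://localhost:6379/0

# Required for AI insights
HYPERBOLIC_API_KEY=your-key-here
```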
- Visit hyperbolic.xyz
- Sign up for an account
- Generate an API key
- Add it to your `.env` file
```
DataFlow/
├── Makefile                 # Docker shortcuts
├── docker-compose.yml       # Multi-container setup
├── Dockerfile               # Backend (Python/FastAPI)
├── requirements.txt         # Python dependencies
├── main.py                  # Entry point
├── .env.example             # Environment template
├── .gitignore               # Git ignore rules
│
├── src/
│   ├── api/
│   │   ├── routes.py              # Main API routes
│   │   ├── charts.py              # Chart generation
│   │   ├── insights.py            # AI insights
│   │   ├── prediction_charts.py   # Forecasting
│   │   └── ...
│   ├── agents/
│   │   ├── data_profiler.py       # Data analysis
│   │   ├── feature_engineer.py    # Feature creation
│   │   ├── report_generator.py    # Report generation
│   │   └── ...
│   ├── services/
│   │   ├── llm.py                 # LLM integration
│   │   ├── storage.py             # File storage
│   │   └── ...
│   └── ...
│
└── frontend/
    ├── Dockerfile           # Frontend build
    ├── nginx.conf           # Production server
    ├── vite.config.ts       # Vite configuration
    ├── package.json         # Node dependencies
    └── src/
        ├── pages/           # React pages
        ├── components/      # UI components
        ├── hooks/           # Custom hooks
        └── ...
```
| Method | Endpoint | Description |
|---|---|---|
| POST | /auth/login | User authentication |
| POST | /upload | Upload CSV file |
| POST | /pipeline/run | Start analysis pipeline |
| GET | /pipeline/{job_id}/status | Get job status |
| GET | /pipeline/{job_id}/analytics | Get analytics results |
| GET | /pipeline/all | List all jobs |
| GET | /health | Health check |
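A typical client polls the status endpoint until the pipeline finishes, then fetches the analytics. A minimal polling helper — the `{"status": ...}` response shape and the `"completed"`/`"failed"` values are assumptions; check the interactive docs FastAPI serves at `/docs` on a running instance for the real schema:

```python
# Illustrative polling helper for GET /pipeline/{job_id}/status.
# The response shape ({"status": "running" | "completed" | "failed"})
# is an assumption, not DataFlow's documented schema.
import time
from typing import Callable

def wait_for_job(get_status: Callable[[], dict],
                 poll_seconds: float = 2.0, timeout: float = 300.0) -> dict:
    """Poll `get_status` until the job reaches a terminal state or we time out."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status()
        if status.get("status") in ("completed", "failed"):
            return status
        time.sleep(poll_seconds)
    raise TimeoutError("pipeline job did not finish in time")
```

With `requests` installed this plugs straight in, e.g. `wait_for_job(lambda: requests.get(f"{base}/pipeline/{job_id}/status").json())`.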
```bash
make up       # Production build
make dev      # Development mode
make down     # Stop all containers
make rebuild  # Force rebuild
make logs     # View all logs
make logs-api # View API logs
make shell    # Shell into container
```

- Push to GitHub
- Connect GitHub to Render
- Create a new Web Service
- Settings:
  - Build Command: `pip install -r requirements.txt`
  - Start Command: `uvicorn src.api.routes:app --host 0.0.0.0 --port $PORT`
- Install Railway CLI
- Run: railway init
- Run: railway up
- Set environment variables in Railway dashboard
- Install Fly CLI
- Run: fly launch
- Run: fly deploy
- Create a `Procfile` with: `web: uvicorn src.api.routes:app --host 0.0.0.0 --port $PORT`
- Run: heroku create
- Run: git push heroku main
- FastAPI - Modern Python web framework
- Pandas - Data manipulation
- LangGraph - AI agent orchestration
- Plotly - Interactive charts
- SQLite/PostgreSQL - Database
- React 18 - UI framework
- Vite - Build tool
- Tailwind CSS - Styling
- Plotly.js - Chart rendering
- Recharts - Additional charts
The platform features a modern glassmorphism design with:
- Dark theme by default
- Interactive charts with tooltips
- Drag and drop file upload
- Custom analytics workspace
- Dashboard with pinned charts
Personal Use Only - This software is for personal use only. Commercial use is not permitted.
For questions or inquiries:
- Instagram: @prerich.brian
- Email: adefebrianpro@gmail.com
Built with love using FastAPI, React, and Plotly