Bridgette is a comprehensive financial data processing platform designed to bridge the gap between traditional banking and modern digital solutions. It provides secure, efficient, and user-friendly banking services with AI-powered schema matching capabilities.
- AI-Powered Schema Matching: Uses the OpenAI API for intelligent mapping between different bank data formats
- Multi-Format Support: Handles CSV, Excel (.xlsx, .xls) files with up to 50MB per file
- Real-Time Processing: Live file upload with immediate feedback and progress tracking
- Responsive Design: Beautiful, modern UI that works on desktop, tablet, and mobile
- Secure Processing: Local file processing with no data transmission to third parties
- Fallback Mechanisms: Reliable operation even when AI services are unavailable
- Excel Generation: Creates unified Excel files from processed data
- Modern UI/UX: Smooth animations, drag-and-drop uploads, and intuitive navigation
- API Server: RESTful endpoints for file processing and data management
- Schema Analysis: Intelligent mapping between different bank data formats
- OpenAI Integration: AI-powered schema matching and data processing
- File Management: Organized storage with bank-specific directories
- Excel Generation: Creates unified output files with customer data consolidation
- Responsive Design: Mobile-first approach with modern CSS Grid and Flexbox
- Interactive UI: Drag-and-drop file uploads with real-time feedback
- API Communication: Seamless backend integration with error handling
- Progressive Enhancement: Works without JavaScript for basic functionality
- Cache Management: Intelligent caching with version control for updates
- Python 3.11+ with pip
- Node.js 18+ (for development tools)
- OpenAI API Key (for AI features)
- Modern web browser (Chrome, Firefox, Safari, Edge)
```bash
# Clone the repository
git clone https://github.com/your-username/bridgette.git
cd bridgette

# Install Python dependencies
cd backend
pip install -r requirements.txt
cd ..

# Install Node.js dependencies (optional, for development)
npm install
```

```bash
# Copy environment template
cp env.example .env
```

Edit the `.env` file with your OpenAI API key:

```bash
OPENAI_API_KEY=your_openai_api_key_here
FLASK_DEBUG=True
FLASK_HOST=0.0.0.0
```

```bash
# Windows
start_servers.bat

# Linux/Mac
./start_servers.sh
```

```bash
# Terminal 1: Backend Server
cd backend
python app.py

# Terminal 2: Frontend Server
cd frontend
python -m http.server 8080
```

- Frontend: http://localhost:8080
- Backend API: http://localhost:5001
- Health Check: http://localhost:5001/api/health
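Once the servers are up, a quick way to confirm the backend is reachable is to query the health endpoint from the command line (the exact response body isn't specified here, but the request should succeed with HTTP 200):

```shell
# Query the backend health endpoint
curl http://localhost:5001/api/health
```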
- Bank Schema Files: Upload schema files that define the mapping between different bank formats
- Drag & Drop: Simply drag files onto the upload areas
- Multiple Files: Upload multiple schema files for comprehensive mapping
- Regular Data Files: Upload your actual financial data files
- Format Support: CSV, Excel (.xlsx, .xls) files
- File Size: Up to 50MB per file
- Unlimited Count: Upload as many files as needed
- AI Processing: The system uses OpenAI API to intelligently match schemas
- Fallback Mode: If AI is unavailable, uses rule-based matching
- Real-Time Feedback: Progress indicators and status updates
- Error Handling: Clear error messages and recovery suggestions
- Excel Output: Download unified Excel file with processed data
- Customer Consolidation: Data is merged and organized by customer
- Automatic Cleanup: Files are automatically cleaned up after download
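The consolidation step can be pictured as a group-by-customer merge; a minimal sketch using pandas (the column names here are illustrative, not the project's actual schema):

```python
import pandas as pd

# Illustrative records from two banks, after schema matching has
# normalized both sources to the same column layout.
bank1 = pd.DataFrame({"customer_id": [1, 2], "balance": [100.0, 250.0]})
bank2 = pd.DataFrame({"customer_id": [2, 3], "balance": [50.0, 75.0]})

# Stack both sources, then consolidate per customer.
combined = pd.concat([bank1, bank2], ignore_index=True)
unified = combined.groupby("customer_id", as_index=False)["balance"].sum()
print(unified)  # one row per customer; customer 2 holds 300.0
```

In the real pipeline, the unified frame is then written out as the downloadable Excel file.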
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/health` | Health check and server status |
| POST | `/api/process-files` | Upload and process files |
| POST | `/api/trigger-main-processing` | Start AI-powered processing |
| POST | `/api/download-files` | Get download link for processed files |
| GET | `/api/download-excel/<filename>` | Download specific Excel file |
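As an illustration, a processing session driven entirely from the command line might look like this (the `files` form-field name is an assumption, not confirmed by this README):

```shell
# 1. Upload a data file
curl -X POST -F "files=@transactions.csv" \
  http://localhost:5001/api/process-files

# 2. Trigger AI-powered processing
curl -X POST http://localhost:5001/api/trigger-main-processing

# 3. Request a download link for the unified output
curl -X POST http://localhost:5001/api/download-files
```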
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/uploaded-files` | List all uploaded files |
| GET | `/api/uploaded-files/<filename>` | Get specific file info |
| DELETE | `/api/uploaded-files/<filename>` | Delete uploaded file |
| POST | `/api/cleanup-json-files` | Clean up temporary files |
- `?schema=true` - Process as schema files
- `?box=1` or `?box=2` - Specify upload box (bank1 or bank2)
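For example, a schema file destined for the first bank's upload box could be sent as (again assuming a `files` form field):

```shell
curl -X POST -F "files=@bank1_schema.csv" \
  "http://localhost:5001/api/process-files?schema=true&box=1"
```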
```
bridgette/
├── backend/               # Backend API server
│   ├── app.py             # Main Flask application
│   ├── main.py            # Core processing logic
│   ├── config.py          # Configuration management
│   ├── requirements.txt   # Python dependencies
│   ├── wsgi.py            # WSGI configuration
│   └── uploaded_files/    # File storage
│       ├── bank1/         # Bank 1 files
│       └── bank2/         # Bank 2 files
├── frontend/              # Frontend application
│   ├── index.html         # Main HTML file
│   ├── script.js          # JavaScript functionality
│   ├── styles.css         # CSS styles
│   ├── server.py          # Development server
│   └── images/            # Assets
├── schema_analysis/       # Schema analysis results
├── .env                   # Environment variables
├── docker-compose.yml     # Docker configuration
├── Dockerfile             # Docker image
└── README.md              # This file
```
```bash
# Start development servers
npm run dev

# Run backend only
cd backend && python app.py

# Run frontend only
cd frontend && python server.py

# Run with cache-busting
cd frontend && python server.py

# Install dependencies
pip install -r backend/requirements.txt
npm install
```

- Comprehensive Comments: All code is thoroughly documented
- Error Handling: Graceful error handling with user feedback
- Type Safety: Python type hints where applicable
- Security: Input validation and sanitization
- Performance: Optimized for large file processing
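The input-validation rules mentioned above, together with the documented 50MB and file-type limits, can be sketched as a small helper (a hypothetical illustration, not the project's actual code):

```python
import os

# Limits as documented in this README
ALLOWED_EXTENSIONS = {".csv", ".xlsx", ".xls"}
MAX_FILE_SIZE = 50 * 1024 * 1024  # 50MB per file

def validate_upload(filename: str, size_bytes: int) -> tuple[bool, str]:
    """Return (ok, message) for an incoming upload."""
    ext = os.path.splitext(filename)[1].lower()
    if ext not in ALLOWED_EXTENSIONS:
        return False, f"Unsupported file type: {ext or 'none'}"
    if size_bytes > MAX_FILE_SIZE:
        return False, "File exceeds the 50MB limit"
    return True, "OK"
```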
```bash
# 1. Connect GitHub repository to Vercel
# 2. Set environment variables in Vercel dashboard:
#    - OPENAI_API_KEY=your_key_here
#    - FLASK_DEBUG=false
# 3. Deploy automatically
```

```bash
# Build and run with Docker Compose
docker-compose up --build

# Or build manually
docker build -t bridgette .
docker run -p 5000:5000 bridgette
```

```bash
# 1. Upload files to your hosting provider
# 2. Install Python dependencies
pip install -r backend/requirements.txt

# 3. Configure WSGI using backend/wsgi.py
# 4. Set environment variables
export FLASK_DEBUG=false
export PORT=5000
```

| Variable | Description | Default | Required |
|---|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key for AI features | - | Yes |
| `FLASK_DEBUG` | Enable debug mode | `false` | No |
| `FLASK_HOST` | Host to bind to | `0.0.0.0` | No |
| `PORT` | Port to run on | `5001` | No |
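Putting the table together, a complete `.env` for local development might look like:

```shell
OPENAI_API_KEY=your_openai_api_key_here
FLASK_DEBUG=True
FLASK_HOST=0.0.0.0
PORT=5001
```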
- Local Processing: All data processing happens locally
- No Data Storage: Files are processed and immediately cleaned up
- Secure API Keys: Environment variable management
- Input Validation: Comprehensive file type and size validation
- Error Handling: Secure error messages without data exposure
- No Third-Party Tracking: No analytics or tracking scripts
- Local File Handling: Files never leave your server
- Automatic Cleanup: Temporary files are automatically removed
- Secure Headers: CORS and security headers configured
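How such headers get attached is implementation-specific; one way to picture it is a small WSGI middleware wrapped around the app exposed by `backend/wsgi.py` (a hypothetical sketch, not the project's actual configuration):

```python
# Hypothetical sketch: attach CORS/security headers to every response
# of a WSGI application before serving it.
def with_security_headers(app, allowed_origin="http://localhost:8080"):
    def wrapped(environ, start_response):
        def start_with_headers(status, headers, exc_info=None):
            headers = list(headers) + [
                ("Access-Control-Allow-Origin", allowed_origin),
                ("X-Content-Type-Options", "nosniff"),
                ("X-Frame-Options", "DENY"),
            ]
            return start_response(status, headers, exc_info)
        return app(environ, start_with_headers)
    return wrapped
```

In this sketch, the frontend origin (`http://localhost:8080`) is whitelisted for cross-origin requests while clickjacking and MIME-sniffing protections are applied globally.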
```bash
# Check Python version
python --version  # Should be 3.11+

# Install dependencies
pip install -r backend/requirements.txt

# Check environment variables
echo $OPENAI_API_KEY
```

```bash
# Clear browser cache
Ctrl + Shift + R (Windows/Linux)
Cmd + Shift + R (Mac)

# Use cache-busting server
cd frontend && python server.py
```

- File Size: Ensure files are under 50MB
- File Format: Only CSV, Excel (.xlsx, .xls) supported
- Browser Compatibility: Use modern browsers (Chrome, Firefox, Safari, Edge)
- API Key: Verify OpenAI API key is set correctly
- Network: Check internet connection for API calls
- Fallback: System will use rule-based matching if AI fails
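The rule-based fallback can be thought of as fuzzy matching on column names; a minimal illustrative sketch using `difflib` (the real matcher's rules are not documented in this README):

```python
from difflib import SequenceMatcher

def fallback_match(source_cols, target_cols, threshold=0.6):
    """Map each source column to its most similar target column."""
    mapping = {}
    for src in source_cols:
        best, best_score = None, 0.0
        for tgt in target_cols:
            # Compare case-insensitively; ratio() is in [0.0, 1.0]
            score = SequenceMatcher(None, src.lower(), tgt.lower()).ratio()
            if score > best_score:
                best, best_score = tgt, score
        if best_score >= threshold:  # skip columns with no plausible match
            mapping[src] = best
    return mapping
```

For example, `fallback_match(["Cust_ID"], ["customer_id", "account_balance"])` maps `Cust_ID` to `customer_id`.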
```bash
# Enable debug mode
export FLASK_DEBUG=True

# Check logs
tail -f backend/logs/app.log
```

We welcome contributions! Please follow these steps:
1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Make your changes: Follow the existing code style
4. Add tests: Ensure your changes work correctly
5. Commit changes: `git commit -m 'Add amazing feature'`
6. Push to branch: `git push origin feature/amazing-feature`
7. Open a Pull Request: Describe your changes clearly
- Code Comments: Add comprehensive comments for new code
- Error Handling: Include proper error handling and user feedback
- Testing: Test your changes thoroughly
- Documentation: Update documentation for new features
This project is licensed under the ISC License - see the LICENSE file for details.
- GitHub Issues: Create an issue
- Documentation: Check this README and inline code comments
- Email: info@bridgette.com
- Phone: (647) 390 4658
- Address: 145 Columbia St W, Waterloo, ON N2L 3J5
- Email: info@bridgette.com
- Website: bridgette.com
- OpenAI: For providing intelligent schema matching capabilities
- Flask Community: For the excellent web framework
- Pandas Team: For powerful data processing capabilities
- Font Awesome: For beautiful icons
- Contributors: All developers who have contributed to this project
Bridgette - Bridging the gap between traditional banking and modern digital solutions
Made with ❤️ for the financial community
