
Production Setup Guide

Overview

This guide walks you through setting up the production-scale BlueDot Trading System for processing 1000+ JSON files automatically.

📋 Prerequisites

Required Accounts

  • GitHub Account (for automation and hosting)
  • Google Account (for Google Drive storage)
  • TradingView Account (for Pine Script integration)
  • Slack/Discord (optional, for notifications)

Required Skills

  • Basic command line usage
  • Basic Git operations
  • Understanding of JSON/CSV formats

🔧 Step-by-Step Setup

1. Repository Setup

Fork the Repository

# 1. Go to GitHub and fork this repository
# 2. Clone your fork locally
git clone https://github.com/YOUR_USERNAME/BlueDot-Trading-System.git
cd BlueDot-Trading-System

# 3. Enable GitHub Actions in your repository settings
# 4. Enable GitHub Pages (Settings → Pages → Source: Deploy from a branch → gh-pages)

2. Google Drive Setup

Create Folder Structure

📁 BlueDot-Trading-Data/
├── 📅 daily/
│   ├── 2024-08-01/
│   │   ├── AAPL_daily.json
│   │   ├── MSFT_daily.json
│   │   └── ... (1000+ JSON files)
│   ├── 2024-08-02/
│   └── latest/ → 2024-08-02/
├── 📊 weekly/
│   ├── 2024-W31/
│   │   ├── AAPL_weekly.json
│   │   ├── MSFT_weekly.json
│   │   └── ... (1000+ JSON files)
│   └── latest/ → 2024-W31/
└── 🔄 triggers/
    ├── daily_ready.txt      # Contains: "2024-08-02"
    ├── weekly_ready.txt     # Contains: "2024-W31"
    └── status_log.txt
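If you want to prototype or test the pipeline locally before wiring up Google Drive, the layout above can be mirrored on disk with a short Python sketch. The folder names and trigger-file contents follow the structure shown; the function name is illustrative, not part of the repository.

```python
from pathlib import Path

def create_layout(root: str, day: str, week: str) -> Path:
    """Create a local mirror of the BlueDot-Trading-Data layout (illustrative helper)."""
    base = Path(root) / "BlueDot-Trading-Data"
    for sub in (f"daily/{day}", f"weekly/{week}", "triggers"):
        (base / sub).mkdir(parents=True, exist_ok=True)
    # Trigger files tell the pipeline which snapshot is ready to process.
    (base / "triggers" / "daily_ready.txt").write_text(day)
    (base / "triggers" / "weekly_ready.txt").write_text(week)
    return base
```

Drop a handful of sample JSONs into the generated `daily/` folder and you can exercise the processing scripts without touching the Drive API.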

Create Service Account

# 1. Go to Google Cloud Console
# 2. Enable Google Drive API
# 3. Create Service Account:
#    - Name: bluedot-drive-access
#    - Download JSON key file
# 4. Share your Google Drive folders with the service account email

Get Folder IDs

# 1. Open your Google Drive folder in browser
# 2. Copy the folder ID from URL:
#    https://drive.google.com/drive/folders/[FOLDER_ID_HERE]
# 3. Note down folder IDs for daily and weekly folders
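Copying the ID out of the URL by hand is error-prone; a small sketch like the following extracts it programmatically. The helper name is hypothetical, but the URL shape matches the pattern shown above.

```python
import re

def extract_folder_id(url: str) -> str:
    """Pull the folder ID out of a Google Drive folder URL."""
    match = re.search(r"/folders/([A-Za-z0-9_-]+)", url)
    if not match:
        raise ValueError(f"No folder ID found in: {url}")
    return match.group(1)
```

Drive folder IDs contain only letters, digits, underscores, and hyphens, so the character class above stops cleanly at any trailing query string (e.g. `?usp=sharing`).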

3. GitHub Secrets Configuration

Required Secrets

Go to your GitHub repository → Settings → Secrets and variables → Actions

# Google Drive Integration
GOOGLE_DRIVE_SERVICE_ACCOUNT  # Base64-encoded service account JSON
DAILY_FOLDER_ID               # Google Drive folder ID for daily data
WEEKLY_FOLDER_ID              # Google Drive folder ID for weekly data

# Notifications (Optional)
SLACK_WEBHOOK_URL             # Slack webhook for notifications
DISCORD_WEBHOOK_URL           # Discord webhook for notifications

# TradingView
TRADINGVIEW_NAMESPACE         # Your TradingView namespace for seed data

Encode Service Account JSON

# Base64 encode your service account JSON file and copy it to the clipboard:
base64 -i google-drive-service-account.json | pbcopy                    # macOS
base64 google-drive-service-account.json | xclip -selection clipboard   # Linux
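A malformed secret is a common cause of silent pipeline failures, so it is worth sanity-checking the encoded string before pasting it into GitHub. This sketch (hypothetical helper, not part of the repo) verifies the string decodes back to a service-account JSON:

```python
import base64
import json

def verify_encoded_secret(encoded: str) -> bool:
    """Return True if a Base64 string decodes to a valid service-account JSON."""
    try:
        data = json.loads(base64.b64decode(encoded))
    except ValueError:
        # Covers both bad Base64 (binascii.Error) and bad JSON (JSONDecodeError).
        return False
    # Google service-account key files always carry these fields.
    return isinstance(data, dict) and data.get("type") == "service_account" and "private_key" in data
```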

Setting Up Webhook Notifications (Optional)

For Slack:

# 1. Go to your Slack workspace
# 2. Create new app: https://api.slack.com/apps
# 3. Enable "Incoming Webhooks"
# 4. Add webhook to workspace
# 5. Copy webhook URL to GitHub secrets

For Discord:

# 1. Go to your Discord server
# 2. Server Settings → Integrations → Webhooks
# 3. Create New Webhook
# 4. Copy webhook URL to GitHub secrets

4. Testing the Setup

Test with Sample Data

# 1. Upload sample JSONs to Google Drive test folder
# 2. Trigger GitHub Action manually:
#    GitHub Repository → Actions → Daily Data Pipeline → Run workflow

# 3. Check the logs for processing status
# 4. Verify CSV files are generated in GitHub Pages

Verify GitHub Pages Output

# Check if your CSVs are accessible at:
# https://YOUR_USERNAME.github.io/BlueDot-Trading-System/data/daily/AAPL_BLUE_DOTS.csv
# https://YOUR_USERNAME.github.io/BlueDot-Trading-System/data/weekly/AAPL_BLUE_DOTS.csv
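When scripting the verification for many symbols, the URL pattern above can be generated rather than typed. A minimal sketch, assuming the `data/{timeframe}/{SYMBOL}_BLUE_DOTS.csv` layout shown:

```python
def pages_csv_url(username: str, timeframe: str, symbol: str) -> str:
    """Build the GitHub Pages URL for a processed CSV, following the layout above."""
    if timeframe not in ("daily", "weekly"):
        raise ValueError("timeframe must be 'daily' or 'weekly'")
    return (f"https://{username}.github.io/BlueDot-Trading-System"
            f"/data/{timeframe}/{symbol}_BLUE_DOTS.csv")
```

Feed the generated URLs to `curl -I` (or any HTTP client) in a loop to confirm every expected CSV returns `200 OK` after deployment.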

📊 Daily Operations Workflow

Morning Routine (Automated)

# 1. Upload your JSON files to Google Drive:
#    📁 daily/2024-11-12/
#       ├── AAPL.json
#       ├── MSFT.json
#       └── ... (more files)
#    
# 2. GitHub Actions runs automatically at 9 AM UTC
# 3. Or trigger manually from GitHub Actions tab
# 4. Receive Slack/Discord notification when complete
# 5. Use processed data in TradingView immediately

Weekly Routine (Automated)

# 1. Upload weekly JSON files to:
#    📁 weekly/2024-W46/
#       ├── AAPL.json
#       ├── MSFT.json
#       └── ... (more files)
#    
# 2. GitHub Actions runs every Sunday at 10 AM UTC
# 3. Or trigger manually with week number (e.g., 2024-W46)
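The two schedules above map to standard GitHub Actions cron triggers. The fragment below is illustrative (workflow file names and input names are assumptions, not taken from the repo); it shows the daily 9 AM UTC run, the Sunday 10 AM UTC run, and the manual `workflow_dispatch` entry points:

```yaml
# .github/workflows/daily-pipeline.yml (illustrative)
on:
  schedule:
    - cron: '0 9 * * *'     # every day at 09:00 UTC
  workflow_dispatch:         # enables manual runs from the Actions tab
---
# .github/workflows/weekly-pipeline.yml (illustrative)
on:
  schedule:
    - cron: '0 10 * * 0'    # every Sunday at 10:00 UTC
  workflow_dispatch:
    inputs:
      week:
        description: 'ISO week to process (e.g. 2024-W46)'
        required: false
```

Note that GitHub Actions cron times are always UTC, so adjust for your local upload schedule accordingly.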

🎯 TradingView Integration

Use GitHub Pages Data Directly

# Your data is automatically available via GitHub Pages
# No manual upload needed - TradingView can access it directly:

blue_dot = request.seed('your_namespace_daily_AAPL', 'BLUE_DOTS', close)
rlst = request.seed('your_namespace_daily_AAPL', 'RLST_RATING', close)

Pine Script Template

//@version=5
strategy("BlueDot Automated Strategy", overlay=true)

// Input for stock selection
symbol_input = input.string("AAPL", "Stock Symbol")
timeframe_input = input.string("daily", "Timeframe", options=["daily", "weekly"])

// Dynamic data access
namespace = "your_namespace_" + timeframe_input + "_" + symbol_input
blue_dot = request.seed(namespace, 'BLUE_DOTS', close)
rlst = request.seed(namespace, 'RLST_RATING', close)
bc = request.seed(namespace, 'BC_INDICATOR', close)

// Strategy logic
buy_signal = blue_dot == 1 and rlst > 80 and bc > 25000
sell_signal = rlst < 30 or bc < bc[5]

if buy_signal
    strategy.entry("Long", strategy.long)
if sell_signal
    strategy.close("Long")

🔍 Monitoring & Troubleshooting

Check Processing Status

# 1. GitHub Repository → Actions → View workflow runs
# 2. Check processing logs for errors
# 3. Verify notification delivery
# 4. Check GitHub Pages deployment

Common Issues

"No files found"

# Check:
# 1. Google Drive folder structure is correct
# 2. Service account has access (folders shared with its email address)
# 3. Folder IDs are correct in GitHub secrets

"Processing failed"

# Check:
# 1. JSON file format matches expected structure
# 2. All required fields (rlst, bc, blueDotData) are present
# 3. GitHub Actions runtime limits (6 hours max)
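Checking the required fields locally before uploading catches most "Processing failed" runs early. A minimal validator sketch, assuming the `rlst` / `bc` / `blueDotData` fields named above (the exact schema beyond those keys is not specified here):

```python
import json

REQUIRED_FIELDS = ("rlst", "bc", "blueDotData")  # fields the pipeline expects

def validate_file(path: str) -> list:
    """Return a list of problems found in one input JSON (empty list = OK)."""
    try:
        with open(path) as fh:
            data = json.load(fh)
    except json.JSONDecodeError as exc:
        return [f"invalid JSON: {exc}"]
    if not isinstance(data, dict):
        return ["top-level value is not an object"]
    return [f"missing field: {field}" for field in REQUIRED_FIELDS if field not in data]
```

Run it over a whole upload folder (`for p in Path("daily/2024-11-12").glob("*.json"): ...`) and fix any reported files before triggering the workflow.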

"TradingView can't access data"

# Check:
# 1. GitHub Pages is enabled and deployed
# 2. CSV files are in correct format
# 3. request.seed() namespace is correct

Performance Optimization

For Large Datasets (2000+ files)

# 1. Split into multiple batches
# 2. Increase parallel workers in config
# 3. Consider multiple repositories for different timeframes

Storage Management

# 1. Monitor GitHub repository size (2GB limit)
# 2. Archive old data periodically
# 3. Use compression for large CSV files
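For step 3, gzip typically shrinks repetitive CSV data severalfold. A sketch of an archival helper (hypothetical function, stdlib only):

```python
import gzip
import shutil
from pathlib import Path

def compress_csv(path: str) -> Path:
    """Gzip a CSV in place, returning the .gz path (original file is removed)."""
    src = Path(path)
    dst = src.with_suffix(src.suffix + ".gz")
    with open(src, "rb") as f_in, gzip.open(dst, "wb") as f_out:
        shutil.copyfileobj(f_in, f_out)
    src.unlink()
    return dst
```

Only compress archived data: CSVs that TradingView reads via `request.seed()` must remain plain text at their published paths.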

🚨 Backup & Recovery

Data Backup Strategy

# 1. Keep original JSONs in Google Drive (permanent storage)
# 2. GitHub provides version control for processed CSVs
# 3. Export critical data periodically for offline backup

Disaster Recovery

# 1. Fork/clone repository to new account if needed
# 2. Re-upload JSONs to new Google Drive folder
# 3. Update GitHub secrets with new credentials
# 4. Trigger fresh processing run

📈 Scaling Considerations

Current Limits

  • GitHub Actions: 2000 minutes/month free, 6 hours per run
  • GitHub Storage: 2GB per repository
  • GitHub Pages: 100GB bandwidth/month
  • Google Drive API: 1 billion requests/day free

Scaling Options

  1. Multiple Repositories: Split by timeframe or market
  2. Cloud Functions: For unlimited processing time
  3. Self-Hosted Runners: For custom hardware requirements
  4. CDN Integration: For faster CSV access globally

🎓 Advanced Configuration

Custom Processing Rules

# Edit config/data_config.yaml
batch_processing:
  max_files_per_batch: 200      # Increase for more files per batch
  parallel_workers: 8           # More workers for faster processing
  
signals:
  bc_indicator:
    typical_range: [20000, 30000]  # Adjust based on your data

Custom Notifications

# Edit src/monitoring/notification_system.py
# Add custom notification channels
# Customize message formats
# Add health monitoring alerts

Next Steps:

  1. Complete this setup guide
  2. Test with small dataset first
  3. Scale up to full 1000+ files
  4. Integrate with your trading strategies
  5. Monitor and optimize performance