tonglam/letletme_data

Overall

Letletme_data is a robust data service that fetches data from the Fantasy Premier League (FPL) servers, cleans and transforms it, and stores it in PostgreSQL and Redis, exposing RESTful APIs for querying the processed data. The service is built in TypeScript using functional programming principles, ensuring type safety and maintainability.

Key Features:

  • Real-time FPL data fetching and transformation
  • Clean, structured data through RESTful APIs
  • Efficient data persistence with PostgreSQL
  • Performance optimization using Redis caching
  • Type-safe implementation with strict TypeScript
  • Docker containerization for easy deployment

Architecture & Workflow

System Architecture

graph TB
    FPL[FPL API] --> |Raw Data| Fetcher[Data Fetcher]
    Fetcher --> |JSON| Transform[Data Transformer]
    Transform --> |Validated Data| Storage[Data Storage]
    Storage --> |Write| DB[(PostgreSQL)]
    Storage --> |Cache| Cache[(Redis)]
    API[REST API] --> |Read| Cache
    API --> |Read/Write| DB
    Cron[Cron Jobs] --> |Trigger| Fetcher

    subgraph Data Processing
        Fetcher
        Transform
        Storage
    end

    subgraph Data Access
        API
    end

    subgraph Scheduling
        Cron
    end

Data Flow

sequenceDiagram
    participant C as Cron Job
    participant F as Fetcher
    participant FPL as FPL API
    participant T as Transformer
    participant P as PostgreSQL
    participant R as Redis
    participant A as API

    C->>F: Trigger data fetch
    F->>FPL: Request data
    FPL-->>F: Return raw data
    F->>T: Pass raw JSON
    T->>T: Validate & transform
    T->>P: Store processed data
    T->>R: Cache frequently accessed data

    Note over A,R: API Data Access Flow
    A->>R: Check cache
    alt Cache hit
        R-->>A: Return cached data
    else Cache miss
        A->>P: Query database
        P-->>A: Return data
        A->>R: Update cache
    end
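The cache-aside read path in the diagram above can be sketched as follows. This is a minimal illustration, not the project's actual code: an in-memory Map stands in for ioredis, queryDb is a stub for a Drizzle query, and all names are illustrative.

```typescript
// Cache-aside read path: check the cache first, fall back to the
// database on a miss, then populate the cache for subsequent reads.
// A Map stands in for Redis here; queryDb is a stub for PostgreSQL.
type Player = { id: number; name: string };

const cache = new Map<string, string>();

async function queryDb(id: number): Promise<Player> {
  // Stand-in for a real Drizzle ORM query against PostgreSQL.
  return { id, name: `player-${id}` };
}

async function getPlayer(id: number): Promise<Player> {
  const key = `player:${id}`;
  const hit = cache.get(key);
  if (hit !== undefined) {
    return JSON.parse(hit) as Player; // cache hit: serve from Redis
  }
  const player = await queryDb(id); // cache miss: query the database
  cache.set(key, JSON.stringify(player)); // update cache for next read
  return player;
}
```

With ioredis the Map calls would become `redis.get(key)` / `redis.set(key, value)` (typically with a TTL), but the control flow is the same.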

Data Processing Pipeline

flowchart LR
    A[Raw FPL Data] --> B[Validation Layer]
    B --> C[Transform Layer]
    C --> D[Business Logic]
    D --> E[Storage Layer]

    subgraph Validation
        B --> |zod| B1[Schema Validation]
    end

    subgraph Transform
        C --> |fp-ts| C1[Data Mapping]
        C --> |Either| C2[Error Handling]
    end

    subgraph Storage
        E --> E1[PostgreSQL]
        E --> E2[Redis Cache]
    end
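The validate-then-transform shape with Either-based error handling shown above can be sketched like this. To keep the snippet dependency-free it hand-rolls a tiny Either; the project itself uses zod for schema validation and fp-ts for the Either type, and the payload shape here is an assumption.

```typescript
// Validation layer + transform layer with errors carried in Either,
// mirroring the zod + fp-ts pattern; types and fields are illustrative.
type Either<E, A> = { _tag: "Left"; left: E } | { _tag: "Right"; right: A };
const left = <E, A>(e: E): Either<E, A> => ({ _tag: "Left", left: e });
const right = <E, A>(a: A): Either<E, A> => ({ _tag: "Right", right: a });

type RawTeam = { id?: unknown; name?: unknown };
type Team = { id: number; name: string };

// Validation layer: reject malformed raw FPL payloads instead of throwing.
function validateTeam(raw: RawTeam): Either<string, Team> {
  if (typeof raw.id !== "number" || typeof raw.name !== "string") {
    return left("invalid team payload");
  }
  return right({ id: raw.id, name: raw.name });
}

// Transform layer: a pure mapping applied only on the Right channel.
function mapE<E, A, B>(e: Either<E, A>, f: (a: A) => B): Either<E, B> {
  return e._tag === "Right" ? right(f(e.right)) : e;
}

const result = mapE(validateTeam({ id: 1, name: "Arsenal" }), (t) => ({
  ...t,
  slug: t.name.toLowerCase(),
}));
```

Invalid input short-circuits as a Left, so the storage layer only ever sees validated, transformed data.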

Tech Stack

Core:

  • TypeScript (with strict type checking)
  • Node.js (v18+)
  • Bun (runtime & package manager)
  • ElysiaJS (REST API framework)

Storage:

  • PostgreSQL (primary database)
  • Redis (caching layer, via ioredis)
  • Drizzle ORM (type-safe ORM)

Testing & Quality:

  • Bun Test Runner (Jest-compatible)
  • ESLint & Prettier (code quality tools)

Utilities:

  • Pino (structured logging)
  • Zod (runtime type validation)
  • fp-ts (functional programming utilities)

DevOps:

  • Docker (containerization)
  • Docker Compose (multi-container orchestration)

Functional Programming

This project is designed using functional programming principles, making it particularly well-suited for data transformation workflows. The FP approach offers several benefits:

  1. Data Flow:

    • Clear, unidirectional data flow
    • Immutable data transformations
    • Pure functions for predictable results
    • Extensive use of fp-ts for functional patterns
  2. Type Safety:

    • Advanced TypeScript generics
    • No 'any' types allowed
    • Strong type inference
    • Runtime type validation with Zod
  3. Benefits:

    • Highly testable code
    • Easy to maintain and extend
    • Reduced side effects
    • Better error handling through Either and Option types
    • Composable functions

While the learning curve with TypeScript generics and FP patterns can be steep, especially coming from an OOP background, the resulting codebase is more maintainable, predictable, and elegant.
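The pure, composable style described above can be illustrated with a small pipeline. The `pipe` helper below mirrors the fp-ts function of the same name (hand-rolled here so the snippet stands alone), and the data and transformations are purely illustrative.

```typescript
// Unidirectional, immutable data flow as a pipeline of pure functions.
// `pipe` mimics fp-ts's pipe; the player values are illustrative.
function pipe<A, B, C>(a: A, ab: (a: A) => B, bc: (b: B) => C): C {
  return bc(ab(a));
}

type PlayerPoints = Readonly<{ name: string; points: number }>;

// Each step returns a new value; nothing is mutated in place.
const addBonus = (p: PlayerPoints): PlayerPoints => ({
  ...p,
  points: p.points + 2,
});
const label = (p: PlayerPoints): string => `${p.name}: ${p.points}`;

const summary = pipe({ name: "Saka", points: 8 }, addBonus, label);
// summary === "Saka: 10"
```

Because each step is pure, every stage can be unit-tested in isolation and recombined freely.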

Getting Started

  1. Prerequisites:

    Node.js v18+
    Bun v1+
    Docker & Docker Compose
    PostgreSQL
    Redis
  2. Installation:

    git clone [repository-url]
    cd letletme_data
    bun install
  3. Configuration:

    cp .env.example .env
    # Edit .env with your settings
  4. Run:

    docker-compose up -d
    bun run dev

Domain-Driven Design

The project follows DDD principles with clear domain boundaries and type-safe implementations.

graph TB
    subgraph Core Domain
        Event[Event Domain]
        Player[Player Domain]
        Team[Team Domain]
        Entry[Entry Domain]
        League[League Domain]
    end

    subgraph Supporting Domains
        Scout[Scout Domain]
        Stats[Statistics Domain]
        Live[Live Domain]
    end

    Event --> Stats
    Player --> Stats
    Team --> Stats
    Entry --> Stats
    League --> Stats

    Stats --> Live
    Scout --> Live

Each domain follows a standard structure with entities, repositories, services, and types, ensuring clear separation of concerns and maintainable code. For detailed design documentation, please refer to the design docs.
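One domain slice (entity, repository, service) might look like the sketch below. The `FplEvent` fields and method names are assumptions for illustration, not the project's actual API; the in-memory adapter stands in for the Drizzle-backed repository used in production.

```typescript
// A domain slice in the DDD layout: entity type, repository port,
// and a service that depends only on the interface. All names and
// fields here are illustrative assumptions.
type FplEvent = Readonly<{ id: number; name: string; finished: boolean }>;

interface EventRepository {
  findById(id: number): Promise<FplEvent | null>;
  upsert(event: FplEvent): Promise<void>;
}

// In-memory adapter, handy for unit tests; production would use Drizzle.
class InMemoryEventRepository implements EventRepository {
  private events = new Map<number, FplEvent>();
  async findById(id: number): Promise<FplEvent | null> {
    return this.events.get(id) ?? null;
  }
  async upsert(event: FplEvent): Promise<void> {
    this.events.set(event.id, event);
  }
}

// Service layer: business logic against the port, not the adapter.
async function finishEvent(
  repo: EventRepository,
  id: number,
): Promise<FplEvent | null> {
  const event = await repo.findById(id);
  if (event === null) return null;
  const updated = { ...event, finished: true };
  await repo.upsert(updated);
  return updated;
}
```

Keeping services coupled to the interface rather than a concrete store is what lets each domain swap persistence (PostgreSQL, Redis, in-memory) without touching business logic.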

Deployment

  • The production stack runs inside Docker containers orchestrated by docker compose; copy .env.deploy.example to .env.deploy, then run scripts/deploy.sh to build images, start services, and execute database migrations locally or on the VPS.
  • Continuous delivery is handled via GitHub Actions (.github/workflows/deploy.yml), which builds a container image, pushes it to GHCR, and refreshes the VPS stack using the same compose file.
  • Refer to DEPLOYMENT.md and docs/deployment-plan.md for the full checklist, required GitHub secrets, and the legacy manual instructions kept for break-glass scenarios.

Testing

  • Run bun test tests/unit for the fast, deterministic suite that validates transformers, repositories, and utilities (this is what the CI workflow executes).
  • Run bun test tests/integration locally before production releases; these tests require external services (PostgreSQL, Redis, Bull queues, live FPL responses) and are intentionally skipped in CI/CD.

About

A data analytics tool for Fantasy Premier League (FPL) managers, providing statistical insights, player performance trends, and team optimization through TypeScript and data visualization.
