
🚚 Supply Chain Control Tower & Automated Auditing

SQL · Data Engineering · Business Intelligence

📋 Executive Summary

This project implements an Enterprise-Grade Logistics Database designed to manage high-volume supply chain operations with strict governance.

Moving beyond basic data retrieval, this architecture enforces ACID compliance for data integrity and features an Event-Driven Architecture using Triggers for automated security auditing. It serves as the "Business Logic Layer" (Layer 3) of my portfolio, storing historical data and generating insights from the alerts produced by the Digital Twin AI (Python).


🖥️ Analytics Dashboard Preview

(Real output showing Driver KPIs and Route Risk Classification)

🏗️ Technical Architecture

The database is architected to simulate a production environment for a logistics firm, focusing on Automation and Security:

  • 🛡️ Automated "Black Box" Auditing: Implements database TRIGGERS that act as watchdogs. Any change to a shipment status automatically generates a timestamped entry in a separate audit_logs table, ensuring 100% traceability without human intervention.
  • ⚡ Performance Tuning: Utilizes INDEXING on frequently filtered and joined columns (Status, Driver ID) to optimize query performance for large datasets.
  • 🧠 Time-Series Intelligence: Uses Advanced Window Functions (LAG, LEAD) to perform inter-row calculations, determining "Driver Idle Time" between trips directly within the database engine.
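
The idle-time calculation above can be sketched against an in-memory SQLite database (3.25+ for window functions). Table and column names here are illustrative, not the project's actual schema:

```python
import sqlite3

# Minimal sketch of the "Driver Idle Time" calculation using LAG.
# Schema and column names are illustrative placeholders.
conn = sqlite3.connect(":memory:")  # SQLite >= 3.25 required for window functions
conn.executescript("""
CREATE TABLE trips (
    trip_id   INTEGER PRIMARY KEY,
    driver_id INTEGER NOT NULL,
    start_ts  TEXT NOT NULL,
    end_ts    TEXT NOT NULL
);
-- Index on the partitioning column, per the Performance Tuning point above
CREATE INDEX idx_trips_driver ON trips (driver_id);

INSERT INTO trips (driver_id, start_ts, end_ts) VALUES
    (1, '2024-01-01 08:00', '2024-01-01 10:00'),
    (1, '2024-01-01 12:00', '2024-01-01 14:00'),
    (1, '2024-01-01 15:00', '2024-01-01 16:00');
""")

# LAG looks back one row per driver: idle time = this trip's start
# minus the previous trip's end, in minutes.
rows = conn.execute("""
SELECT trip_id,
       (strftime('%s', start_ts) -
        strftime('%s', LAG(end_ts) OVER (PARTITION BY driver_id
                                         ORDER BY start_ts))) / 60
       AS idle_minutes
FROM trips
ORDER BY trip_id
""").fetchall()
print(rows)  # [(1, None), (2, 120), (3, 60)] -- first trip has no predecessor
```

The entire inter-row calculation runs inside the database engine; the application only reads the finished result.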

💻 Key Engineering Features

  1. Event-Driven Automation (Triggers):

    CREATE TRIGGER after_shipment_update
    AFTER UPDATE OF status ON shipments
    BEGIN
        -- Automatically captures the "Before" and "After" state
        -- (column names illustrative)
        INSERT INTO audit_logs (shipment_id, old_status, new_status)
        VALUES (OLD.shipment_id, OLD.status, NEW.status);
    END;
  2. Transactional Integrity (ACID):

    BEGIN TRANSACTION;
    -- Complex multi-table operations
    COMMIT;  -- or ROLLBACK; to undo everything on failure

    Guarantees that financial and logistical data remains consistent even during system crashes.

  3. Data Quality Constraints:

    • CHECK (cargo_weight_kg > 0): Prevents physical impossibilities (negative weight).
    • FOREIGN KEY: Enforces relational integrity between Drivers and Shipments.
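
The three features above can be exercised end-to-end in a short SQLite session. All table and column names below are illustrative placeholders, not the project's actual schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # FK enforcement is off by default in SQLite
conn.executescript("""
CREATE TABLE drivers (
    driver_id INTEGER PRIMARY KEY,
    name      TEXT NOT NULL
);
CREATE TABLE shipments (
    shipment_id     INTEGER PRIMARY KEY,
    driver_id       INTEGER NOT NULL REFERENCES drivers (driver_id),
    status          TEXT NOT NULL DEFAULT 'PENDING',
    cargo_weight_kg REAL NOT NULL CHECK (cargo_weight_kg > 0)
);
CREATE TABLE audit_logs (
    log_id      INTEGER PRIMARY KEY,
    shipment_id INTEGER NOT NULL,
    old_status  TEXT,
    new_status  TEXT,
    changed_at  TEXT DEFAULT CURRENT_TIMESTAMP
);
-- 1. Event-driven automation: log every status change, no app code involved
CREATE TRIGGER after_shipment_update
AFTER UPDATE OF status ON shipments
BEGIN
    INSERT INTO audit_logs (shipment_id, old_status, new_status)
    VALUES (OLD.shipment_id, OLD.status, NEW.status);
END;

INSERT INTO drivers VALUES (1, 'Alice');
INSERT INTO shipments (driver_id, cargo_weight_kg) VALUES (1, 500.0);
""")

# The trigger fires automatically on the status change:
conn.execute("UPDATE shipments SET status = 'DELIVERED' WHERE shipment_id = 1")
conn.commit()
audit = conn.execute("SELECT old_status, new_status FROM audit_logs").fetchall()
print(audit)  # [('PENDING', 'DELIVERED')]

# 2 + 3. Transactional integrity meets data quality: the CHECK violation
# aborts the transaction, rolling back the earlier valid insert as well.
try:
    with conn:  # BEGIN ... COMMIT, or ROLLBACK on exception
        conn.execute("INSERT INTO shipments (driver_id, cargo_weight_kg) "
                     "VALUES (1, 200.0)")
        conn.execute("INSERT INTO shipments (driver_id, cargo_weight_kg) "
                     "VALUES (1, -50.0)")  # violates CHECK (cargo_weight_kg > 0)
except sqlite3.IntegrityError:
    pass
count = conn.execute("SELECT COUNT(*) FROM shipments").fetchone()[0]
print(count)  # still 1: the whole transaction was rolled back
```

Note the `PRAGMA foreign_keys = ON`: unlike PostgreSQL/MySQL, SQLite does not enforce foreign keys unless this pragma is set per connection.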

⚙️ How to Run

This script is written in broadly ANSI-compliant SQL. It targets SQLite for portability, and runs with minor dialect adjustments on PostgreSQL/MySQL.

  1. Environment: Open VS Code.
  2. Extension: Install "SQLite" (by alexcvzz).
  3. Execution:
    • Open supply_chain.sql.
    • Press Ctrl + Shift + P > Select SQLite: Run Query.
    • A virtual table view will appear with the generated reports.
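
If you prefer to run the script outside VS Code, Python's built-in sqlite3 module can execute it directly. A minimal sketch (replace the inline SQL with the contents of supply_chain.sql, e.g. `open("supply_chain.sql").read()`):

```python
import sqlite3

# Stand-in script; in practice, load supply_chain.sql from disk instead.
script = """
CREATE TABLE demo (id INTEGER PRIMARY KEY, note TEXT);
INSERT INTO demo (note) VALUES ('hello');
"""
conn = sqlite3.connect(":memory:")  # or "supply_chain.db" for a persistent file
conn.executescript(script)          # runs the whole multi-statement script
result = conn.execute("SELECT note FROM demo").fetchone()[0]
print(result)  # hello
```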

🔗 Integration Ecosystem

  • Upstream: Receives critical failure alerts from [Digital Twin AI].
  • Downstream: Feeds cleaned data into Power BI / Tableau dashboards.