A comprehensive collection of realistic, runnable Tower apps demonstrating data engineering, analytics, and AI workflows for a fictional retail and logistics company.
Orbita Supply Co. is a mid-size omnichannel retail company with:
- Global e-commerce storefront
- 150+ physical stores worldwide
- Multiple warehouses with IoT sensor networks
- Extensive supply chain and logistics operations
- Customer support handling thousands of tickets daily
The company is modernizing its data platform using:
- Tower for app orchestration and workflows
- Apache Iceberg for data lakehouse storage
- dltHub for data ingestion
- dbt/SQLMesh for transformations
- Marimo for interactive notebooks and dashboards
- LLMs for AI-powered automation
tower-demo/
├── lib/                               # Shared library code
│   ├── iceberg_utils.py               # Iceberg I/O helpers (uses Tower tables API)
│   ├── dlt_utils.py                   # dltHub ingestion helpers
│   ├── orbita_common.py               # Orbita constants and utilities
│   └── notifications.py               # Slack/email notifications
│
├── Ingestion Apps (5)
│   ├── ingest_shopify_orders/
│   ├── ingest_inventory_snapshots/
│   ├── ingest_warehouse_telemetry/
│   ├── ingest_product_catalog/
│   └── ingest_returns_rma/
│
├── Transformation Apps (4)
│   ├── run_dbt_models/
│   ├── daily_inventory_ledger/
│   ├── customer_360/
│   └── product_performance_models/
│
├── Analytics/Dashboard Apps (4)
│   ├── sales_dashboard/               # Marimo notebook
│   ├── inventory_heatmap/             # Marimo notebook
│   ├── order_funnel_analysis/         # Marimo notebook
│   └── returns_quality_insights/      # Marimo notebook
│
├── Orchestration Pipelines (2)
│   ├── daily_retail_pipeline/
│   └── warehouse_anomaly_pipeline/
│
└── Data Generation (1)
    └── regenerate_demo_data/          # Generates sample data daily
- Latest Tower CLI installed (Installation guide)
- Python 3.10+
- uv for dependency management
# Pick an app to run
cd ingest_shopify_orders
# Install dependencies
uv sync
# Run locally
tower run --local
Note: Iceberg catalog configuration is managed through Tower environments. The lib/iceberg_utils helper uses Tower's tables() API, which automatically loads catalogs defined in your Tower environment.
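For orientation, here is a minimal sketch of the kind of append helper this implies, written directly against pyiceberg; the function name, the "default" catalog name, and the create-if-missing logic are assumptions, and the real lib/iceberg_utils.py goes through Tower's tables() API instead.

```python
# Hypothetical append helper, sketched against pyiceberg rather than Tower's
# tables() API; names and error handling are illustrative only.
import pyarrow as pa
from pyiceberg.catalog import load_catalog


def append_to_table(table_name: str, batch: pa.Table) -> None:
    """Append a PyArrow batch to an Iceberg table, creating it if missing."""
    catalog = load_catalog("default")  # assumed catalog name
    namespace = table_name.split(".")[0]
    try:
        catalog.create_namespace(namespace)
    except Exception:
        pass  # namespace already exists
    try:
        table = catalog.load_table(table_name)
    except Exception:
        table = catalog.create_table(table_name, schema=batch.schema)
    table.append(batch)


if __name__ == "__main__":
    orders = pa.table({"order_id": [1, 2], "total": [19.99, 42.50]})
    append_to_table("bronze.orders", orders)
```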
# Deploy a single app
cd ingest_shopify_orders
tower deploy
# Deploy all apps (from repo root)
for app in ingest_* run_* customer_* product_* summarize_* generate_* warehouse_* daily_* regenerate_* sales_* order_* returns_* inventory_*; do
  (cd "$app" && tower deploy)   # subshell so a failed deploy can't strand the loop in the wrong directory
done
# Run an app
tower run ingest_shopify_orders
# View logs
tower logs ingest_shopify_orders
Extract data from source systems into the bronze layer (a minimal ingestion sketch follows the list):
- ingest_shopify_orders: Shopify e-commerce orders
- ingest_inventory_snapshots: Warehouse stock levels
- ingest_warehouse_telemetry: IoT sensor data
- ingest_product_catalog: Product master data
- ingest_returns_rma: Return and RMA records
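Each ingestion app follows roughly the same shape; the sketch below shows what a minimal dlt-based order ingestion could look like. The resource fields, the stubbed data, and the duckdb destination are assumptions for illustration, not the actual app configuration.

```python
# Illustrative dlt pipeline for order ingestion; field names, the stubbed rows,
# and the duckdb destination are assumptions, not the real app's configuration.
import dlt


@dlt.resource(name="orders", write_disposition="merge", primary_key="order_id")
def shopify_orders():
    # The real app would page through the Shopify API; here we yield stub records.
    yield [
        {"order_id": 1001, "customer_id": 7, "total": 59.90, "status": "fulfilled"},
        {"order_id": 1002, "customer_id": 9, "total": 12.50, "status": "pending"},
    ]


if __name__ == "__main__":
    pipeline = dlt.pipeline(
        pipeline_name="ingest_shopify_orders",
        destination="duckdb",   # the demo targets Iceberg; duckdb keeps the sketch local
        dataset_name="bronze",
    )
    info = pipeline.run(shopify_orders())
    print(info)
```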
Transform bronze data into analytics-ready tables (an aggregation sketch follows the list):
- run_dbt_models: Execute dbt transformations
- daily_inventory_ledger: Daily inventory movements
- customer_360: Customer analytics and LTV
- product_performance_models: Product quality metrics
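To give a feel for the logic these apps contain, here is a hedged polars sketch of a customer_360-style aggregation; the column names and the simple lifetime-value definition (sum of order totals) are assumptions.

```python
# Sketch of a customer_360-style aggregation in polars; column names and the
# simple LTV definition are illustrative assumptions.
import polars as pl

orders = pl.DataFrame({
    "customer_id": [7, 7, 9],
    "order_id": [1001, 1003, 1002],
    "total": [59.90, 20.00, 12.50],
    "ordered_at": ["2024-05-01", "2024-06-12", "2024-05-03"],
})

customer_360 = (
    orders
    .group_by("customer_id")
    .agg(
        pl.len().alias("order_count"),
        pl.col("total").sum().alias("lifetime_value"),
        pl.col("ordered_at").max().alias("last_order_at"),
    )
    .sort("lifetime_value", descending=True)
)
print(customer_360)
```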
Interactive dashboards and exploration (a notebook skeleton follows the list):
- sales_dashboard: Revenue and order metrics
- inventory_heatmap: Stock levels by warehouse
- order_funnel_analysis: Conversion funnel
- returns_quality_insights: Return patterns
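Each dashboard is a Marimo notebook. A minimal reactive-cell skeleton might look like the following; the hard-coded revenue frame stands in for the gold-layer reads the real notebooks perform.

```python
# Minimal marimo notebook skeleton; the hard-coded revenue numbers stand in for
# the gold-layer table reads that the real dashboards perform.
import marimo

app = marimo.App()


@app.cell
def _():
    import polars as pl

    revenue = pl.DataFrame({
        "day": ["2024-06-01", "2024-06-02"],
        "revenue": [12450.0, 13210.5],
    })
    return (revenue,)


@app.cell
def _(revenue):
    revenue  # the last expression of a cell is rendered as its output
    return


if __name__ == "__main__":
    app.run()
```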
Coordinate multiple apps into workflows (a sequencing sketch follows the list):
- daily_retail_pipeline: Full daily ETL pipeline
- warehouse_anomaly_pipeline: Real-time anomaly response
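Purely as a sketch, one way a pipeline app could sequence steps is to shell out to the same tower run command shown earlier; whether the real pipelines do this or use a Tower-native mechanism is not assumed here.

```python
# Hedged sketch of sequencing apps by shelling out to the Tower CLI; the real
# pipeline apps may use a Tower-native orchestration mechanism instead.
import subprocess

STEPS = [
    "ingest_shopify_orders",
    "ingest_product_catalog",
    "customer_360",
]

for app in STEPS:
    print(f"running {app} ...")
    subprocess.run(["tower", "run", app], check=True)  # stop the pipeline on failure
```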
Learn Tower features (a secrets-reading sketch follows the list):
- secrets_example: Secrets management
- parameterized_app: Runtime parameters
- scheduled_job: Cron scheduling
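As a rough illustration of the secrets example, the snippet below assumes (as is common, but not confirmed here) that secrets are exposed to the app as environment variables at runtime; consult the Tower docs for the authoritative mechanism.

```python
# Sketch of reading a secret inside an app, under the assumption that secrets
# are exposed as environment variables at runtime.
import os

webhook_url = os.environ.get("SLACK_WEBHOOK_URL")
if webhook_url is None:
    raise RuntimeError("SLACK_WEBHOOK_URL is not set; run `tower secrets set SLACK_WEBHOOK_URL ...`")
print("Slack webhook configured:", webhook_url[:30] + "...")
```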
┌───────────────────────────────────────────┐
│                GOLD LAYER                 │
│    Business-ready aggregates & KPIs       │
│    • customer_360                         │
│    • product_performance                  │
│    • inventory_ledger                     │
└───────────────────────────────────────────┘
                      ▲
┌───────────────────────────────────────────┐
│               SILVER LAYER                │
│    Cleaned, conformed, enriched data      │
│    • ticket_summaries                     │
│    • product_descriptions                 │
│    • anomaly_explanations                 │
└───────────────────────────────────────────┘
                      ▲
┌───────────────────────────────────────────┐
│               BRONZE LAYER                │
│    Raw ingested data from sources         │
│    • orders                               │
│    • inventory                            │
│    • warehouse_telemetry                  │
│    • products                             │
│    • returns                              │
│    • support_tickets                      │
└───────────────────────────────────────────┘
Sources → Ingestion → Bronze → Transformations → Silver/Gold → Analytics

- Sources: Shopify, APIs, IoT sensors, support systems
- Ingestion: dltHub running on Tower
- Bronze / Silver / Gold: Apache Iceberg tables
- Transformations: dbt, SQLMesh, and LLMs running on Tower
- Analytics: Marimo dashboards for business users
- Tower: App orchestration, scheduling, secrets management, catalog management
- Apache Iceberg: Open table format for data lakehouse
- dltHub: Python-first data ingestion framework
- dbt: SQL-based data transformations
- Marimo: Reactive Python notebooks
- PyArrow & Polars: Columnar data processing
- Claude: LLM for AI automation
- Slack: Notifications and alerting
tower run daily_retail_pipeline
Executes:
- Ingest orders, products, returns
- Build customer_360 and product_performance
- Generate AI sales report
- Send Slack notification
tower run warehouse_anomaly_pipeline
Executes (a detection sketch follows these steps):
- Ingest latest telemetry
- Detect anomalies
- Explain with AI
- Alert on critical issues
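As a hedged illustration of the detection step, a simple z-score check over sensor readings might look like this; the threshold, column names, and the method itself are assumptions rather than the real app's logic.

```python
# Simple z-score anomaly check over sensor readings; the 2-sigma threshold
# (chosen for this tiny sample) and column names are illustrative assumptions.
import polars as pl

telemetry = pl.DataFrame({
    "sensor_id": ["temp-01"] * 6,
    "reading": [4.1, 4.0, 4.2, 4.1, 9.8, 4.0],  # one obvious spike
})

stats = telemetry.group_by("sensor_id").agg(
    pl.col("reading").mean().alias("mean"),
    pl.col("reading").std().alias("std"),
)

anomalies = (
    telemetry.join(stats, on="sensor_id")
    .with_columns(((pl.col("reading") - pl.col("mean")) / pl.col("std")).abs().alias("z"))
    .filter(pl.col("z") > 2)
)
print(anomalies)
```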
tower run regenerate_demo_data
Executes:
- Generate 6 realistic sample data files using Faker
- Upload to S3 with public-read access
- Notify on completion
Runs automatically daily at 2 AM UTC to keep demo data fresh.
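For orientation, here is a hedged sketch of generating and uploading one such file with Faker and boto3; the bucket name, key, and file layout are hypothetical, and credential wiring is left to boto3's default chain.

```python
# Sketch of generating one sample data file with Faker and uploading it to S3.
# The bucket name, key, and columns are hypothetical; in the real app the
# TOWER_DEMO_* secrets supply credentials, here boto3's default chain is used.
import csv
import io

import boto3
from faker import Faker

fake = Faker()
Faker.seed(42)  # deterministic demo data

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(["customer_id", "name", "email", "city"])
for customer_id in range(1, 101):
    writer.writerow([customer_id, fake.name(), fake.email(), fake.city()])

s3 = boto3.client("s3")
s3.put_object(
    Bucket="orbita-demo-data",   # hypothetical bucket name
    Key="customers.csv",
    Body=buffer.getvalue().encode("utf-8"),
    ACL="public-read",           # matches the public-read access described above
)
```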
cd sales_dashboard
marimo run dashboard.py
Opens an interactive dashboard in the browser.
Set via tower secrets set:
# API Credentials
tower secrets set SHOPIFY_SHOP_NAME "orbita-supply"
tower secrets set SHOPIFY_API_KEY "your-key"
# AWS Credentials (for regenerate_demo_data)
tower secrets set TOWER_DEMO_AWS_ACCESS_KEY_ID "your-access-key"
tower secrets set TOWER_DEMO_AWS_SECRET_ACCESS_KEY "your-secret-key"
# Notifications
tower secrets set SLACK_WEBHOOK_URL "https://hooks.slack.com/..."
# LLM API (provided automatically by Tower)
# TOWER_LLM_API_KEY is injected by Tower runtime
Iceberg catalogs are configured per Tower environment (not via secrets):
# Configure catalog for your environment
# This is typically done through Tower UI or CLI
tower catalog create orbita-lakehouse \
--type polaris \
--warehouse s3://orbita-lakehouse/warehouse
# Or for Snowflake Open Catalog
tower catalog create orbita-lakehouse \
--type snowflake \
--account your-account \
--database iceberg_db
Apps automatically use the catalog defined in the default environment or the current environment.
export AWS_REGION="us-west-2"
export ENVIRONMENT="production"
export LOG_LEVEL="INFO"
- Tower Documentation: https://docs.tower.dev
- Claude Code: Compatible with Tower workflows
- Sample Data: Located in the data/ directory
Show complete data flow from source to insight:
- Run ingestion apps → show bronze tables
- Run transformations → show silver/gold tables
- Open Marimo dashboard → show live analytics
- Run LLM app → show AI automation
Demonstrate operational intelligence:
- Simulate warehouse anomaly
- Run anomaly pipeline
- Show AI explanation
- Display Slack alert
Highlight Tower features:
- Show Towerfile syntax
- Deploy app with CLI
- Run with parameters
- View logs and monitoring
This demo repository follows the Orbita Supply Co. theme. When adding apps:
- Use realistic retail/logistics scenarios
- Keep data volumes small for demos
- Document all dependencies
- Follow existing naming conventions
- Add sample data if needed
This demo repository is provided for educational and demonstration purposes.
Built with Tower • tower.dev