Live Site: https://covid19dashboards.com
Status: Historical Archive (Last updated: March 2020)
A data-driven dashboard that automatically visualizes COVID-19 statistics using Jupyter Notebooks. This project leverages the fastpages framework to convert Python notebooks directly into an interactive website, with daily automated updates via GitHub Actions.
This repository contains a collection of Jupyter Notebooks that perform Exploratory Data Analysis (EDA), statistical modeling, and visualization of COVID-19 data. Unlike traditional web apps, no front-end code is required; every chart and table you see on the live site is generated purely from Python code in the _notebooks directory.
- Automated Pipeline: A GitHub Actions workflow fetches fresh data, executes the notebooks, and rebuilds the site daily.
- Notebook-First Architecture: All site content is derived from `.ipynb` files.
- Multi-Source Data Integration: Aggregates data from Johns Hopkins University and The New York Times.
- Advanced Analytics: Includes Bayesian growth modeling, trajectory comparisons, and per-capita analysis.
The `_notebooks` directory contains the following analyses (generated dynamically):
| Notebook | Description |
|---|---|
| Overview | Global, US, and Europe-specific summary dashboards with key metrics. |
| Growth Analysis | Logarithmic growth charts and curve fitting for confirmed cases. |
| Trajectory Comparison | Compares outbreak timelines across different countries/states (aligned by "Day 100"). |
| Bayesian Modeling | Probabilistic estimates of case growth and doubling times. |
| US State Maps | Choropleth visualizations of case density and growth by US state. |
| Per Capita Stats | Normalized data showing cases and deaths per million people. |
| Death Trajectories | Comparative analysis of fatality rates over time. |
| Undercount Estimation | Statistical models estimating the true number of infections vs. reported cases. |
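The trajectory comparisons above align each region's curve to a common starting point ("Day 100" style alignment). A minimal sketch of that idea in pandas — the function name and threshold are illustrative, not taken from the actual notebooks:

```python
import pandas as pd

def align_trajectories(cases: pd.DataFrame, threshold: int = 100) -> pd.DataFrame:
    """Re-index each region's cumulative-case series to start on the first
    day its total crossed `threshold`, so outbreaks can be compared on a
    common "days since Nth case" axis."""
    aligned = {}
    for region in cases.columns:
        series = cases[region]
        passed = series[series >= threshold]
        if passed.empty:
            continue  # region has not yet crossed the threshold
        aligned[region] = passed.reset_index(drop=True)
    return pd.DataFrame(aligned)

# Toy data: cumulative confirmed cases per day for two regions.
cases = pd.DataFrame({
    "A": [10, 50, 120, 300, 700],
    "B": [1, 5, 20, 90, 150],
})
print(align_trajectories(cases))
```

Shorter series are padded with NaN, which is exactly what you want when plotting several regions on one "days since" axis.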
- Framework: fastpages (Fast.ai)
- Language: Python 3.7+ (99.9% of codebase)
- Rendering Engine: Jupyter Notebooks → HTML via `nbconvert`
- Site Generator: Jekyll (Ruby) with Minima theme
- Containerization: Docker & Docker Compose
The project ingests data directly from public repositories at runtime:
- Global Data: Johns Hopkins University CSSE
  - Files: `time_series_covid19_confirmed_global.csv`, plus the `deaths` and `recovered` variants
- US Specific Data: The New York Times COVID-19 Data
  - File: `us-states.csv`
- Metadata: Country mapping and continent classification (via external CSV).
Key helper scripts:
- `load_covid_data.py`: Handles fetching and merging the JHU and NYT datasets.
- `covid_overview.py`: Generates summary statistics and KPI cards.
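The internals of `load_covid_data.py` are not reproduced here, but a sketch of the kind of reshaping it must do: JHU publishes a wide layout (one column per date), which needs to be melted into a tidy long frame before merging and plotting. The function name below is hypothetical.

```python
import pandas as pd

def tidy_jhu(wide: pd.DataFrame) -> pd.DataFrame:
    """Melt JHU's wide time-series layout (one column per date) into a long
    frame with one (country, date, cases) row, summing provinces per country."""
    id_cols = ["Province/State", "Country/Region", "Lat", "Long"]
    long = wide.melt(id_vars=id_cols, var_name="date", value_name="cases")
    long["date"] = pd.to_datetime(long["date"])
    return (long.groupby(["Country/Region", "date"], as_index=False)["cases"]
                .sum())

# Tiny stand-in for time_series_covid19_confirmed_global.csv
raw = pd.DataFrame({
    "Province/State": [None, None],
    "Country/Region": ["Italy", "Spain"],
    "Lat": [41.9, 40.4], "Long": [12.6, -3.7],
    "3/1/20": [1694, 84], "3/2/20": [2036, 120],
})
print(tidy_jhu(raw))
```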
Since this project relies on a specific Docker environment to mimic the GitHub Actions workflow, local development requires Docker.
- Docker & Docker Compose installed
- Git
- Clone the Repository

  ```shell
  git clone https://github.com/wissamismail/covid19-dashboard.git
  cd covid19-dashboard
  ```
- Start the Development Server. This command builds the Docker images, converts the notebooks, and starts the Jekyll server.

  ```shell
  make server
  # OR explicitly with docker-compose
  docker-compose up
  ```
- View the Site. Open your browser to http://localhost:4000.
- Hot Reloading. The `watcher` service detects changes to `.ipynb` files: when you save a notebook in the `_notebooks` folder, the container automatically re-converts it and refreshes the page.
If you only want to run the data analysis without the website:
```shell
pip install pandas numpy matplotlib seaborn plotly jupyter
jupyter lab _notebooks
```

Note: You may need to install additional dependencies listed in the notebook headers.
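Several of the notebooks estimate growth rates and doubling times. As a taste of what runs standalone, here is a simple log-linear version of a doubling-time estimate (a plain least-squares fit, not the Bayesian model used in the notebooks):

```python
import numpy as np

def doubling_time(cases: np.ndarray) -> float:
    """Estimate the doubling time (in days) of a cumulative-case series by
    fitting a straight line to log(cases): exponential growth is linear in
    log space, and the slope is the daily log-growth rate."""
    y = np.log(cases)
    x = np.arange(len(cases))
    slope, _ = np.polyfit(x, y, 1)
    return np.log(2) / slope

# Cases that double every day have a doubling time of ~1.0 days.
print(doubling_time(np.array([100, 200, 400, 800])))
```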
```
.
├── _notebooks/            # SOURCE OF TRUTH: All dashboards and logic live here
│   ├── load_covid_data.py     # Data ingestion utilities
│   ├── 2020-03-XX-*.ipynb     # Daily reports and specific analyses
│   └── my_icons/              # Custom assets for plots
├── _pages/                # Static informational pages (About, Contributing)
├── _action_files/         # GitHub Actions scripts for CI/CD automation
├── assets/                # Images and static resources
├── docker-compose.yml     # Local development environment config
├── Makefile               # Shortcuts for Docker commands
├── _config.yml            # Jekyll/fastpages site configuration
└── Gemfile                # Ruby dependencies for Jekyll
```
The site updates automatically via GitHub Actions:
- Trigger: Scheduled daily, or on push to `master`.
- Fetch: Python scripts pull the latest CSVs from JHU and NYT.
- Execute: All notebooks in `_notebooks` are executed in order.
- Convert: Outputs (HTML/charts) are embedded into the site.
- Deploy: The updated site is pushed to GitHub Pages.
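The real pipeline lives in `_action_files/`; a rough local approximation of the Execute step, using the standard `jupyter nbconvert --execute --inplace` CLI (the helper functions here are hypothetical):

```python
import subprocess
from pathlib import Path

def notebook_order(names):
    """fastpages notebooks are named YYYY-MM-DD-Title.ipynb, so plain
    lexicographic order is also chronological order."""
    return sorted(names)

def rebuild_notebooks(notebook_dir="_notebooks"):
    """Re-execute every notebook in place, roughly what the daily job does
    before Jekyll rebuilds the site."""
    names = notebook_order(p.name for p in Path(notebook_dir).glob("*.ipynb"))
    for name in names:
        subprocess.run(
            ["jupyter", "nbconvert", "--to", "notebook",
             "--execute", "--inplace", str(Path(notebook_dir) / name)],
            check=True,
        )
```

Because the date prefix drives ordering, no extra scheduling metadata is needed: sorting filenames is enough.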
To add a new visualization:
- Create a new Jupyter Notebook in `_notebooks/`.
- Name it strictly as `YYYY-MM-DD-Title-Of-Analysis.ipynb`.
- Add the following YAML front-matter to the top of the first cell (as raw markdown):

  ```yaml
  ---
  title: "My New Analysis"
  description: "Brief description of the chart"
  image: "thumbnail.png"     # Optional
  search_exclude: true       # Set to false if you want it searchable
  ---
  ```
- Commit and push. GitHub Actions will handle the rest.
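The strict naming rule matters because fastpages skips notebooks without the date prefix. A quick sanity check before pushing — the regex below is an approximation of the expected pattern, not fastpages' exact matcher:

```python
import re

# YYYY-MM-DD-Title-Of-Analysis.ipynb (approximate pattern)
NOTEBOOK_NAME = re.compile(r"^\d{4}-\d{2}-\d{2}-[\w-]+\.ipynb$")

def valid_notebook_name(name: str) -> bool:
    """Return True if the filename follows the date-prefixed convention."""
    return bool(NOTEBOOK_NAME.match(name))

print(valid_notebook_name("2020-03-21-My-New-Analysis.ipynb"))  # True
print(valid_notebook_name("my-analysis.ipynb"))                 # False
```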
Distributed under the Apache 2.0 License. See LICENSE for details.
- Fastpages Team: For the revolutionary notebook-to-blog framework.
- Data Providers: Johns Hopkins University CSSE and The New York Times for open data access.
- Open Source Contributors: Everyone who contributed notebooks to this historical archive.
Note: This repository serves as a historical snapshot of the early pandemic response (March 2020). Data sources may no longer be active or updated.