# DeepGreen AI 🌱

This repository contains the replication package for the paper:

> **Deep Green AI: Energy Efficiency of Deep Learning across Programming Languages and Frameworks**
> Leonardo Pampaloni, Marco Pagliocca, Enrico Vicario, Roberto Verdecchia
> University of Florence, Italy

📄 Preprint PDF


## 📌 Overview

DeepGreen AI is an empirical study investigating how programming languages and frameworks influence the energy efficiency of deep learning (DL) workloads.
We benchmarked two canonical CNN architectures (ResNet-18 and VGG-16) across six programming languages (Python, C++, Java, R, MATLAB, Rust), multiple frameworks (PyTorch, TensorFlow, JAX, LibTorch, Burn, Deeplearning4j, etc.), and three datasets of increasing complexity (Fashion-MNIST, CIFAR-100, Tiny ImageNet).

Experiments were executed on a dedicated NVIDIA L40S GPU server, with energy usage measured via the CodeCarbon toolkit.


## 🔬 Research Questions

- **RQ1.1:** How does programming language choice affect the energy efficiency of DL training?
- **RQ1.2:** How does programming language choice affect the energy efficiency of DL inference?

## 🔦 Highlights

- Machine-code-compiled languages (Rust, C++) are consistently more energy-efficient during training.
- Mature Python frameworks (PyTorch) achieve competitive efficiency despite interpretation overhead.
- High-level stacks (TensorFlow wrappers, Java, R) incur substantial overhead when they cannot fully exploit the available hardware.
- Training and inference efficiency diverge: C++ and PyTorch lead at inference, while Rust leads at training.
- Faster ≠ greener: execution time is not a reliable proxy for energy usage.

## 📂 Repository Structure

```
Java/deepgreen-dl4j/    # Java implementations (Deeplearning4j)
cpp/                    # C++ implementations (LibTorch)
julia/                  # Julia implementations (Flux, Lux)
matlab/                 # MATLAB scripts (TF/Keras wrappers)
python/                 # Python (PyTorch, TensorFlow, JAX)
R/                      # R (TensorFlow wrapper)
rust/                   # Rust (Burn)
dataloader/             # Unified data loading utilities
data/                   # Dataset links and preprocessing scripts
results/                # Experimental results (CSV, logs, figures)
README.md               # This file
```

## ⚙️ Setup

### 1. Clone the repository

```bash
git clone https://github.com/Pampaj7/DeepGreen.git
cd DeepGreen
```

### 2. Python environment

```bash
conda env create -f environment.yml
conda activate deepgreen
```

### 3. Datasets

Download the datasets (Fashion-MNIST, CIFAR-100, Tiny ImageNet) using the Python scripts provided in `dataloader/`.


## 📊 Replication Package

This replication package includes:

  1. Source code for all implementations (Python, C++, Java, R, MATLAB, Rust).
  2. Scripts for automated training and inference runs.
  3. Environment specifications for each ecosystem.
  4. Raw energy logs and aggregated CSV data.
  5. Plotting scripts to reproduce all figures and tables from the paper.

## 📖 Citation

If you use this package, please cite:

```bibtex
@article{pampaloni2025deepgreen,
  title   = {Deep Green AI: Energy Efficiency of Deep Learning across Programming Languages and Frameworks},
  author  = {Pampaloni, Leonardo and Pagliocca, Marco and Vicario, Enrico and Verdecchia, Roberto},
  journal = {Preprint},
  year    = {2025},
  doi     = {10.5281/zenodo.xxxxxxx}
}
```

## 📜 License

This project is released under the MIT License.
