This repository contains the replication package for the paper:
Deep Green AI: Energy Efficiency of Deep Learning across Programming Languages and Frameworks
Leonardo Pampaloni, Marco Pagliocca, Enrico Vicario, Roberto Verdecchia
University of Florence, Italy
DeepGreen AI is an empirical study investigating how programming languages and frameworks influence the energy efficiency of deep learning (DL) workloads.
We benchmarked two canonical CNN architectures (ResNet-18 and VGG-16) across six programming languages (Python, C++, Java, R, MATLAB, Rust), multiple frameworks (PyTorch, TensorFlow, JAX, LibTorch, Burn, Deeplearning4j, etc.), and three datasets of increasing complexity (Fashion-MNIST, CIFAR-100, Tiny ImageNet).
Experiments were executed on a dedicated NVIDIA L40S GPU server, with energy usage measured via the CodeCarbon toolkit.
- RQ1.1: How does programming language choice affect the energy efficiency of DL training?
- RQ1.2: How does programming language choice affect the energy efficiency of DL inference?
- Machine-code compiled languages (Rust, C++) are consistently more energy-efficient during training.
- Mature Python frameworks (PyTorch) achieve competitive efficiency despite interpretation overhead.
- Higher-level ecosystems (Java, R, and TensorFlow-based stacks) incur substantial overheads when they cannot fully exploit the available hardware resources.
- Inference and training efficiency diverge: C++ and PyTorch lead in inference, while Rust leads in training.
- Faster $\neq$ greener: execution time is not a reliable proxy for energy usage.
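The faster $\neq$ greener finding follows directly from energy = average power $\times$ time: a run that finishes sooner but drives the hardware harder can consume more energy overall. A sketch with hypothetical numbers (illustrative only, not measurements from the paper):

```python
def energy_joules(avg_power_w: float, runtime_s: float) -> float:
    # Energy (J) = average power draw (W) * wall-clock time (s).
    return avg_power_w * runtime_s

# Hypothetical runs: B finishes 30% faster but draws far more power.
run_a = energy_joules(avg_power_w=150.0, runtime_s=100.0)  # 15000.0 J
run_b = energy_joules(avg_power_w=280.0, runtime_s=70.0)   # 19600.0 J

print(run_a, run_b)  # the faster run B consumes more energy than A
```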
```
Java/deepgreen-dl4j/   # Java implementations (Deeplearning4j)
cpp/                   # C++ implementations (LibTorch)
julia/                 # Julia implementations (Flux, Lux)
matlab/                # MATLAB scripts (TF/Keras wrappers)
python/                # Python (PyTorch, TensorFlow, JAX)
R/                     # R (TensorFlow wrapper)
rust/                  # Rust (Burn)
dataloader/            # Unified data loading utilities
data/                  # Dataset links and preprocessing scripts
results/               # Experimental results (CSV, logs, figures)
README.md              # This file
```
```bash
git clone https://github.com/Pampaj7/DeepGreen.git
cd DeepGreen
conda env create -f environment.yml
conda activate deepgreen
```

Download the datasets (Fashion-MNIST, CIFAR-100, Tiny ImageNet) using the Python scripts provided in `dataloader/`.
This replication package includes:
- Source code for all implementations (Python, C++, Java, R, MATLAB, Rust).
- Scripts for automated training and inference runs.
- Environment specifications for each ecosystem.
- Raw energy logs and aggregated CSV data.
- Plotting scripts to reproduce all figures and tables from the paper.
If you use this package, please cite:
```bibtex
@article{pampaloni2025deepgreen,
  title   = {Deep Green AI: Energy Efficiency of Deep Learning across Programming Languages and Frameworks},
  author  = {Pampaloni, Leonardo and Pagliocca, Marco and Vicario, Enrico and Verdecchia, Roberto},
  journal = {Preprint},
  year    = {2025},
  doi     = {10.5281/zenodo.xxxxxxx}
}
```

This project is released under the MIT License.