Effect of Sparsity in Weight Matrices in Neural Networks

Deep Learning Project – University of Basel (Fall 2025)


Overview

This project investigates how sparse weight matrices influence the behavior, efficiency, and performance of neural networks.
We systematically apply different sparsity levels to a fully connected neural network and analyze:

  • Training dynamics
  • Final model accuracy & generalization
  • Computation & memory efficiency
  • Gradient flow and stability

The project is part of the Foundations in Deep Learning course and is implemented in PyTorch.
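
As a rough illustration of how sparsity levels can be imposed on a fully connected network, the sketch below uses PyTorch's built-in `torch.nn.utils.prune` to zero a fixed fraction of each linear layer's weights. The layer sizes and the 0.8 sparsity level are placeholder assumptions; the project's own masking scheme may differ.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# A small fully connected network; layer sizes are illustrative and not
# necessarily those used in this project.
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

sparsity = 0.8  # fraction of weights zeroed out; swept over several levels
for module in model.modules():
    if isinstance(module, nn.Linear):
        # Registers a binary mask on `weight`; masked entries receive zero
        # gradient, so they stay zero throughout training.
        prune.random_unstructured(module, name="weight", amount=sparsity)

# Check the effective sparsity of the first layer's weight matrix.
first = model[0].weight
print(f"fraction of zeros in layer 0: {(first == 0).float().mean().item():.2f}")
```

Using the pruning utility (rather than multiplying by a hand-rolled mask in the forward pass) keeps the mask attached to the module, so the same training loop can be reused unchanged across sparsity levels.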


Research Questions

  1. How does sparsity influence model accuracy?
  2. Up to what sparsity level can a model match the performance of a dense network?
  3. What is the effect on gradient norms, convergence, and stability? (See the measurement sketch after this list.)
  4. Does sparsity improve or worsen generalization?
  5. How much training time and memory can be saved?
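
For question 3, gradient behavior can be tracked with a small helper like the one below. This is a generic sketch rather than the project's actual instrumentation, and `global_grad_norm` is a hypothetical name.

```python
import torch
import torch.nn as nn

def global_grad_norm(model: nn.Module) -> float:
    """L2 norm over all parameter gradients; call after loss.backward()."""
    total = 0.0
    for p in model.parameters():
        if p.grad is not None:
            total += p.grad.detach().pow(2).sum().item()
    return total ** 0.5

# Typical use inside a training step (loss and optimizer are placeholders):
#   loss.backward()
#   grad_norm_history.append(global_grad_norm(model))
#   optimizer.step()
```

Logging this value once per step, for each sparsity level, gives directly comparable curves for convergence speed and gradient stability.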

Repository Structure


Project Checklist
