This repository contains an implementation of DropGNN, a graph neural network approach with DropNode regularization for drug discovery tasks, based on the research paper "DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks".
DropGNN introduces random dropouts of entire nodes during training to improve the expressiveness and performance of Graph Neural Networks (GNNs) on molecular property prediction tasks. Key features:
- Implements DropNode regularization for GNNs
- Improves model generalization
- Enhances expressiveness of message passing GNNs
- Particularly effective for drug discovery applications
The original paper demonstrates that the DropGNN framework improves performance on several graph classification and molecular property prediction benchmarks.
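The core DropNode idea described above, dropping entire nodes (their features and incident edges) and averaging predictions over several independent dropout runs, can be sketched as follows. This is a minimal NumPy illustration under assumed names (`drop_nodes`, `dropgnn_forward`), not the actual code in dropgnn.ipynb:

```python
import numpy as np

def drop_nodes(features, adjacency, p=0.1, rng=None):
    """Zero out entire nodes at random, in the spirit of DropGNN's DropNode.

    `features` is (num_nodes, dim) and `adjacency` is (num_nodes, num_nodes);
    these names are illustrative, not taken from the repository.
    """
    rng = rng or np.random.default_rng()
    keep = rng.random(features.shape[0]) >= p               # Bernoulli keep mask
    features = features * keep[:, None]                     # drop node features
    adjacency = adjacency * keep[:, None] * keep[None, :]   # drop incident edges
    return features, adjacency

def dropgnn_forward(gnn, features, adjacency, num_runs=4, p=0.1, rng=None):
    """Average a GNN's predictions over several independent dropout runs."""
    outs = [gnn(*drop_nodes(features, adjacency, p, rng)) for _ in range(num_runs)]
    return np.mean(outs, axis=0)
```

At test time, averaging over multiple runs makes differently-dropped views of the same graph distinguishable to the network, which is what increases expressiveness beyond standard message passing.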
- Clone this repository:

```shell
git clone https://github.com/JaanuNan/DropGNN.git
cd DropGNN
```

- Create and activate a virtual environment (recommended):

```shell
python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`
```

The main implementation is provided in dropgnn.ipynb. Open it with Jupyter:

```shell
jupyter notebook dropgnn.ipynb
```

Alternatively, you can use the standalone Python implementation:

```shell
python dropgnn.py
```

Modify the hyperparameters in the notebook/script to experiment with different settings:
- Dropout rate
- Number of GNN layers
- Hidden layer dimensions
- Learning rate
- Training epochs
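The tunable settings above could be collected into a single configuration object along these lines. The key names and default values here are illustrative assumptions; the actual names in dropgnn.ipynb / dropgnn.py may differ:

```python
# Hypothetical hyperparameter configuration; adjust to match the
# variable names actually used in the notebook or script.
config = {
    "dropout_rate": 0.1,    # probability of dropping each node per run
    "num_layers": 4,        # number of GNN message-passing layers
    "hidden_dim": 64,       # hidden layer dimensions
    "learning_rate": 1e-3,  # optimizer learning rate
    "epochs": 100,          # training epochs
}
```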
```
dropgnn-implementation/
├── dropgnn.ipynb     # Main implementation notebook
├── dropgnn.py        # Python implementation
├── documentation     # Implementation documentation
├── Research paper    # Paper PDF
└── LICENSE
```
Contributions are welcome! Please open an issue or submit a pull request for any improvements.
If you use this implementation in your research, please cite the original paper:
```bibtex
@article{papp2021dropgnn,
  title={DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks},
  author={Papp, P{\'a}l Andr{\'a}s and Martinkus, Karolis and Faber, Lukas and Wattenhofer, Roger},
  journal={arXiv preprint arXiv:2111.06283},
  year={2021}
}
```

This project is licensed under the MIT License - see the LICENSE file for details.
