
DropGNN: Graph Neural Networks with DropNode for Drug Discovery


This repository contains an implementation of DropGNN, a graph neural network approach with DropNode regularization for drug discovery tasks, based on the research paper "DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks".

Overview

DropGNN randomly drops entire nodes during training: the GNN is executed over several independent runs, each with a different random subset of nodes removed, and the per-run results are aggregated. This improves the expressiveness and performance of Graph Neural Networks (GNNs) on molecular property prediction tasks. Key features:

  • Implements DropNode regularization for GNNs
  • Improves model generalization
  • Enhances expressiveness of message passing GNNs
  • Particularly effective for drug discovery applications

The original paper demonstrates that DropGNN achieves state-of-the-art performance on several molecular property prediction benchmarks.
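The core idea can be sketched in a few lines of PyTorch. The helper below is purely illustrative (it is not the repository's actual API): each run zeroes out whole nodes with probability `p`, applies one round of message passing as a stand-in for a GNN layer, and the runs are averaged.

```python
import torch

def drop_node_runs(x, adj, p=0.1, num_runs=4):
    """DropGNN-style sketch (hypothetical helper): run message passing
    num_runs times, each time dropping whole nodes independently with
    probability p, then average the per-run outputs."""
    outs = []
    for _ in range(num_runs):
        # Bernoulli keep-mask over nodes; a dropped node sends and receives nothing
        keep = (torch.rand(x.size(0), 1) > p).float()
        h = x * keep
        # One adjacency multiplication as a stand-in for a GNN layer
        h = adj @ h
        outs.append(h)
    return torch.stack(outs).mean(dim=0)

# Toy 3-node path graph
x = torch.eye(3)
adj = torch.tensor([[0., 1., 0.],
                    [1., 0., 1.],
                    [0., 1., 0.]])
out = drop_node_runs(x, adj, p=0.1, num_runs=8)
print(out.shape)  # torch.Size([3, 3])
```

Because a dropped node changes which neighborhoods look identical, aggregating over several such runs lets the model distinguish graphs that a single deterministic message-passing pass cannot.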

Installation

  1. Clone this repository:

     git clone https://github.com/JaanuNan/DropGNN.git
     cd DropGNN

  2. Create and activate a virtual environment (recommended):

     python -m venv venv
     source venv/bin/activate  # On Windows use `venv\Scripts\activate`

Usage

Jupyter Notebook

The main implementation is provided in dropgnn.ipynb. Open it with Jupyter:

jupyter notebook dropgnn.ipynb

Python Script

Alternatively, you can use the standalone Python implementation:

python dropgnn.py

Configuration

Modify the hyperparameters in the notebook/script to experiment with different settings:

  • Dropout rate
  • Number of GNN layers
  • Hidden layer dimensions
  • Learning rate
  • Training epochs

Repository Structure

dropgnn-implementation/
├── dropgnn.ipynb     # Main implementation notebook
├── dropgnn.py        # Python implementation
├── documentation     # Implementation documentation
├── Research paper    # Paper PDF
└── LICENSE

Results

(Results figure available in the repository.)

Contributing

Contributions are welcome! Please open an issue or submit a pull request for any improvements.

Citation

If you use this implementation in your research, please cite the original paper:

@article{papp2021dropgnn,
  title={DropGNN: Random Dropouts Increase the Expressiveness of Graph Neural Networks},
  author={Papp, P{\'a}l Andr{\'a}s and Martinkus, Karolis and Faber, Lukas and Wattenhofer, Roger},
  journal={arXiv preprint arXiv:2111.06283},
  year={2021}
}

License

This project is licensed under the MIT License - see the LICENSE file for details.
