
Knowledge Distillation to Mixture of Experts

This repository accompanies the Graph Knowledge Distillation to Mixture of Experts project. It contains code for knowledge distillation from a GNN to MLP, MoE, and RbM students.

Preparing datasets

To run experiments on the datasets used in the paper, download them from the links below and put them under data/ (see below for instructions on organizing the datasets).

  • DGL data (cora, citeseer, pubmed) are automatically downloaded.

  • CPF data (a-computer and a-photo): download the .npz files from here.

  • OGB data (ogbn-arxiv and ogbn-products): Datasets will be automatically downloaded when running the load_data function in dataloader.py. More details here.
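
Only the manually downloaded CPF archives need to be placed by hand; a minimal sketch, assuming the files go directly under data/ at the repository root (the file-name placeholder stands for whatever the download provides):

mkdir -p data
mv <DOWNLOADED_NPZ_FILES> data/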

How to run distillation

All the code was tested with Python 3.8.13.

python main_distill.py -d <DATASET_TYPE> -t <TEACHER_TYPE> -m <RUN_MODE> -s <STUDENT_TYPE>  --config <PATH_TO_CONFIG> [--reliable_sampling] [--positional_encoding] [--similarity_distill] [--adv_augment] [--label_propagation] [--gpu_id <GPU_ID>] [--seed <SEED>] [--batch_size <SIZE>]

<DATASET_TYPE> can be cora, citeseer, pubmed, amazon-com, amazon-photo, academic-cs, academic-physics, ogbn-arxiv or ogbn-products.
<TEACHER_TYPE> is either gcn or sage.
<RUN_MODE> is either inductive or transductive.
<STUDENT_TYPE> is one of mlp, moe or rbm.
<PATH_TO_CONFIG> is one of the run configs (see the config folder).
<GPU_ID> is the id of a GPU. If negative, the run uses the CPU.
<SEED> fixes the seed of the random generator. If negative, a random seed is used.
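
For instance, a transductive run distilling a gcn teacher into an rbm student on cora could look like the following (keep <PATH_TO_CONFIG> pointing at the matching file in the config folder; the GPU id and seed values are illustrative):

python main_distill.py -d cora -t gcn -m transductive -s rbm --config <PATH_TO_CONFIG> --gpu_id 0 --seed 42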

Our setup allows emulating the main baseline configurations.

To run the NOSMOG configuration use:

python main_distill.py -d <DATASET_TYPE> -m <RUN_MODE> --config <PATH_TO_CONFIG> -t sage -s mlp --positional_encoding --similarity_distill --adv_augment --batch_size 4096 [--gpu_id <GPU_ID>] [--seed <SEED>]

To run the KRD configuration use:

python main_distill.py -d <DATASET_TYPE> -m <RUN_MODE> --config <PATH_TO_CONFIG> -t sage -s mlp --reliable_sampling [--gpu_id <GPU_ID>] [--seed <SEED>]

Citation

If you find this project useful for your research, please use the following BibTeX entry.

@article{rumiantsev2024graph,
  title={Graph Knowledge Distillation to Mixture of Experts},
  author={Pavel Rumiantsev and Mark Coates},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2024},
  url={https://openreview.net/forum?id=vzZ3pbNRvh}
}

Acknowledgements

  1. NOSMOG: Learning Noise-robust and Structure-aware MLPs on Graphs (ArXiv, Code)
  2. Quantifying the Knowledge in GNNs for Reliable Distillation into MLPs (ArXiv, Code)
  3. Classifying Nodes in Graphs without GNNs (ArXiv, Code)
