SLPE

This repository is the official PyTorch implementation of "Understanding and Improving Laplacian Positional Encodings For Temporal GNNs", featuring temporal graph neural networks with trajectory-based supra-Laplacian positional encoding.

📄 Paper: Understanding and Improving Laplacian Positional Encodings For Temporal GNNs (ECML-PKDD 2025)

Overview

SLPE introduces novel positional encoding techniques for temporal graphs by leveraging trajectory information in supra-adjacency matrices. The framework extends traditional graph neural networks to handle dynamic graphs more effectively by incorporating temporal structural information through eigenvector trajectories.
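To make the idea concrete, here is a minimal, self-contained sketch of a supra-Laplacian positional encoding. It is an illustration of the general construction, not the repository's actual implementation: the function name `supra_laplacian_pe`, the unit inter-layer weight, and the combinatorial (unnormalized) Laplacian are all assumptions. Each node is linked to its own copy in the next snapshot (optionally wrapping around when `circular=True`, mirroring the `--circular` flag), and the k smallest Laplacian eigenvectors serve as per-node, per-snapshot encodings.

```python
import numpy as np

def supra_laplacian_pe(adjs, k=4, circular=False):
    """Sketch: build a supra-adjacency matrix from T snapshot adjacency
    matrices (each n x n), connect every node to its own copy in the next
    snapshot, and return the k smallest Laplacian eigenvectors as a
    (T, n, k) positional-encoding tensor. Illustrative only."""
    T, n = len(adjs), adjs[0].shape[0]
    A = np.zeros((T * n, T * n))
    idx = np.arange(n)
    for t, At in enumerate(adjs):
        A[t*n:(t+1)*n, t*n:(t+1)*n] = At            # intra-snapshot edges
    for t in range(T - 1):                          # inter-layer "trajectory" links
        A[t*n + idx, (t+1)*n + idx] = 1.0
        A[(t+1)*n + idx, t*n + idx] = 1.0
    if circular and T > 2:                          # wrap last snapshot to the first
        A[(T-1)*n + idx, idx] = 1.0
        A[idx, (T-1)*n + idx] = 1.0
    L = np.diag(A.sum(axis=1)) - A                  # combinatorial supra-Laplacian
    _, vecs = np.linalg.eigh(L)                     # eigenvalues in ascending order
    return vecs[:, :k].reshape(T, n, k)
```

For a window of T snapshots over n nodes this yields one k-dimensional vector per node per snapshot, which can be concatenated to the node features before the GNN layers.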

Key Features

  • Temporal Positional Encoding: Novel approach using supra-Laplacian eigendecomposition with trajectory information
  • Multiple Graph Representations: Support for Regular, Concatenated, and Supra graph representations
  • Flexible Model Architecture: Integration with HTGN, SLATE, and other temporal GNN models
  • Comprehensive Dataset Support: Compatible with TGBL, DyGLib, and HTGN datasets
  • GPU Acceleration: CUDA support with CuPy for efficient eigendecomposition

Installation

Environment Setup

Create a conda environment using the provided configuration:

conda env create -f environment.yml
conda activate env  # use the environment name defined in environment.yml

Usage

Quick Start

Run the basic training script:

python src/train.py

Configuration

The framework supports extensive configuration through command-line arguments. Key parameters include:

Dataset Options

  • --dataset: Choose from TGBL datasets (tgbl-wiki, tgbl-review, tgbl-coin, etc.), DyGLib datasets (CanParl, USLegis, Flights, etc.), or HTGN datasets (enron10, dblp, as733, etc.)

Model Architecture

  • --model: Model type (default: HTGN)
  • --nhid: Hidden embedding dimension (default: 16)
  • --nfeat: Input feature dimension (default: 128)

Temporal Positional Encoding

  • --use_pes: Enable positional encoding features
  • --pes_kind: Type of PE (Laplacian, Random, Sinusoidal)
  • --graph_representation: Graph representation method (Regular, Concatenated, Supra)
  • --length_of_pe: Dimensionality of positional encoding (default: 8)
  • --pes_window_size: Temporal context window size (default: 3)
  • --use_traj: Enable trajectory-based eigenvector computation
  • --circular: Use circular temporal connections

Training Parameters

  • --max_epoch: Maximum training epochs (default: 500)
  • --lr: Learning rate (default: 0.01)
  • --patience: Early stopping patience (default: 50)
  • --repeat: Number of experimental runs (default: 1)

Example Commands

Train HTGN with supra-Laplacian positional encoding:

python src/train.py --model=HTGN --dataset=enron10 --use_pes=True --pes_kind=Laplacian --graph_representation=Supra --use_traj=True

Train SLATE model on TGBL dataset:

python src/train.py --model=SLATE --dataset=tgbl-wiki --use_pes=True --length_of_pe=16 --pes_window_size=5

Supported Datasets

TGBL Datasets

  • tgbl-wiki, tgbl-review, tgbl-coin, tgbl-flight, tgbl-comment

DyGLib Datasets

  • CanParl, USLegis, Flights, UNtrade, UNvote, Contacts

HTGN Datasets

  • enron10, dblp, as733, fbw, HepPh30

Architecture

Core Components

  1. TemporalPE: Handles temporal positional encoding computation
  2. EigenSolver: Manages eigendecomposition with multiple solver options
  3. Runner: Orchestrates training and evaluation procedures
  4. Model Implementations: HTGN, SLATE, and other temporal GNN variants
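The solver dispatch behind the GPU-accelerated eigendecomposition might look like the following sketch. The function name `eigh_smallest` and the try/except dispatch are assumptions for illustration; CuPy's `cupy.linalg.eigh` mirrors NumPy's API, so the GPU path is attempted first and the CPU path is used when CuPy is unavailable or no device is present.

```python
import numpy as np

def eigh_smallest(L, k):
    """Sketch of a GPU-first eigensolver: try CuPy's eigh on the GPU and
    fall back to NumPy on the CPU. Returns the k smallest eigenvalues and
    eigenvectors of a symmetric (supra-)Laplacian L. Illustrative only."""
    try:
        import cupy as cp                      # optional GPU path
        w, v = cp.linalg.eigh(cp.asarray(L))
        w, v = cp.asnumpy(w), cp.asnumpy(v)
    except Exception:                          # CuPy missing or no usable GPU
        w, v = np.linalg.eigh(L)               # CPU fallback, ascending order
    return w[:k], v[:, :k]
```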

Graph Representation Methods

  • Regular: Standard snapshot-based processing
  • Concatenated: Feature concatenation across time windows
  • Supra: Supra-adjacency matrix construction with inter-layer connections
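The three representations can be sketched as follows. This is an illustration of the descriptions above, not the repository's code: the function name `build_representation`, the choice of the latest snapshot's adjacency for the Concatenated mode, and the identity inter-layer blocks are all assumptions.

```python
import numpy as np

def build_representation(adjs, feats, mode="Supra"):
    """Sketch of the three --graph_representation options for a window of
    T snapshots. adjs: T arrays of shape (n, n); feats: T arrays of shape
    (n, d). Illustrative only."""
    T, n = len(adjs), adjs[0].shape[0]
    if mode == "Regular":
        # Each snapshot is processed on its own.
        return list(zip(adjs, feats))
    if mode == "Concatenated":
        # Node features stacked across the window (graph choice is assumed).
        return adjs[-1], np.concatenate(feats, axis=1)
    # Supra: block-diagonal adjacency plus identity links between
    # consecutive copies of each node.
    A = np.zeros((T * n, T * n))
    for t in range(T):
        A[t*n:(t+1)*n, t*n:(t+1)*n] = adjs[t]
        if t + 1 < T:
            A[t*n:(t+1)*n, (t+1)*n:(t+2)*n] = np.eye(n)
            A[(t+1)*n:(t+2)*n, t*n:(t+1)*n] = np.eye(n)
    return A, np.concatenate(feats, axis=0)
```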

Experiment Tracking

The framework integrates with Weights & Biases for comprehensive experiment tracking. Results include:

  • AUC and AP scores across multiple runs
  • Training curves and performance metrics
  • Hyperparameter configurations
  • Computational profiling information

Contributing

This project builds upon the following existing work:

  • HTGN - Hyperbolic Temporal Graph Networks

Citation

If you use this code in your research, please cite:

@article{galron2025understanding,
  title={Understanding and Improving Laplacian Positional Encodings For Temporal GNNs},
  author={Galron, Yaniv and Frasca, Fabrizio and Maron, Haggai and Treister, Eran and Eliasof, Moshe},
  journal={arXiv preprint arXiv:2506.01596},
  year={2025}
}

License

Please refer to the individual component licenses and the original HTGN repository for licensing information.
