
# Input Convex Graph Recurrent Neural Networks (ICGRNN)

An implementation of Graph Neural Networks with input convexity constraints, applied to heat diffusion control on graphs.

## What This Is

Standard GNNs are black boxes with no optimization guarantees. This project enforces input convexity: the network's weights are constrained to be non-negative and paired with convex, non-decreasing activations, so the learned function satisfies:

f(αx₁ + (1-α)x₂) ≤ αf(x₁) + (1-α)f(x₂)

This gives you global optimization guarantees and interpretable gradients, at the cost of some expressiveness. The architecture combines this constraint with graph message passing (spatial) and recurrent updates (temporal).
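This convexity property can be checked numerically. The sketch below (illustrative only, not the repo's code) builds a tiny two-layer network with non-negative weights and ReLU activations, then verifies Jensen's inequality across a range of interpolation coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = np.abs(rng.normal(size=(8, 2)))   # non-negative weights
b1 = rng.normal(size=8)                # biases are unconstrained
W2 = np.abs(rng.normal(size=(1, 8)))
b2 = rng.normal(size=1)

def f(x):
    """Scalar function built from non-negative weights + ReLU: convex in x."""
    return (W2 @ np.maximum(W1 @ x + b1, 0) + b2).item()

x1, x2 = rng.normal(size=2), rng.normal(size=2)
for alpha in np.linspace(0.0, 1.0, 11):
    lhs = f(alpha * x1 + (1 - alpha) * x2)
    rhs = alpha * f(x1) + (1 - alpha) * f(x2)
    assert lhs <= rhs + 1e-9  # Jensen's inequality holds at every alpha
```

The assertions pass because each building block preserves convexity: an affine map of the input is convex, ReLU is convex and non-decreasing, and a non-negative combination of convex functions is convex.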

## Architecture

Three components, each maintaining convexity:

- **Convex Layers**: Linear layers with relu(weight) to enforce non-negative weights
- **Graph Message Passing**: Degree-normalized aggregation with convexity-preserving constraints and skip connections
- **Recurrent Updates**: Temporal state evolution across multiple time steps
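The first two components can be sketched in a few lines. This is an assumed, simplified version (the function names and shapes here are illustrative, not the repo's API): the convex layer reparameterizes its weight through relu() so the effective weight is always non-negative, and aggregation averages neighbor features, which is a non-negative (hence convexity-preserving) combination.

```python
import numpy as np

def convex_linear(x, weight, bias):
    """Linear layer whose effective weight relu(weight) is non-negative."""
    return np.maximum(weight, 0) @ x + bias

def degree_normalized_aggregate(h, edge_index):
    """Mean of incoming neighbor features, i.e. (D^-1 A) h.

    Averaging is a convex combination of node features, so it preserves
    input convexity of the overall network.
    """
    n = h.shape[0]
    out = np.zeros_like(h)
    deg = np.zeros(n)
    for src, dst in zip(*edge_index):   # edge_index = (sources, destinations)
        out[dst] += h[src]
        deg[dst] += 1
    deg[deg == 0] = 1                   # isolated nodes keep zero features
    return out / deg[:, None]
```

Skip connections fit naturally here: adding the original (affine-in-input) features to the aggregated ones is a sum of convex functions, which stays convex.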

## Results

### Heat Diffusion Control (main contribution)

Modeling controlled heat diffusion on graphs over time — predicting both temperature evolution and optimal control inputs.

| Task | MSE |
| --- | --- |
| Temperature trajectory prediction | 0.0072 |
| Control input prediction | 0.0001 |

The control input prediction is where this architecture shines — convexity constraints are a natural fit for control optimization problems.

*(Figure: heat diffusion results)*

### Node Classification (baseline comparison)

| Dataset | ICGCN | Standard GCN |
| --- | --- | --- |
| Cora | 0.78 | 0.81 |
| PubMed | 0.73 | 0.79 |

The convexity constraint reduces accuracy by 3-6 points vs unconstrained GCN. This is the expected tradeoff — you lose expressiveness but gain guaranteed convexity, which matters for downstream optimization tasks (like the heat diffusion control above).

## Project Structure

```
icgrnn/
├── layers/           # Convex layer implementations
├── models/           # ICGRNN model architectures
├── message_passing/  # Convexity-preserving graph convolution
├── experiments/      # Training scripts per dataset
└── training/         # Training utilities (gradient clipping, NaN detection)
```

## Usage

```shell
pip install -r requirements.txt

# Heat diffusion control (main experiment)
python icgrnn/experiments/heat_diffusion.py

# Node classification baselines
python icgrnn/experiments/cora.py
python icgrnn/experiments/pubmed.py
```

```python
from models import ICGRNN

model = ICGRNN(
    input_dim=2,
    hidden_dim=64,
    output_dim=1,
    icnn_hidden_dims=[32, 32]
)

output, hidden = model(x, edge_index, steps=20)
```

## Why Convexity Matters

Most GNN applications don't need convexity — and for pure classification, unconstrained models will outperform this one. The value is in problems where you need optimization guarantees on top of graph learning: control systems, physics-informed modeling, or any setting where you want to solve argmin f(x) over the learned function and need that minimum to be global.
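The payoff of a globally valid minimum can be shown with a toy stand-in. In the sketch below a simple convex quadratic plays the role of a trained convex network; plain gradient descent on the input is guaranteed to reach the global minimizer, which is exactly what convexity buys for `argmin f(x)`:

```python
def f(x):
    """Convex stand-in for a learned convex function; global min at x = 3."""
    return float((x - 3.0) ** 2 + 1.0)

def grad(x, eps=1e-5):
    """Central-difference numeric gradient."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

x = 10.0                 # arbitrary starting point
for _ in range(500):
    x -= 0.1 * grad(x)   # plain gradient descent on the input

assert abs(x - 3.0) < 1e-3   # converged to the global minimum
```

With a non-convex model, the same loop could stall in a local minimum depending on the starting point; convexity removes that dependence.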

## About

Implementation of Input Convex Graph Recurrent Neural Networks with novel heat diffusion applications. Combines convex optimization principles with graph neural networks for guaranteed optimization properties and interpretable predictions.
