Commit 481d4d0
Graph Neural Network Architectures (#2)
* graph attention architecture is completed
* graph attention network architecture is cleaned up
* GNN architecture that combines local and global patterns
* hierarchical graph architecture is completed
* cleaned up the hierarchical class for linting pass
* initial sketch of graph transformer architecture (unmasked)
* first draft of the transformer GNN class is completed
* typing annotations upgraded for Python 3.10+
* multiscale graph neural network architecture is sketched out
* methods for upsampling and coarsening graph attributes (edge and node) are added
* aggregation of attention and weighted conditions added as a separate method
* multiscale GNN architecture is completed and cleaned up
* GNN models converted to private files, one file per model
* docstring for the init file is completed; docstring linting errors checked
* trailing whitespace removed
* graph attention network renamed
* heterogeneous GNN: HeteroData import and the attention-mechanism architecture initialized
* module initializer updated with the HeteroConv class import
* simplified version of HGNN without attention weights
* method for retrieving attention weights completed
* removed the older transformer format for the GNN architecture; module initializer updated with the new transformer class
* heterogeneous implementation of the transformer graph neural network is completed
* TODO flag added for checking the attention-weights method
* attention-weight retrieval rewritten; repeated rounds of linting fixes applied
* homogeneous GNN architectures removed from the GNN module; initializer updated with the existing classes
* tests for verifying the HGAT class import and its methods
* minimal test of the heterogeneous graph attention architecture, plus tests for the forward method and the attention-weights method
* debugging to return weights from the GATConv layer for edge types
* graph attention reimplemented to check intermediate layers and last-layer heads
* test file written to check the graph attention architecture
* unused import of BatchNorm removed; import line lengths shortened for linting
* classes imported from the gnn module into src; fixed the import of the graph attention class in tests
* edge-dimension attribute added to the convolution layer; all TODO flags addressed
* typing imports updated for recent Python versions
* test functions for the graph transformer (PyG): minimal dummy transformer graph, forward pass, and attention-weight extraction for nodes and edges
* aggregation attribute in the initializer updated to sum
* forward-pass test fixed to pass edge-attribute features; attention-weights test un-indented and upgraded with edge-attribute weights
* attention-weights method can take an optional edge-attribute argument
* transformer debugging completed; edge projection functions correctly
* get-attention arguments corrected; forward and get-attention methods updated for the new edge-projection implementation
* tests completed and passed successfully; remaining line-length linting errors addressed
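The commit message above repeatedly mentions retrieving attention weights from a graph attention layer. As a minimal illustration of that general technique (a NumPy sketch of a single-head GAT-style layer; all names and shapes here are hypothetical and not the repo's code, which uses PyG's `GATConv`), the layer can compute softmax-normalized coefficients per neighborhood and return them alongside the node update:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def gat_layer(h, edges, W, a, return_attention=False):
    """Single-head GAT-style layer (illustrative sketch, not the repo's code).

    h:     (N, F) node features
    edges: list of (src, dst) pairs
    W:     (F, F_out) shared linear projection
    a:     (2 * F_out,) attention vector
    """
    z = h @ W                          # project node features
    n_nodes = h.shape[0]
    out = np.zeros((n_nodes, W.shape[1]))
    attn = {}                          # (src, dst) -> attention coefficient
    for dst in range(n_nodes):
        nbrs = [s for s, d in edges if d == dst]
        if not nbrs:
            continue
        # unnormalized scores e_ij = LeakyReLU(a^T [z_i || z_j])
        scores = np.array([np.concatenate([z[s], z[dst]]) @ a for s in nbrs])
        scores = np.where(scores > 0, scores, 0.2 * scores)  # LeakyReLU
        alpha = softmax(scores)        # normalize over the neighborhood
        for s, w in zip(nbrs, alpha):
            attn[(s, dst)] = w
            out[dst] += w * z[s]       # attention-weighted aggregation
    return (out, attn) if return_attention else out

rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))
edges = [(0, 1), (2, 1), (3, 1)]       # node 1 has three in-neighbors
W = rng.normal(size=(8, 4))
a = rng.normal(size=(8,))
out, attn = gat_layer(h, edges, W, a, return_attention=True)
print(round(sum(w for (s, d), w in attn.items() if d == 1), 6))  # → 1.0
```

The `return_attention` flag mirrors the idea behind a "get attention weights" method: the coefficients are a by-product of the forward pass, so exposing them costs nothing extra.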
1 parent e17e699 commit 481d4d0

7 files changed

Lines changed: 1201 additions & 1 deletion

File tree

src/scheduling_rlgnn/__init__.py

Lines changed: 9 additions & 1 deletion
@@ -8,4 +8,12 @@
 __email__ = "amir.navid.rahimi@googlemail.com"
 
 from .rllib_subclasses import *
-from .gnn_models import *
+from .gnn_models import (
+    HeterogeneousGraphAttentionNetwork,
+    HeteroGraphTransformer,
+)
+
+__all__ = [
+    "HeterogeneousGraphAttentionNetwork",
+    "HeteroGraphTransformer",
+]
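The diff above swaps a wildcard import for explicit names plus an `__all__` list. The effect can be sketched with a throwaway in-memory module (a stand-in built here for illustration; the real classes live in the repo and are not reproduced): `from module import *` copies exactly the names listed in `__all__`, so underscore-prefixed internals never leak.

```python
import types

# Hypothetical stand-in for the package's gnn_models module.
gnn_models = types.ModuleType("gnn_models")
exec(
    "class HeterogeneousGraphAttentionNetwork: pass\n"
    "class HeteroGraphTransformer: pass\n"
    "class _PrivateHelper: pass\n"
    "__all__ = ['HeterogeneousGraphAttentionNetwork',"
    " 'HeteroGraphTransformer']\n",
    gnn_models.__dict__,
)

# `from gnn_models import *` would bind exactly the names in __all__:
exported = {name: getattr(gnn_models, name) for name in gnn_models.__all__}
print(sorted(exported))  # _PrivateHelper is excluded from the star import
```

Re-declaring `__all__` in the top-level `__init__.py` keeps the public API explicit even though the parent package still star-imports `rllib_subclasses`.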
Lines changed: 27 additions & 0 deletions
@@ -1 +1,28 @@
 """GNN model architectures."""
+
+from ._heterogeneous_graph_attention_network import (
+    HeterogeneousGraphAttentionNetwork,
+)
+from ._hetero_graph_transformer import HeteroGraphTransformer
+
+"""
+This package provides various Graph Neural Network (GNN) model
+architectures for use in graph-based learning tasks.
+
+Available architectures:
+- GraphAttentionNetwork: Implements a GNN using attention
+  mechanisms to weigh node neighbors.
+- GraphTransformer: A transformer-based model adapted for
+  graph-structured data.
+- HierarchicalGNN: A hierarchical GNN that captures
+  multi-level graph representations.
+- MultiScaleGNN: A GNN architecture designed to process
+  information at multiple graph scales.
+
+These models can be imported directly from this package for
+use in graph learning pipelines.
+"""
+__all__ = [
+    "HeterogeneousGraphAttentionNetwork",
+    "HeteroGraphTransformer",
+]
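Several commits above concern the transformer architecture's edge projection ("attention weight method can take an optional attribute of the edge attribute", "edge projection functions correctly"). The general idea, which PyG's `TransformerConv` also follows when `edge_dim` is set, is to project raw edge attributes into the model dimension and add them to the keys and values before scaled dot-product attention. A NumPy sketch for a single destination node (all parameter names here are illustrative, not the repo's API):

```python
import numpy as np

def transformer_conv_edge(q_dst, k_src, v_src, e_attr, We):
    """Edge-aware attention for one destination node (illustrative sketch).

    q_dst:  (d,) query for the destination node
    k_src:  (m, d) keys of the m source nodes
    v_src:  (m, d) values of the m source nodes
    e_attr: (m, de) raw edge attributes
    We:     (de, d) edge projection matrix
    """
    e = e_attr @ We                    # project edges into model space
    k = k_src + e                      # edge-aware keys
    v = v_src + e                      # edge-aware values
    d = q_dst.shape[0]
    scores = k @ q_dst / np.sqrt(d)    # scaled dot-product scores
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()               # softmax over incoming edges
    return alpha @ v, alpha            # message and attention weights

rng = np.random.default_rng(1)
out, alpha = transformer_conv_edge(
    rng.normal(size=16),               # query
    rng.normal(size=(3, 16)),          # keys for 3 neighbors
    rng.normal(size=(3, 16)),          # values for 3 neighbors
    rng.normal(size=(3, 4)),           # 4-dim edge attributes
    rng.normal(size=(4, 16)),          # edge projection
)
print(round(float(alpha.sum()), 6))    # → 1.0
```

Returning `alpha` alongside the message is what makes an optional edge attribute composable with an attention-weight retrieval method: the same projected edge term shapes both the output and the reported weights.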

0 commit comments