Issue Description:
There is currently a mismatch between the intended and actual usage of input and hidden channels in the `ResGatedGraphConvNetBase` class implementation.
## 🧩 Problem Description
- **Why are input channels needed if we have `n_atom_properties`?**
  In our current setup, `n_atom_properties` effectively acts as the number of input channels for the first convolution layer. This raises the question of whether we need an explicit `in_length` parameter at all, since it is not used directly. The only place it appears meaningful is in the code, where it is incorrectly applied to multiple convolution layers.
- **Incorrect channel assignment in the previous version:**
  In the current implementation, all convolution layers used `in_length` as the input channel size, and only the final layer transitioned to `hidden_length`. This does not follow the conventional pattern used in PyG's `BasicGNN`, which is:
  - First layer: `input_channels` (`n_atom_properties` in our case) → `hidden_channels`
  - Hidden layers: `hidden_channels` → `hidden_channels`
  - (Optionally) Final layer: `hidden_channels` → `output_channels` or similar
- **Off-by-one error in the number of convolution layers:**
  If the user specifies `n_conv_layers=3`, the previous code creates 4 layers due to an extra one outside the loop (`n + 1` instead of `n`). This is misleading and could result in unintended architectural depth.
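To make the two problems concrete, here is a minimal sketch of the problematic construction. This is a hypothetical reconstruction of the reported behavior, not the actual source; `Conv` is a stand-in for PyG's `ResGatedGraphConv`, used only to make the channel bookkeeping visible.

```python
class Conv:
    """Stand-in for torch_geometric.nn.ResGatedGraphConv; records channel sizes."""
    def __init__(self, in_channels, out_channels):
        self.in_channels = in_channels
        self.out_channels = out_channels

def build_convs_buggy(in_length, hidden_length, n_conv_layers):
    """Mirrors the reported bug: every in-loop layer keeps in_length as both
    input and output size, and one extra layer is appended outside the loop,
    yielding n + 1 layers instead of n."""
    convs = [Conv(in_length, in_length) for _ in range(n_conv_layers)]
    convs.append(Conv(in_length, hidden_length))  # extra layer -> off by one
    return convs

convs = build_convs_buggy(in_length=16, hidden_length=32, n_conv_layers=3)
print(len(convs))  # 4 layers, not the requested 3
print([(c.in_channels, c.out_channels) for c in convs])
```

Note that `hidden_length` only ever appears as the output of the surplus final layer, which is the opposite of the `BasicGNN` convention described above.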
## ✅ Proposed Fix
- Use `n_atom_properties` as the number of input channels.
- Remove `in_length` if it's redundant.
- Ensure the first convolution layer maps from `n_atom_properties` → `hidden_length`, and all subsequent layers use `hidden_length` → `hidden_length`.
- Fix the layer creation logic to respect `n_conv_layers` exactly (i.e., create exactly `n` layers).
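The fix above can be sketched as follows. Again this is an illustrative sketch, not the actual patch; `Conv` stands in for `ResGatedGraphConv`, and the function name `build_convs` is assumed for illustration.

```python
class Conv:
    """Stand-in for torch_geometric.nn.ResGatedGraphConv; records channel sizes."""
    def __init__(self, in_channels, out_channels):
        self.in_channels = in_channels
        self.out_channels = out_channels

def build_convs(n_atom_properties, hidden_length, n_conv_layers):
    """First layer: n_atom_properties -> hidden_length; all remaining layers:
    hidden_length -> hidden_length; exactly n_conv_layers layers in total."""
    convs = [Conv(n_atom_properties, hidden_length)]
    convs += [Conv(hidden_length, hidden_length) for _ in range(n_conv_layers - 1)]
    return convs

convs = build_convs(n_atom_properties=16, hidden_length=32, n_conv_layers=3)
print(len(convs))  # exactly 3, as requested
print([(c.in_channels, c.out_channels) for c in convs])
```

With this pattern `in_length` is no longer needed anywhere, and requesting `n_conv_layers=1` yields a single `n_atom_properties → hidden_length` layer rather than two.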