Initial_GNN_Fingers_Implementation #18
📋 Summary
This PR introduces significant enhancements to the GNNFingers defense framework, implementing discrete adjacency matrix optimization and clipped node feature integration while laying the groundwork for supporting multiple graph types. Key improvements include:
- Discrete Adjacency Optimization: top-k gradient-based edge selection with a straight-through estimator
- Clipped Node Feature Integration: feature value constraints derived from the original data distribution
- Alternating Joint Learning: improved optimization strategy for fingerprint and univerifier co-training
- Architecture Foundation: flexible structure supporting future extension to multiple graph types
🧪 Related Issues
- Implements discrete optimization strategies from the GNNFingers paper
- Addresses feature value stability through clipping mechanisms
- Enhances training stability with alternating optimization
✅ Checklist
- My code follows the project's coding style
- The PR is made from a feature branch, not `main`
🧠 Additional Context (Optional)
Discrete Adjacency Updates (`_update_adjacency_discrete`), sketched after this list:
- Top-k gradient-based edge selection (configurable via `top_k_ratio`)
- Straight-through estimator for differentiable discrete operations
- Bidirectional edge flipping based on gradient signs
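A minimal PyTorch sketch of how such a discrete update could look; the standalone function name, signature, and `top_k_ratio` default are illustrative assumptions rather than the exact code in this PR:

```python
import torch

def update_adjacency_discrete(adj: torch.Tensor, adj_grad: torch.Tensor,
                              top_k_ratio: float = 0.05) -> torch.Tensor:
    """Hypothetical sketch: flip the top-k adjacency entries whose gradients
    most favor a change, keeping the result discrete via a straight-through
    estimator."""
    k = max(1, int(top_k_ratio * adj.numel()))

    with torch.no_grad():
        # Gradient descent moves entries against the gradient, so a present
        # edge (value 1) benefits from flipping when its gradient is positive,
        # and an absent edge (value 0) when its gradient is negative.
        flip_score = torch.where(adj > 0.5, adj_grad, -adj_grad).flatten()

        # Pick the k entries where flipping is most beneficial and flip them
        # in both directions (add absent edges, remove present ones).
        _, idx = torch.topk(flip_score, k)
        adj_hard = adj.flatten().clone()
        adj_hard[idx] = 1.0 - adj_hard[idx]
        adj_hard = adj_hard.view_as(adj)

    # Straight-through estimator: the forward value is the discrete matrix,
    # while gradients flow back to the continuous `adj` as if it were unchanged.
    return adj + (adj_hard - adj).detach()
```

Restricting the selection to entries with a positive flip score would additionally prevent flips that the gradient does not actually support.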
Clipped Feature Integration (see the sketch after this list):
- Dynamic value clamping based on the original feature distribution
- Configurable bounds for synthetic feature generation
- Gradient-aware feature updates
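A rough sketch of a gradient-aware feature update that clamps synthetic features to the value range observed in the original data; the function name and the `lr` and `margin` parameters are assumptions for illustration:

```python
import torch

def update_features_clipped(features: torch.Tensor, feat_grad: torch.Tensor,
                            lr: float, orig_features: torch.Tensor,
                            margin: float = 0.0) -> torch.Tensor:
    """Hypothetical sketch: gradient step on synthetic node features followed
    by per-dimension clamping to the original feature distribution."""
    # Per-dimension bounds derived from the original data, optionally widened
    # by a configurable margin.
    feat_min = orig_features.min(dim=0).values - margin
    feat_max = orig_features.max(dim=0).values + margin

    # Gradient-aware update on the synthetic features.
    updated = features - lr * feat_grad

    # Clamp each feature dimension into the observed value range.
    return torch.clamp(updated, min=feat_min, max=feat_max)
```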
Alternating Optimization (`_joint_learning_alternating`), sketched below:
- Separate update cycles for fingerprints and the univerifier
- Configurable epoch ratios for each component
- Loss-based convergence monitoring
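A compact sketch of the alternating schedule; the optimizer objects, the `compute_loss` callable, and the epoch/tolerance parameters are placeholders to illustrate the control flow, not the actual interface in this PR:

```python
def joint_learning_alternating(fingerprints, univerifier,
                               fp_optimizer, uv_optimizer, compute_loss,
                               n_rounds=100, fp_epochs=1, uv_epochs=5, tol=1e-4):
    """Hypothetical sketch: alternate between fingerprint and univerifier
    updates, stopping when the round-to-round loss change falls below `tol`."""
    prev_loss = float("inf")
    for _ in range(n_rounds):
        # Phase 1: update the fingerprint graphs; only fp_optimizer steps,
        # so the univerifier parameters stay fixed.
        for _ in range(fp_epochs):
            fp_optimizer.zero_grad()
            loss = compute_loss(fingerprints, univerifier)
            loss.backward()
            fp_optimizer.step()

        # Phase 2: update the univerifier with the fingerprints held fixed.
        for _ in range(uv_epochs):
            uv_optimizer.zero_grad()
            loss = compute_loss(fingerprints, univerifier)
            loss.backward()
            uv_optimizer.step()

        # Loss-based convergence monitoring across rounds.
        cur_loss = loss.item()
        if abs(prev_loss - cur_loss) < tol:
            break
        prev_loss = cur_loss
    return fingerprints, univerifier
```

The ratio of `fp_epochs` to `uv_epochs` plays the role of the configurable epoch ratio mentioned above.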