A deep learning framework for reconstructing fluid flow fields from sparse measurements using Graph Transformer (and GNN) architectures. FRGT combines the strengths of graph-based representations with attention mechanisms to predict complete flow fields from sparse observations.
FRGT (Flow Reconstructing Graph Transformer) is designed to reconstruct complete fluid flow fields from sparse sensor measurements on 2D airfoils. The model leverages:
- Graph Neural Networks for handling irregular mesh geometries
- Transformer attention for long-range dependencies
- Interchangeable processor architectures (Reversible GNN, Hybrid Graph Transformer)
- Efficient reconstruction from sparse pressure measurements
- Multiple processor types: HGT, Interleaved HGT, or Reversible GNN
- Optional physics-aware training using a divergence loss
- Configurable sensor coverage on the airfoil surface (0-100%)
```
FRGT/
├── README.md                     # Project documentation
├── LICENSE.md                    # MIT License
├── requirements.txt              # Python dependencies
├── train.py                      # Main training script
├── gt_archi.png                  # Architecture diagram
├── configs/                      # Configuration files
│   ├── config_frgt.yml           # Standard FRGT config
│   ├── config_frgt_it.yml        # Interleaved transformer config
│   └── config_revGAT.yml         # Reversible GNN config
├── models/                       # Model architectures
│   ├── __init__.py
│   ├── frgt.py                   # Main FRGT model
│   ├── encoder.py                # Feature encoder
│   ├── decoder.py                # Output decoder
│   ├── hgtprocessor.py           # Hybrid Graph Transformer
│   ├── revprocessor.py           # Reversible GNN processor
│   ├── feature_propagation.py    # Feature propagation utilities
│   └── building_blocks.py        # Neural network components
├── datatools/                    # Data handling utilities
│   ├── __init__.py
│   ├── dataset.py                # CFD dataset loader
│   ├── parse_mesh.py             # Mesh parsing utilities
│   ├── process_simulations.py    # Simulation preprocessing
│   └── compute_dataset_stats.py  # Dataset statistics
├── utils/                        # Utility functions
│   ├── __init__.py
│   └── single_model_plotter.py   # Visualization tools
├── notebooks/                    # Jupyter notebooks
│   └── results_viz.ipynb         # Results visualization
└── runs/                         # Training outputs
    └── proc_*/                   # Individual training runs
        ├── config.yml            # Run configuration
        └── trained_models/       # Model checkpoints
            ├── best.pt           # Best validation model
            └── e*.pt             # Epoch checkpoints
```
- Python 3.8 or higher
- CUDA-compatible GPU (recommended)
- 16GB+ RAM for large datasets
1. Clone the repository:

   ```bash
   git clone https://github.com/gduthe/FRGT.git
   cd FRGT
   ```

2. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

3. Verify the installation:

   ```bash
   python -c "import torch; import torch_geometric; print('Installation successful!')"
   ```
FRGT expects CFD simulation data in a specific format:
- Input: ZIP file containing PyTorch geometric graphs (.pt files)
- Node features: Pressure, velocities (u,v), signed distance function (SDF)
- Edge features: Relative distances, face surfaces
- Global features: Estimated free-stream velocity
Each graph contains:
- `x`: Node features `[pressure, sdf]` (input)
- `y`: Target features `[pressure, u_velocity, v_velocity]`
- `edge_index`: Graph connectivity
- `edge_attr`: Edge features `[dx, dy, face_surface]`
- `pos`: Node coordinates
- `node_type`: Node classification (0 = fluid, 1 = boundary)
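To make the expected per-graph fields concrete, here is a minimal sketch of a toy 3-node graph using plain Python lists in place of the PyTorch Geometric tensors the repository actually uses (all values are illustrative):

```python
# Toy 3-node graph mirroring the fields described above.
# Plain lists stand in for torch tensors; in PyTorch Geometric,
# edge_index would be a [2, num_edges] tensor rather than edge pairs.
graph = {
    "x": [[0.5, 0.1], [0.2, 0.0], [0.8, 0.3]],                   # [pressure, sdf] per node
    "y": [[0.5, 1.0, 0.0], [0.2, 0.9, 0.1], [0.8, 1.1, -0.1]],   # [p, u, v] targets
    "edge_index": [[0, 1], [1, 2]],                               # connectivity (i -> j)
    "edge_attr": [[0.1, 0.0, 0.05], [0.0, 0.1, 0.05]],           # [dx, dy, face_surface]
    "pos": [[0.0, 0.0], [0.1, 0.0], [0.1, 0.1]],                 # node coordinates
    "node_type": [0, 1, 0],                                       # 0 = fluid, 1 = boundary
}

# Basic shape-consistency checks
n_nodes = len(graph["x"])
assert len(graph["y"]) == n_nodes == len(graph["pos"]) == len(graph["node_type"])
assert len(graph["edge_index"]) == len(graph["edge_attr"])
```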
For this research, CFD simulations were generated using OpenFOAM with:
- Geometry: airfoils from the UIUC dataset with varying angles of attack
- Solver: Incompressible, steady-state k-omega SST
- Mesh: Unstructured triangular meshes
- Boundary conditions: Far-field velocity inlet, pressure outlet
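For reference, selecting the k-omega SST model corresponds to a `constant/turbulenceProperties` dictionary along these lines (standard OpenFOAM syntax; the exact case setup used to generate the dataset may differ):

```
simulationType  RAS;

RAS
{
    RASModel        kOmegaSST;
    turbulence      on;
    printCoeffs     on;
}
```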
The training dataset can be obtained from Zenodo. If you have OpenFOAM simulations, you can use the preprocessing pipeline in datatools/ to create custom datasets.
1. Prepare your dataset and update the paths in the config file:

   ```yaml
   io_settings:
     train_dataset_path: 'path/to/train_dataset.zip'
     valid_dataset_path: 'path/to/valid_dataset.zip'
   ```

2. Run training:

   ```bash
   python train.py --config configs/config_frgt.yml
   ```
- **HGT** (`processor_type: 'hgt'`): Hybrid Graph Transformer
- **Reversible GNN** (`processor_type: 'rev'`): Memory-efficient reversible layers
- **Interleaved HGT** (`processor_type: 'hgt_it'`): Interleaved attention layers
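A minimal sketch of how such a `processor_type` switch might be wired internally; the class and function names here are illustrative placeholders, not the repository's actual API:

```python
# Hypothetical processor registry keyed by the config string.
# Stub classes stand in for the real processor modules.
class HGTProcessor:
    pass

class ReversibleGNNProcessor:
    pass

class InterleavedHGTProcessor:
    pass

_PROCESSORS = {
    "hgt": HGTProcessor,                 # Hybrid Graph Transformer
    "rev": ReversibleGNNProcessor,       # reversible layers
    "hgt_it": InterleavedHGTProcessor,   # interleaved attention
}

def build_processor(processor_type: str):
    """Instantiate the processor matching the config's processor_type."""
    if processor_type not in _PROCESSORS:
        raise ValueError(f"unknown processor_type: {processor_type!r}")
    return _PROCESSORS[processor_type]()
```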
```yaml
hyperparameters:
  batch_size: 1           # Graphs per batch
  epochs: 500             # Training epochs
  start_lr: 5e-4          # Initial learning rate
  airfoil_coverage: 1.0   # Sensor coverage (0.0-1.0)

model_settings:
  latent_dim: 160         # Hidden dimension
  fp_steps: 30            # Feature propagation steps
  processor_type: 'hgt'   # Architecture type
```

To train with, e.g., 70% airfoil sensor coverage, set `airfoil_coverage: 0.7` in the config and run:

```bash
python train.py --config configs/config_frgt.yml
```

Optional physics-aware and regularization settings:

```yaml
hyperparameters:
  div_loss_factor: 0.1    # Enable divergence loss

model_settings:
  noise_sigma: 0.01       # Add noise during training
```

Training produces:
- Model checkpoints: `runs/proc_*/trained_models/`
- Best model: `best.pt` (lowest validation loss)
- Configuration: `config.yml` (run parameters)
- Periodic saves: `e*.pt` (every N epochs)
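On the physics-aware option enabled by `div_loss_factor`: a divergence penalty encourages the predicted velocity field to be (approximately) divergence-free, as an incompressible flow should be. The following is a minimal edge-based sketch of such a penalty in plain Python; it is an illustration of the idea only, not the repository's implementation, which may use the mesh face surfaces instead:

```python
# Hedged sketch: penalize a discrete divergence proxy on a graph.
# For each node, accumulate directional velocity differences along
# incident edges (weighted by edge geometry dx, dy), then take the
# mean squared value over nodes with at least one edge.
def divergence_loss(edges, edge_geom, u, v):
    """edges: list of (i, j) node pairs; edge_geom: list of (dx, dy)
    per edge; u, v: per-node velocity components."""
    n = len(u)
    div = [0.0] * n
    count = [0] * n
    for (i, j), (dx, dy) in zip(edges, edge_geom):
        d2 = dx * dx + dy * dy
        if d2 == 0:
            continue
        # directional derivative proxy along the edge
        du = u[j] - u[i]
        dv = v[j] - v[i]
        div[i] += (du * dx + dv * dy) / d2
        count[i] += 1
    vals = [d / c for d, c in zip(div, count) if c > 0]
    if not vals:
        return 0.0
    return sum(x * x for x in vals) / len(vals)

# A uniform flow has zero divergence, so the penalty vanishes:
assert divergence_loss([(0, 1), (1, 2)], [(1.0, 0.0), (1.0, 0.0)],
                       [1.0, 1.0, 1.0], [0.0, 0.0, 0.0]) == 0.0
```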
Use the provided visualization notebook:

```bash
jupyter notebook notebooks/results_viz.ipynb
```

Or implement custom evaluation:

```python
import torch
from models import FRGT
from datatools import CFDGraphsDataset

# Load a trained model (replace proc_* with an actual run directory;
# config is a dict of model keyword arguments from that run's config.yml)
model = FRGT(**config)
checkpoint = torch.load('runs/proc_*/trained_models/best.pt')
model.load_state_dict(checkpoint['model_state_dict'])

# Evaluate on test data
test_dataset = CFDGraphsDataset('path/to/test_dataset.zip')
# ... evaluation code
```

If you use this code in your research, please cite:
```bibtex
@article{duthe2025graph,
  title={Graph Transformers for inverse physics: reconstructing flows around arbitrary 2D airfoils},
  author={Duth{\'e}, Gregory and Abdallah, Imad and Chatzi, Eleni},
  journal={arXiv preprint arXiv:2501.17081},
  year={2025}
}
```
This project is licensed under the MIT License; see the LICENSE.md file for details.
Contributions are welcome! Please feel free to submit a Pull Request.
