
Enhancing Brain Age Estimation with a Transformer Add-on Incorporating a Longitudinal Consistency Prior

Getting Started

1. Data Preparation

  • Prepare your CSV files with paths to the input feature maps (extracted with pretrained encoders) and the associated metadata (see the example in data/dataset.csv).
  • Update paths and parameters in config.py and config_inference.py as needed.
  • Update the paths in bias_correction/coles_bias_correction.py.
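As a rough illustration of the data-preparation step, the snippet below loads such a CSV and checks it for the expected columns. The column names used here (subject_id, feature_map_path, age) are assumptions for illustration only; the actual schema is defined by data/dataset.csv in this repository.

```python
# Minimal sketch of loading the dataset CSV. The column names below are
# hypothetical; consult data/dataset.csv for the repo's actual schema.
import pandas as pd

def load_dataset(csv_path):
    """Read the dataset CSV and run basic sanity checks on its columns."""
    df = pd.read_csv(csv_path)
    # Illustrative assumed columns, not necessarily the repo's actual names.
    required = {"subject_id", "feature_map_path", "age"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"CSV is missing columns: {sorted(missing)}")
    return df
```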

2. Training

Run the main training script:

python train.py
  • Supports both cross-sectional and longitudinal training (set in config).
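The cross-sectional vs. longitudinal switch lives in config.py. The sketch below shows the kind of option such a config might expose; the option name and values are assumptions, not the repo's actual settings.

```python
# Hypothetical sketch of the training-mode switch config.py might expose;
# the real option names in this repository may differ.
TRAINING_MODE = "longitudinal"  # or "cross_sectional"

def make_sampler(mode):
    """Choose a batch-sampling strategy based on the training mode."""
    if mode == "longitudinal":
        # Pair scans of the same subject so a consistency prior can be applied.
        return "paired_subject_sampler"
    elif mode == "cross_sectional":
        # Treat every scan independently.
        return "random_sampler"
    raise ValueError(f"Unknown training mode: {mode}")
```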

3. Inference

Run inference on validation, test, or external datasets:

python inference/inference_main.py
  • Results and metrics are saved to the experiment folder.
  • Plots of predicted vs. real ages are generated.
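The metrics saved at inference time can be sketched as follows; this shows metrics typically reported for brain age models (MAE and Pearson correlation between real and predicted ages), and the exact set computed by inference/inference_main.py may differ.

```python
# Sketch of typical brain-age evaluation metrics; the exact metrics saved
# by inference_main.py may differ.
import numpy as np

def brain_age_metrics(y_true, y_pred):
    """Return MAE and Pearson correlation between real and predicted ages."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mae = np.mean(np.abs(y_pred - y_true))
    r = np.corrcoef(y_true, y_pred)[0, 1]
    return {"mae": float(mae), "pearson_r": float(r)}
```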

4. Bias Correction

Apply post-hoc bias correction to predicted ages:

python bias_correction/coles_bias_correction.py
  • Uses Huber regression to correct for age bias (Cole et al.).
  • Saves corrected predictions and updated metrics.
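A common post-hoc scheme of this kind fits a robust regression of predicted age on chronological age and then inverts the fitted trend. The sketch below uses scikit-learn's HuberRegressor to illustrate the idea; the repo's exact procedure lives in bias_correction/coles_bias_correction.py and may differ in detail.

```python
# Sketch of linear brain-age bias correction: fit predicted = a*true + b with
# a robust (Huber) regression, then invert the fit. Illustrative only; the
# repo's actual procedure is in bias_correction/coles_bias_correction.py.
import numpy as np
from sklearn.linear_model import HuberRegressor

def correct_age_bias(age_true, age_pred):
    """Remove the linear age-dependent bias from predicted ages."""
    age_true = np.asarray(age_true, dtype=float)
    age_pred = np.asarray(age_pred, dtype=float)
    # Fit predicted = alpha * true + beta, robust to outlier predictions.
    huber = HuberRegressor().fit(age_true.reshape(-1, 1), age_pred)
    alpha, beta = huber.coef_[0], huber.intercept_
    # Invert the fit so corrected predictions are unbiased w.r.t. age.
    return (age_pred - beta) / alpha
```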

Key Modules

  • model_module.py: Defines model architectures and training logic.
  • data_module.py: Handles data loading, batching, and custom sampling for longitudinal pairs.
  • utils.py: Preprocessing, normalization, augmentation, and helper functions.
  • train.py: Orchestrates training, logging, and checkpointing.
  • inference/inference_main.py: Loads trained models and runs inference.
  • bias_correction/coles_bias_correction.py: Applies bias correction to predictions.
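The longitudinal consistency prior named in the title presumably encourages the predicted age difference between two scans of the same subject to match the known inter-scan interval. The snippet below is an assumption-labeled illustration of such a term in plain NumPy, not the loss actually implemented in model_module.py.

```python
# Illustrative sketch (an assumption, not this repo's actual loss) of a
# longitudinal consistency term: for two scans of the same subject, the
# predicted age difference should match the known inter-scan interval.
import numpy as np

def longitudinal_consistency_loss(pred_age_t1, pred_age_t2, interval_years):
    """Squared error between predicted and true within-subject age gaps."""
    pred_gap = np.asarray(pred_age_t2, dtype=float) - np.asarray(pred_age_t1, dtype=float)
    return float(np.mean((pred_gap - np.asarray(interval_years, dtype=float)) ** 2))
```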

Citation

If you use this codebase in your research, please cite the article 'Enhancing Brain Age Estimation with a Transformer Add-on Incorporating a Longitudinal Consistency Prior'.

License

This project is released under the MIT License.


Contact

For questions or contributions, please open an issue or contact me at clara.lisazo@udg.edu.
