🌱 SPROUT: Surrogate Predictions for Reiterative Optimization with Update Training

Neural surrogates for designing gravitational wave detectors

Carlos Ruiz Gonzalez, Sören Arlt, Sebastian Lehner, Arturs Berzins, Yehonathan Drori, Rana X Adhikari, Johannes Brandstetter, Mario Krenn

The code in this repository implements the SPROUT algorithm and, together with the models and datasets from Zenodo, reproduces the results shown in our paper.

Overview

SPROUT uses an active learning strategy to train surrogate models for the inverse design of gravitational wave detectors (GWD). It follows these steps:

  1. Produce valid GWD by choosing random design parameters for UIFO, an overparametrized ansatz with a fixed topology.
  2. Train surrogate models to predict the GWD properties (sensitivity and optical powers) from the design parameters.
  3. Inverse-design high-sensitivity GWD that meet the hardware constraints: optical powers below certain thresholds.
  4. Use the CPU-based simulator Finesse to verify some of the inverse-design samples (a few per optimization process).
  5. Go back to step 2: use the new samples, together with the previous dataset, to retrain the models.

In the paper, we started with 1M random samples. In each round, we then optimized 200K randomly initialized samples and verified them at 5 steps of each optimization procedure, so 1M new designs were added to the dataset every round (plus 10% for testing).
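The active-learning loop above can be sketched in a few lines. This is a toy illustration, not the repository's actual code: the function names (`sample_random_designs`, `train_surrogate`, `simulate`, `sprout`) are hypothetical, and the "surrogate" and "simulator" are stand-ins for the neural models and Finesse.

```python
import random

def sample_random_designs(n, dim=8):
    # Step 1: random design parameters for the UIFO ansatz (dim is arbitrary here).
    return [[random.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n)]

def simulate(design):
    # Stand-in for the Finesse simulation: returns a toy "sensitivity" score.
    return -sum(x * x for x in design)

def train_surrogate(designs, labels):
    # Step 2 placeholder: the real code trains neural surrogates; here the
    # "surrogate" simply calls the toy simulator to keep the sketch runnable.
    return simulate

def sprout(rounds=3, n_new=50):
    designs = sample_random_designs(n_new)
    labels = [simulate(d) for d in designs]
    for _ in range(rounds):
        surrogate = train_surrogate(designs, labels)   # step 2: (re)train
        candidates = sample_random_designs(n_new)      # step 3: inverse design
        best = max(candidates, key=surrogate)          #   (toy: best of random)
        designs.append(best)                           # step 4: verify with the
        labels.append(simulate(best))                  #   simulator, grow dataset
    return max(labels)                                 # step 5: loop continues

print(sprout())
```

In the real pipeline, step 3 is a gradient-based optimization through the surrogate rather than a best-of-random pick, and only a few designs per optimization run are sent to the expensive simulator.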


Repository structure

The steps of the SPROUT algorithm are divided into four Python scripts:

  • 1_create.py: generates randomly parametrized GWD for training, using the grid scripts in tools. Formerly called GWD1create.py.
  • 2_train.py: (re)trains the surrogate models with the available datasets. A merge of GWD2train.py and GWD2tune.py.
  • 3_design.py: optimizes randomly parametrized GWD based on the outputs of the surrogate models. Formerly called GWD3design.py.
  • 4_verify.py: verifies some of the inverse designs from the surrogate models using the CPU-based software Finesse. Formerly called GWD4verify.py.

tools

Includes the modules used by the main scripts.

  • blocks.py: building blocks to design a grid-like GWD (a UIFO template) using the software Finesse. Formerly called GWDblocks.py.
  • datasets.py: normalizes and formats the (input) parameters of the GWD and their (output) properties for model training. Simplified from GWDnets.py.
  • networks.py: building blocks for the surrogate models. It also groups the parameters of the GWD into local overlapping patches. Simplified from GWDmixer.py.
  • targets.py: translates the normalized outputs produced by the models into meaningful sensitivity values and optical powers. Using these values, we can compute the quality of the GWD and check whether they meet the hardware constraints. Formerly called GWDfuns.py.
  • gridNxN.py (for N$\in$[1,2,3]): similar scripts that build grid-like GWD of three different sizes, with an increasing number of elements and parameters. They are renamed versions of GWD0unit.py, GWD0plaza.py, and GWD0stadium.py.
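The grouping of design parameters into local overlapping patches (as done in networks.py) can be illustrated with a sliding window whose stride is smaller than the window size. This is a hypothetical sketch, not the repository's implementation; the function name and sizes are made up.

```python
def overlapping_patches(params, patch_size, stride):
    # Slide a window of `patch_size` over the flat parameter list with a
    # stride smaller than the window, so neighbouring patches overlap.
    return [params[i:i + patch_size]
            for i in range(0, len(params) - patch_size + 1, stride)]

params = list(range(10))  # a flat vector of 10 design parameters
patches = overlapping_patches(params, patch_size=4, stride=2)
print(patches)
# [[0, 1, 2, 3], [2, 3, 4, 5], [4, 5, 6, 7], [6, 7, 8, 9]]
```

Each patch sees its neighbours' parameters, which lets the surrogate exploit the local structure of the grid-like GWD layout.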

datasets

Stores the GWD designs. Download them from Zenodo.

models

Stores the surrogate models. Download them from Zenodo.

plots

A place to store plots and images.

originals

For transparency, we store the original scripts that we used to train the models and generate inverse designs. They include legacy code that is no longer needed, as well as additional tools to exploit the hardware we had access to.

System requirements & installation

Software

Installing all requirements should take less than five minutes.

  1. Clone the repository:

    git clone https://github.com/artificial-scientist-lab/SPROUT.git
    cd SPROUT
  2. Install the packages specified in requirements.txt:

    pip install -r requirements.txt
  3. Download the models and datasets from Zenodo.

Hardware

  • For data generation, we ran multiple processes in parallel on CPUs.
  • For training, we used data parallelism on four A100-40GB GPUs; a single GPU also works if gradient accumulation is used to reach the desired batch size.
  • For sampling, we used single consumer-grade GPUs. CPUs will be slower, but the models are small enough to sample at reasonable speed.
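Gradient accumulation emulates a large batch on a single GPU by summing the gradients of several micro-batches before one parameter update. The sketch below shows the underlying arithmetic without any ML framework (the repository trains with neural networks; this toy linear model and its names are illustrative only): accumulating k micro-batch gradients scaled by 1/k gives exactly the full-batch update.

```python
def grad(w, batch):
    # d/dw of the mean squared error 0.5 * (w*x - y)^2 over the batch
    return sum((w * x - y) * x for x, y in batch) / len(batch)

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]
lr, accum_steps = 0.05, 2
micro_batches = [data[:2], data[2:]]  # two micro-batches of size 2

# Accumulated update: sum scaled micro-batch gradients, then step once.
w = 0.0
g = 0.0
for batch in micro_batches:
    g += grad(w, batch) / accum_steps  # gradients add up across micro-batches
w -= lr * g                            # one update, effective batch size = 4

# Reference update computed on the full batch at once.
w_full = 0.0
w_full -= lr * grad(w_full, data)

print(w, w_full)  # identical results: 0.75 0.75
```

In a framework like PyTorch the same pattern is `loss = loss / accum_steps; loss.backward()` on each micro-batch, with `optimizer.step()` and `optimizer.zero_grad()` only every `accum_steps` iterations.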

Citation

If you use this repository in your work, please cite:

@article{ruiz2025neural,
  title={Neural surrogates for designing gravitational wave detectors},
  author={Ruiz-Gonzalez, Carlos and Arlt, S{\"o}ren and Lehner, Sebastian and Berzins, Arturs and Drori, Yehonathan and Adhikari, Rana X and Brandstetter, Johannes and Krenn, Mario},
  journal={arXiv preprint arXiv:2511.19364},
  doi={https://doi.org/10.48550/arXiv.2511.19364},
  year={2025}
}
