Carlos Ruiz Gonzalez, Sören Arlt, Sebastian Lehner, Arturs Berzins, Yehonathan Drori, Rana X Adhikari, Johannes Brandstetter, Mario Krenn
The code in this repository implements the SPROUT algorithm and, together with the models and datasets from Zenodo, reproduces the results shown in our paper.
SPROUT uses an active learning strategy to train surrogate models for the inverse design of gravitational wave detectors (GWDs). It follows these steps:
1. Produce valid GWDs by choosing random design parameters for UIFO, an overparametrized ansatz with a fixed topology.
2. Train surrogate models to predict the GWD properties (sensitivity and optical powers) from the design parameters.
3. Inverse design high-sensitivity GWDs that meet the hardware constraints: optical powers below certain thresholds.
4. Use the CPU-based simulator Finesse to verify some of the inverse-designed samples (a few per optimization process).
5. Return to step 2 and retrain the models on the new samples together with the previous dataset.
In the paper, we started with 1M random samples. In each round, we then optimized 200K randomly initialized samples and verified the designs at 5 steps of each optimization procedure, so 1M designs were added to the dataset per round (plus 10% for testing).
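The loop above can be sketched in a few lines of Python. This is a minimal toy version, not the repository's actual code: the quadratic "sensitivity", the linear "optical power", the least-squares surrogate, and all thresholds are stand-ins for the real Finesse simulation and neural models, and the gradient-based optimization of step 3 is replaced by simple filtering of random candidates.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(params):
    """Toy stand-in for the Finesse simulation (hypothetical targets)."""
    sensitivity = -np.sum((params - 0.3) ** 2, axis=1)  # higher is better
    power = np.sum(params, axis=1) + 2.0                # constrained quantity
    return np.column_stack([sensitivity, power])

def fit_surrogate(x, y):
    """Toy surrogate: linear least squares in place of the neural models."""
    A = np.hstack([x, np.ones((len(x), 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda q: np.hstack([q, np.ones((len(q), 1))]) @ w

# Step 1: random valid designs
dataset_x = rng.uniform(-1, 1, size=(200, 4))
dataset_y = simulate(dataset_x)

for _ in range(3):
    # Step 2: (re)train the surrogate on the accumulated dataset
    surrogate = fit_surrogate(dataset_x, dataset_y)

    # Step 3: inverse design -- keep random candidates that the surrogate
    # predicts to satisfy the power constraint, ranked by predicted sensitivity
    candidates = rng.uniform(-1, 1, size=(500, 4))
    preds = surrogate(candidates)
    feasible = candidates[preds[:, 1] < 2.0]
    best = feasible[np.argsort(-surrogate(feasible)[:, 0])[:20]]

    # Steps 4 and 5: verify a few designs with the (toy) simulator,
    # grow the dataset, and retrain in the next round
    dataset_x = np.vstack([dataset_x, best])
    dataset_y = np.vstack([dataset_y, simulate(best)])
```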
The steps of the SPROUT algorithm are divided into four Python scripts:
- `1_create.py`: generates randomly parametrized GWDs for training using the `gridNxN.py` grid scripts. Formerly called `GWD1create.py`.
- `2_train.py`: (re)trains the surrogate models with the available datasets. Merge of `GWD2train.py` and `GWD2tune.py`.
- `3_design.py`: optimizes randomly parametrized GWDs based on the outputs of the surrogate models. Formerly called `GWD3design.py`.
- `4_verify.py`: verifies some of the inverse designs from the surrogate models using the CPU-based software Finesse. Formerly called `GWD4verify.py`.
Includes the modules used by the main scripts.
- `blocks.py`: building blocks to design a grid-like GWD (a UIFO template) using the software Finesse. Formerly called `GWDblocks.py`.
- `datasets.py`: normalizes and formats the (input) parameters of GWDs and their (output) properties for model training. Simplified from `GWDnets.py`.
- `networks.py`: building blocks for the surrogate models. It also groups the parameters of the GWD into local overlapping patches. Simplified from `GWDmixer.py`.
- `targets.py`: translates the normalized outputs of the models into meaningful sensitivity values and optical powers. Using these values we can compute the quality of the GWDs and check whether they meet the hardware constraints. Formerly called `GWDfuns.py`.
- `gridNxN.py` (for N ∈ [1, 2, 3]): similar scripts to build grid-like GWDs of three different sizes, with increasing numbers of elements and parameters. They are renamed versions of `GWD0unit.py`, `GWD0plaza.py`, and `GWD0stadium.py`.
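As an illustration of the patch grouping mentioned for `networks.py`, here is a hedged sketch: a toy function that slices an N×N grid of scalar parameters into overlapping 2×2 neighbourhoods. The actual grouping in the repository may differ in patch size, overlap, and parameter layout.

```python
import numpy as np

def overlapping_patches(grid, patch=2):
    """Group an N x N grid of parameters into overlapping patch x patch
    neighbourhoods (a sketch of the idea behind `networks.py`)."""
    n = grid.shape[0]
    patches = [
        grid[i:i + patch, j:j + patch].ravel()  # flatten each neighbourhood
        for i in range(n - patch + 1)
        for j in range(n - patch + 1)
    ]
    return np.stack(patches)

# A 3x3 grid of scalar parameters yields four overlapping 2x2 patches.
grid = np.arange(9.0).reshape(3, 3)
patches = overlapping_patches(grid)
```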
Stores the GWD designs. Download them from Zenodo.
Stores the surrogate models. Download them from Zenodo.
A place to store plots and images.
For transparency, we store the original scripts that we used to train the models and generate inverse designs. They include legacy code that is no longer needed, as well as additional tools to exploit the hardware we had access to.
The time to install all requirements should be less than five minutes.
- Clone the repository:

  ```
  git clone https://github.com/artificial-scientist-lab/SPROUT.git
  cd SPROUT
  ```

- Install the packages specified by `requirements.txt`:

  ```
  pip install -r requirements.txt
  ```

- Download the models and datasets from Zenodo.
- For data generation, multiple processes running in parallel on CPUs were used.
- For training, we used data parallelism on four A100-40GB GPUs, but single GPUs can also be used if gradient accumulation is applied to reach the desired batch size.
- For sampling, we used single consumer-grade GPUs. CPUs will be slower, but the models are small enough to sample at reasonable speed.
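The gradient accumulation mentioned above can be illustrated with a toy NumPy example (not the repository's training code): gradients from several micro-batches are averaged before a single parameter update, mimicking a larger effective batch size on one GPU. The linear model, learning rate, and batch sizes are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noiseless linear regression problem.
x = rng.normal(size=(64, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = x @ true_w

w = np.zeros(3)
accum_steps = 4                       # e.g. 4 micro-batches of 16 ~ one batch of 64
micro = np.split(np.arange(64), accum_steps)

for _ in range(300):
    grad = np.zeros_like(w)
    for idx in micro:                 # accumulate per-micro-batch gradients
        err = x[idx] @ w - y[idx]
        grad += x[idx].T @ err / len(idx)
    w -= 0.2 * (grad / accum_steps)   # one update with the averaged gradient
```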
If you use this repository in your work, please cite:
```
@article{ruiz2025neural,
  title={Neural surrogates for designing gravitational wave detectors},
  author={Ruiz-Gonzalez, Carlos and Arlt, S{\"o}ren and Lehner, Sebastian and Berzins, Arturs and Drori, Yehonathan and Adhikari, Rana X and Brandstetter, Johannes and Krenn, Mario},
  journal={arXiv preprint arXiv:2511.19364},
  doi={https://doi.org/10.48550/arXiv.2511.19364},
  year={2025}
}
```
