An unofficial implementation of the ICML 2021 paper *Deep Adaptive Design: Amortizing Sequential Bayesian Experimental Design* by Foster et al.[^1]
- `config/`: hydra configuration files
- `data/`: simulate experiments
- `loss/`: loss functions
- `model/`: design policy networks
- `utils/`: helper functions
- Python ≥ 3.10
- PyTorch ≥ 2.0
- Additional dependencies are listed in `requirements.txt`
Create a new virtual environment:
```bash
# create a new conda environment
conda create -n dad python=3.12 -y

# activate the environment
conda activate dad
```

Install PyTorch (adjust based on your OS and CUDA version; see https://pytorch.org/get-started/locally/):
```bash
pip install torch==2.6.0 --index-url https://download.pytorch.org/whl/cu126
```

Install the remaining dependencies:
```bash
pip install -r requirements.txt
```

To reproduce the location-finding experiment with default settings:
```bash
python location_finding.py
```

To run with custom hyperparameters:
```bash
python location_finding.py data.K=1 data.theta_dist=uniform
```

To enable logging and monitor training in real time, add the following flag:

```bash
wandb.use_wandb=True
```
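For readers new to Hydra: overrides like `data.K=1` are dotted paths into the nested configuration tree under `config/`. As a rough illustration of that semantics only (not the repository's actual code; the config keys and defaults below are hypothetical), a dotlist override can be thought of as updating a nested dict:

```python
def apply_overrides(config, overrides):
    """Apply Hydra-style dotlist overrides (e.g. "data.K=1") to a nested dict."""
    for item in overrides:
        key, raw = item.split("=", 1)
        *path, leaf = key.split(".")
        # walk down the nested dict, creating levels as needed
        node = config
        for part in path:
            node = node.setdefault(part, {})
        # best-effort type coercion: int, then float, else keep the string
        value = raw
        for cast in (int, float):
            try:
                value = cast(raw)
                break
            except ValueError:
                pass
        node[leaf] = value
    return config

# hypothetical defaults standing in for the Hydra config tree
cfg = {"data": {"K": 2, "theta_dist": "normal"}}
apply_overrides(cfg, ["data.K=1", "data.theta_dist=uniform"])
# cfg["data"] is now {"K": 1, "theta_dist": "uniform"}
```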
Make sure you are logged into your Weights & Biases account.
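For intuition about what `location_finding.py` simulates: in the paper's location-finding experiment, K hidden sources emit a signal whose intensity decays with distance, and each measurement observes the noisy log of the total intensity at a chosen design point. A minimal stdlib sketch of that generative process (the constants `base`, `max_signal`, `alpha`, and `noise_sd` are illustrative, not this repository's defaults):

```python
import math
import random

def simulate_observation(theta, xi, base=0.1, max_signal=1e-4,
                         alpha=1.0, noise_sd=0.5, rng=random):
    """Noisy log-total-intensity observed at design point xi.

    theta: list of K hidden source locations, each an (x, y) pair.
    xi:    the (x, y) design point where we measure.
    """
    # inverse-square-style decay of each source's signal with distance
    total = base + sum(
        alpha / (max_signal + (sx - xi[0]) ** 2 + (sy - xi[1]) ** 2)
        for sx, sy in theta
    )
    # Gaussian noise on the log intensity
    return math.log(total) + rng.gauss(0.0, noise_sd)

# example: K = 2 sources drawn from a standard normal prior
rng = random.Random(0)
theta = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(2)]
y = simulate_observation(theta, xi=(0.0, 0.0), rng=rng)
```

Sequential design then amounts to choosing the next `xi` given all previous `(xi, y)` pairs, which is what the amortized policy network learns to do.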
This repository aims to reproduce the main experimental results from the paper with simplified dependencies and a modular structure. If you find it useful, please feel free to build on it. Contributions are warmly welcome and appreciated :)
For the official implementation, please refer to dad.
[^1]: Foster, Adam, Desi R. Ivanova, Ilyas Malik, and Tom Rainforth. ‘Deep Adaptive Design: Amortizing Sequential Bayesian Experimental Design’. In Proceedings of the 38th International Conference on Machine Learning, 3384–95. PMLR, 2021. https://proceedings.mlr.press/v139/foster21a.