- Go into the directory of your project with `cd project_folder_path`
- Create an empty virtual environment with `py -m venv .\my_env_name`
- Enter the virtual environment with `my_env_name\Scripts\activate`
- Check that the environment is empty with `pip freeze`; normally, it should print nothing
- Install the required packages from the `.txt` file `requirements.txt` with `pip install -r requirements.txt`
- Run `pip freeze` again and check that the environment is no longer empty
- Add the environment folder to your `.gitignore` (in order to avoid pushing the packages to git!)

To exit the virtual environment, use `deactivate`.
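The creation step can also be scripted with Python's built-in `venv` module; a minimal sketch (here the environment goes into a temporary folder just for the demonstration, instead of your project directory):

```python
import tempfile
import venv
from pathlib import Path

# Create an empty virtual environment with pip available: the
# scripted equivalent of `py -m venv .\my_env_name`.
env_dir = Path(tempfile.mkdtemp()) / "my_env_name"
venv.create(env_dir, with_pip=True)

# `pyvenv.cfg` marks the folder as a virtual environment.
print((env_dir / "pyvenv.cfg").exists())  # True
```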
- Go into the directory of your project with `cd project_folder_path`
- Create an empty virtual environment with `virtualenv ./my_env_name`
- Enter the virtual environment with `source my_env_name/bin/activate`
- Check that the environment is empty with `pip freeze`; normally, it should print nothing
- Install the required packages from the `.txt` file `requirements.txt` with `pip install -r requirements.txt`
- Run `pip freeze` again and check that the environment is no longer empty
- Add the environment folder to your `.gitignore` (in order to avoid pushing the packages to git!)

To exit the virtual environment, use `deactivate`.
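The `pip freeze` check above can also be run from Python, which is handy in setup scripts; a minimal sketch using `subprocess` (not part of this repository):

```python
import subprocess
import sys

# Run `pip freeze` with the interpreter of the active environment,
# so the check targets the environment you just activated.
result = subprocess.run(
    [sys.executable, "-m", "pip", "freeze"],
    capture_output=True,
    text=True,
)

# In a freshly created environment this list should be empty.
installed = [line for line in result.stdout.splitlines() if line]
print(f"{len(installed)} packages installed")
```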
- 📁 `config` contains `.json` files which encode the options and parameter choices for the proposed test cases.
- 📁 `data` contains the dataset for each test case. In each subfolder, you can find `.npy` files storing inputs, solution and parametric field values. They are stored separately, according to the category of data they belong to (such as fitting data, collocation data, ...). In particular, we have:
  - Boundary data: `dom_bnd.npy`, `sol_bnd.npy` for the coordinates and solution values at the boundaries;
  - Collocation data: `dom_pde.npy` for the coordinates of the collocation points;
  - Fitting data: `dom_sol.npy`, `sol_train.npy` and `dom_par.npy`, `par_train.npy` for coordinate/value couples of the solution and/or the parametric field;
  - Test data: `dom_test.npy`, `sol_test.npy`, `par_test.npy` for the coordinates and function values of the test points.
- 📁 `outs` contains the results for each test case. In each subfolder, you can find the folders:
  - `log`, with the loss history and a summary of the experiment options and errors in `.txt` files;
  - `plot`, with the plots;
  - `thetas`, with the network parameters;
  - `values`, with the solution computed by the network.
- 📁 `src` contains the source code, described in detail in the section below.
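As an illustration of the `data` layout described above, the following sketch writes and reads back `.npy` files following the naming convention of a test case. The folder name `laplace1D_cos` and the array shapes are made up for the example; the real files depend on the problem at hand:

```python
import tempfile
from pathlib import Path

import numpy as np

# Hypothetical test-case subfolder of `data`, following the naming
# convention described above.
case_dir = Path(tempfile.mkdtemp()) / "laplace1D_cos"
case_dir.mkdir(parents=True)

# Collocation data: coordinates of the collocation points.
np.save(case_dir / "dom_pde.npy", np.linspace(0.0, 1.0, 100)[:, None])
# Boundary data: coordinates and solution values at the boundaries.
np.save(case_dir / "dom_bnd.npy", np.array([[0.0], [1.0]]))
np.save(case_dir / "sol_bnd.npy", np.array([[1.0], [np.cos(1.0)]]))

# Reading a test case back is just a matter of loading each category.
dom_pde = np.load(case_dir / "dom_pde.npy")
print(dom_pde.shape)  # (100, 1)
```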
- `main.py` is the executable script, relying on all the modules below.
- `main_data.py` is a script that can be run independently from the main; it generates a new data subfolder for each new test case.
- `main_loader.py` is a script that can be run independently from the main; it reloads a test case already present in `outs`.
- 📁 `setup` is a module containing:
  - the class to parse command line arguments (in `args.py`);
  - the class to set parameters (in `param.py`), reading them both from the configuration files and from the command line;
  - `data_creation.py`, which contains the class for dataset creation starting from raw data stored in the folder `data`;
  - `data_generator.py`, which contains the class to generate raw data;
  - `data_loader.py`, which defines the data loader class (currently not in use).
- 📁 `utility` contains technical auxiliary tasks, in particular:
  - `switcher.py`, which switches to the test case under analysis among the files contained in the three folders below;
  - `directories.py`, which generates directories for data generation and results storage;
  - `miscellaneous.py`, which contains utility functions for all other purposes.
- 📁 `algorithms` is a module containing classes representing the training algorithms proposed in this project:
  - `ADAM`: Adaptive Moment Estimation
  - `HMC`: Hamiltonian Monte Carlo
  - `SVGD`: Stein Variational Gradient Descent
  - `VI`: Variational Inference

  The folder also includes the class `Trainer`, used to manage the pre-training and training algorithms.
- 📁 `datasets` contains:
  - 📁 `config`, with the definitions of functions and domains for generating datasets;
  - 📁 `template`, with the names and definitions of inputs and outputs for generating datasets.
- 📁 `equations` contains the differential operators library (`Operators.py`) and, in separate files, the definition of the physical loss and of the dataset pre/post-processing criteria for each problem studied.
- 📁 `networks` contains the classes for each part of the Bayesian Neural Network. The network built is an instance of the class `BayesNN`, which inherits methods and attributes from `LossNN` and `PredNN`, providing the loss computation and the prediction/post-processing functionalities, respectively. In turn, these classes inherit from `CoreNN`, which represents a basic fully connected network. Network weights and biases are instances of the class `Theta`, which overloads some operators for an easier management of lists of tensors.
- 📁 `postprocessing` is a module with:
  - the class `Plotter`, to generate the plots and save them in the folder `outs`;
  - the class `Storage`, to store and load the results, the uncertainty quantification study, the loss history and the network parameters.
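The operator overloading mentioned for `Theta` can be illustrated with a minimal, hypothetical sketch. `ThetaSketch` is not the actual class from this repository; it only shows the idea of overloading arithmetic on lists of tensors so that an update rule can be written as one expression:

```python
import numpy as np

class ThetaSketch:
    """Minimal stand-in for a Theta-like container of per-layer tensors."""

    def __init__(self, tensors):
        self.tensors = list(tensors)

    def __add__(self, other):
        # Element-wise sum, layer by layer.
        return ThetaSketch(a + b for a, b in zip(self.tensors, other.tensors))

    def __mul__(self, scalar):
        # Scalar rescaling of every layer, e.g. for a gradient step.
        return ThetaSketch(scalar * t for t in self.tensors)

# A gradient step theta - lr * grad becomes one readable expression
# instead of a loop over the list of weight and bias tensors.
theta = ThetaSketch([np.ones((2, 2)), np.zeros(2)])
grad = ThetaSketch([np.full((2, 2), 0.5), np.ones(2)])
step = theta + grad * (-0.1)
print(step.tensors[0])
```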
- *B-PINNs: Bayesian Physics-Informed Neural Networks for Forward and Inverse PDE Problems with Noisy Data*, Liu Yang, Xuhui Meng, George Em Karniadakis, Mar 2020.
- *Bayesian Physics Informed Neural Networks for real-world nonlinear dynamical systems*, Kevin Linka, Amelie Schäfer, Xuhui Meng, Zongren Zou, George Em Karniadakis, Ellen Kuhl, May 2022.
- *Bayesian Physics-Informed Neural Networks for Inverse Uncertainty Quantification problems in Cardiac Electrophysiology*, Master's Thesis at Politecnico di Milano, Daniele Ceccarelli.
- Giulia Mescolini (@giuliamesc)
- Luca Sosta (@sostaluca)
- Stefano Pagani (@StefanoPagani)
- Andrea Manzoni