This branch pertains to the implementation of the accepted version. For the implementation of the extension manuscript that is currently under review, please refer to the arterialnet+ branch.
This is the code implementation of our accepted IEEE BHI-2023 manuscript: ArterialNet: Arterial Blood Pressure Reconstruction.
Link to publication.
Corresponding Author: Sicong Huang.
ArterialNet is a pre-training framework that can be paired with any deep learning sequence-to-sequence model for arterial blood pressure (ABP) reconstruction. Here we demonstrate the effectiveness of ArterialNet by pairing it with two different backbone architectures: U-Net and Transformer. We evaluated ArterialNet on the MIMIC III Waveform Dataset and showed improved performance on both U-Net and Transformer backbones. Please refer to our BHI paper for full details.
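To make the pairing concrete, here is a minimal sketch of how a convolutional feature extractor can be wrapped around an arbitrary sequence-to-sequence backbone. The class name and layer choices below are illustrative assumptions, not the repository's actual API; see `models/arterialnet.py` for the real implementation.

```python
import torch
import torch.nn as nn

class ArterialNetSketch(nn.Module):
    """Illustrative pairing of a feature extractor with a seq2seq backbone.

    Hypothetical sketch: the real classes and layer sizes live in
    models/arterialnet.py and may differ from what is shown here.
    """

    def __init__(self, backbone: nn.Module, in_channels: int = 1, feat_channels: int = 64):
        super().__init__()
        # Feature extractor: lifts the raw sensor waveform into a richer channel space
        self.feature_extractor = nn.Sequential(
            nn.Conv1d(in_channels, feat_channels, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(feat_channels, feat_channels, kernel_size=7, padding=3),
            nn.ReLU(),
        )
        # Backbone: any sequence-to-sequence model (e.g., U-Net or Transformer)
        self.backbone = backbone

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_channels, seq_len) sensor input -> (batch, 1, seq_len) ABP estimate
        return self.backbone(self.feature_extractor(x))

# Usage with a toy 1x1-convolution backbone standing in for U-Net/Transformer
toy_backbone = nn.Conv1d(64, 1, kernel_size=1)
model = ArterialNetSketch(toy_backbone)
abp_hat = model(torch.randn(8, 1, 512))  # -> torch.Size([8, 1, 512])
```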
Our experiments are conducted on a Linux-based machine with the following specifications:
- Linux-based OS
- Python 3.9.15
- conda 4.14.0
- PyTorch 1.11.0
- git 2.25.1
- CUDA 11.4 or 11.6 (for GPU acceleration)
We highly recommend using the conda environment we shared (arterialnet.yml) to avoid potential compatibility issues. To set up conda on your machine, you can follow the official instructions here.
Command Line Input Steps:
- `git clone https://github.com/stmilab/ArterialNet.git` clones the repository to your local machine
- `cd ArterialNet/` changes the directory to the repository
- `conda env create -f arterialnet.yml` creates a new conda environment identical to ours (`arterialnet.yml` contains the packages used for our experiments)
- `conda activate arterialnet` activates the conda environment you just created
ArterialNet has 3 major components:
- Feature Extractor and backbone
- Hybrid Loss Function
- Subject-Invariant Regularization
- `models/arterialnet.py` contains the Feature Extractor and Backbone of our proposed ArterialNet framework, along with the U-Net and Transformer backbone architectures.
  - U-Net implementation is modified from Seq-U-Net in PyTorch
  - Transformer implementation is based on PyTorch Transformer
- Hybrid Loss Function is implemented in a custom `train_epoch()` function in `run_torch_sequnet.py` and `run_torch_transformer.py` (see the sketch after this list)
- Subject-Invariant Regularization is implemented in a custom `rex_preprocess()` in `utils/rex_utils.py`, modified from REx (also sketched below)
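For orientation, below is a minimal sketch of the two training ideas named above: a hybrid loss that combines a waveform term with SBP/DBP terms, and a V-REx-style variance penalty over per-subject risks. The weights, signatures, and exact formulation in `train_epoch()` and `utils/rex_utils.py` may differ; everything here is an assumption for illustration.

```python
import torch
import torch.nn.functional as F

def hybrid_loss_sketch(abp_hat, abp_true, alpha=1.0, beta=0.1):
    """Hypothetical hybrid loss: waveform MSE plus SBP/DBP errors.

    alpha and beta are illustrative weights; the real train_epoch()
    may combine the terms differently.
    """
    waveform_term = F.mse_loss(abp_hat, abp_true)
    # Treat SBP/DBP as the per-window max/min of the ABP waveform
    sbp_term = F.l1_loss(abp_hat.amax(dim=-1), abp_true.amax(dim=-1))
    dbp_term = F.l1_loss(abp_hat.amin(dim=-1), abp_true.amin(dim=-1))
    return alpha * waveform_term + beta * (sbp_term + dbp_term)

def rex_penalty_sketch(per_subject_losses, lam=1.0):
    """V-REx-style penalty: mean risk plus variance of risks across subjects.

    rex_preprocess() in utils/rex_utils.py groups samples by subject;
    this function only shows the standard penalty, not the repository code.
    """
    risks = torch.stack(per_subject_losses)  # one scalar risk per subject
    return risks.mean() + lam * risks.var()
```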
`python run_torch_sequnet_rex.py` runs the ArterialNet + U-Net model on the MIMIC III Waveform Dataset (see below for more details).
- `run_torch_sequnet_rex.py` is the implementation of ArterialNet with U-Net as the backbone for reconstructing ABP for MIMIC patients
- `run_torch_sequnet.py` is the base version without subject-invariant regularization
- `run_torch_transformer_rex.py` is the implementation of ArterialNet with Transformer as the backbone for reconstructing ABP for MIMIC patients
- `run_torch_transformer.py` is the base version without subject-invariant regularization
Hyperparameter Tuning scope for MIMIC is here: mimic_hyperparam.txt
Hyperparameter Tuning scope for BioZ is here: bioz_hyperparam.txt
- You can request and download the MIMIC-III ICU Waveform data from here.
- Select your cohort of patients and download it. For example: `mimic_file_list.txt`
- Specify the argument `--data_path` with your data path, or change the default value in `arg_parser.py` (see the sketch below this list)
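If it helps, a hedged sketch of how a `--data_path` flag is typically declared with `argparse` is shown below; the default value is a placeholder, and the actual `arg_parser.py` likely defines many more options.

```python
import argparse

def get_args():
    # Hypothetical sketch; the real arg_parser.py may differ in names and defaults.
    parser = argparse.ArgumentParser(description="ArterialNet training")
    parser.add_argument(
        "--data_path",
        type=str,
        default="/path/to/mimic/waveforms",  # placeholder default; point this to your download
        help="Directory containing the downloaded MIMIC-III waveform records",
    )
    return parser.parse_args()
```

With the flag in place, a run looks like `python run_torch_sequnet_rex.py --data_path /your/mimic/dir`.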
Please refer to this paper.
`utils/visual_combine.py` has the following evaluation metrics implemented and ready to use (a reference sketch of the waveform metrics follows the list):
- ABP Waveform:
- Root Mean Squared Error (RMSE)
- Mean Absolute Error (MAE)
- Pearson's Correlation Coefficient (R)
- Waveform Reconstruction vs. Reference Plot
- SBP/DBP (all of the above plus):
- Bland-Altman Plots
- Confusion Matrix of hypertension stages
- SBP/DBP Prediction vs. Reference Plots
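For reference, the waveform metrics above follow their standard definitions; a minimal NumPy sketch (not the exact code in `utils/visual_combine.py`) is:

```python
import numpy as np

def waveform_metrics(abp_hat: np.ndarray, abp_true: np.ndarray) -> dict:
    """Standard RMSE, MAE, and Pearson's R for a pair of 1-D waveforms.

    Reference sketch only; utils/visual_combine.py may implement these
    (and the plotting utilities) differently.
    """
    err = abp_hat - abp_true
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mae = float(np.mean(np.abs(err)))
    r = float(np.corrcoef(abp_hat, abp_true)[0, 1])  # Pearson correlation
    return {"RMSE": rmse, "MAE": mae, "R": r}
```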
