SaiKiran Tedla, Abhijith Punnappurath, Luxi Zhao, Michael S. Brown
Samsung AI Center Toronto & York University
📄 Paper (PDF) 📄 Dataset
If you use our dataset or code, please cite:
```bibtex
@inproceedings{Tedla2025ExaminingDemosaic,
  title={{Examining Joint Demosaicing and Denoising for Single-, Quad-, and Nona-Bayer Patterns}},
  author={Tedla, SaiKiran and Punnappurath, Abhijith and Zhao, Luxi and Brown, Michael S.},
  booktitle={Proceedings of the IEEE International Conference on Computational Photography (ICCP)},
  year={2025}
}
```

This section describes how to train and test our unified demosaicing and denoising model.
```bash
conda env create -f environment.yml
conda activate examine_demosaic
```

Please copy the DNG provided in the dataset link into `PyTorch/utilities`. This DNG is primarily used for visualization and for computing the DeltaE metric.
You'll also need a Weights & Biases (wandb) account for experiment tracking. Update the wandb settings in the YAML files accordingly.
Download the dataset and then update the config files with the appropriate paths.
To train the model, run:
```bash
python PyTorch/runner.py --config configs/unified_train.yaml
```

Modify the config file as follows (a config sketch is shown after the list):
- Choose the desired ISO levels
- Set the correct paths to training patches and full-resolution images
- Update your wandb project and entity details
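For reference, the relevant portion of `configs/unified_train.yaml` might look like the sketch below. The key names (`wandb`, `iso_levels`, `train_patches_dir`, etc.) are illustrative assumptions, not the repository's actual schema; mirror whatever fields the shipped config files use.

```yaml
# Hypothetical sketch of the training-related fields in configs/unified_train.yaml.
# Key names are illustrative only -- match them to the keys actually used in the repo.
wandb:
  project: examine_demosaic        # your wandb project name
  entity: your-wandb-username      # your wandb entity/account

data:
  iso_levels: [100, 1600, 3200]    # ISO levels to train on
  train_patches_dir: /path/to/dataset/train_patches
  full_resolution_dir: /path/to/dataset/full_resolution
```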
To test the model, run:
```bash
python PyTorch/runner.py --config configs/unified_test.yaml
```

Again, be sure to do the following (a config sketch is shown after the list):
- Set the appropriate ISO levels
- Provide the correct paths to the test dataset
- Provide the path to the appropriate checkpoint file (see models)
- Set `plot_images: True` in the config file if you want `.npy` and `.png` outputs.
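Analogously, the test config might contain fields along these lines. Apart from `plot_images`, which is named above, the key names here are illustrative guesses, so match them to what `configs/unified_test.yaml` actually defines.

```yaml
# Hypothetical sketch of the test-related fields in configs/unified_test.yaml.
# Only plot_images is named in the instructions above; the other keys are illustrative.
data:
  iso_levels: [3200]               # ISO level(s) to evaluate
  test_dir: /path/to/dataset/test

checkpoint: /path/to/checkpoints/unified_model.ckpt   # see the models section for checkpoints
plot_images: True                  # write .npy and .png outputs during testing
```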