Pareto-Conditioned Diffusion Models for
Offline Multi-Objective Optimization

ICLR 2026 (Oral)

[Teaser figure]

This is the official PyTorch implementation of Pareto-Conditioned Diffusion Models for Offline Multi-Objective Optimization, presented at ICLR 2026.

Pareto-Conditioned Diffusion (PCD) reframes offline multi-objective optimization as a conditional sampling problem.

  • Training: Employs a novel reweighting strategy to emphasize high-quality solutions near the Pareto front.
  • Sampling: Directly generates novel designs conditioned on target objectives, sidestepping the need for explicit surrogate models.
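The paper describes PCD's exact reweighting strategy; as a rough, illustrative sketch of the underlying idea (up-weighting dataset points by their proximity to the empirical Pareto front), one could rank points by non-dominated sorting and decay the weight with rank. The function names and the exponential weighting below are our own illustration, not the implementation in this repository:

```python
import numpy as np

def nondominated_rank(Y):
    """Rank points by repeated non-dominated sorting (minimization).

    Rank 0 is the Pareto front of the dataset, rank 1 the front of the
    remainder, and so on.
    """
    n = len(Y)
    ranks = np.full(n, -1)
    remaining = list(range(n))
    r = 0
    while remaining:
        front = []
        for i in remaining:
            # i is dominated if some other point is no worse in every
            # objective and strictly better in at least one
            dominated = any(
                np.all(Y[j] <= Y[i]) and np.any(Y[j] < Y[i])
                for j in remaining if j != i
            )
            if not dominated:
                front.append(i)
        for i in front:
            ranks[i] = r
        remaining = [i for i in remaining if i not in set(front)]
        r += 1
    return ranks

def pareto_weights(Y, temperature=1.0):
    """Normalized sampling weights that emphasize near-Pareto points."""
    ranks = nondominated_rank(np.asarray(Y, dtype=float))
    w = np.exp(-ranks / temperature)
    return w / w.sum()
```

Such weights could then be used, e.g., to resample the offline dataset or to scale per-example diffusion losses during training.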

[Main results table from the paper]

PCD achieves highly competitive performance and demonstrates greater consistency across diverse tasks than existing approaches, using a single set of hyperparameters.

Installation

This codebase builds on top of Offline-moo, which therefore needs to be installed first. To make this process easier, the exact version of offline-moo used in our case is included in offline_moo/data.

Begin by installing the requirements for offline-moo:

cd offline_moo
conda env create -f environment.yml
conda activate off-moo
conda install gxx_linux-64 gcc_linux-64
conda install --channel=conda-forge libxcrypt

# Install requirements from pip
pip install -r requirements.txt

pip install torch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2 --index-url https://download.pytorch.org/whl/cu118

pip install scipy==1.10.1
pip install scikit-learn==0.21.3
pip install --upgrade pandas
pip install --upgrade kiwisolver

# Dependencies specific to our codebase
pip install gin-config==0.5.0
pip install einops==0.8.0
pip install torchdiffeq==0.2.5
pip install pygmo==2.19.5
pip install accelerate==1.0.1
pip install wandb==0.19.6

This should cover the basic installation. However, some of the tasks, such as the scientific design and MORL tasks, require additional setup. For these, we refer the reader to the official instructions from Offline-moo.

Caution

Due to the complicated nature of the dependencies required by the different tasks in offline-moo, we found it easier to create separate environments for each subtask that requires additional software. Your mileage may vary!

After installing the required dependencies, download the offline data from Google Drive and place it in offline_moo/data. (Note: the experiments shown in the paper used the data_fix_250508 version of the dataset.)

Reproducing results

Below are a few examples to get you started.

Train & evaluate PCDiffusion on ZDT2:

python train.py --task_name zdt2 --seed 1000 --domain synthetic --sampling-method 'reference-direction' --sampling-guidance-scale 2.5 --reweight-loss --experiment_name "reweight-ref-dir" --save-dir path/to/your_dir

Use data pruning instead of dataset reweighting in MORL:

python train.py --task_name mo_hopper_v2 --seed 1000 --domain morl --sampling-method 'reference-direction' --sampling-guidance-scale 2.5 --data_pruning --experiment_name "pruning-ref-dir" --save-dir path/to/your_dir

Use the simple conditioning mechanism without any data processing (Ideal + N/A from Table 2) in MONAS:

python train.py --task_name c10mop2 --seed 2000 --domain monas --sampling-method 'uniform-ideal' --sampling-guidance-scale 2.5 --experiment_name "uniform-ideal" --save-dir path/to/your_dir

The results in the paper were obtained for all tasks with seeds 1000, 2000, 3000, 4000, and 5000.
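A full sweep over those seeds can be scripted by assembling the `train.py` invocations shown above. This is a minimal sketch, not part of the repository; the flags are taken verbatim from the first example, and the `build_cmd` helper name is our own:

```python
import shlex

SEEDS = [1000, 2000, 3000, 4000, 5000]

def build_cmd(task_name, domain, seed, save_dir="path/to/your_dir"):
    """Assemble one train.py invocation (flags from the ZDT2 example above)."""
    return [
        "python", "train.py",
        "--task_name", task_name,
        "--seed", str(seed),
        "--domain", domain,
        "--sampling-method", "reference-direction",
        "--sampling-guidance-scale", "2.5",
        "--reweight-loss",
        "--experiment_name", "reweight-ref-dir",
        "--save-dir", save_dir,
    ]

if __name__ == "__main__":
    for seed in SEEDS:
        cmd = build_cmd("zdt2", "synthetic", seed)
        # Dry run: print the command; swap in subprocess.run(cmd) to launch.
        print(shlex.join(cmd))
```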

📖 BibTeX

@inproceedings{
	shrestha2026paretoconditioned,
	title={Pareto-Conditioned Diffusion Models for Offline Multi-Objective Optimization},
	author={Jatan Shrestha and Santeri Heiskanen and Kari Hepola and Severi Rissanen and Pekka J{\"a}{\"a}skel{\"a}inen and Joni Pajarinen},
	booktitle={The Fourteenth International Conference on Learning Representations},
	year={2026},
	url={https://openreview.net/forum?id=S2Q00li155}
}
