Official implementation of the paper "Parametric Pareto Set Learning: Amortizing Multi-Objective Optimization with Parameters", published in IEEE Transactions on Evolutionary Computation.
PPSL addresses the challenge of solving infinitely many multi-objective optimization problems whose optimal solutions must adapt to varying parameters. Unlike traditional methods that generate only finite solution sets, PPSL learns a unified mapping from parameters to the entire Pareto set, enabling real-time inference of optimal solutions for any parameter setting.
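As a rough illustration of the amortized idea (a hedged sketch, not the repository's actual API; the class name and dimensions below are hypothetical), a trained parametric model answers any new parameter setting with a single forward pass from a (parameter, preference) query to a candidate Pareto-optimal solution:

```python
import torch

# Hypothetical sketch of amortized inference: after offline training, each
# (problem parameter, preference) query is answered by one forward pass.
class ParametricParetoModel(torch.nn.Module):
    def __init__(self, param_dim, pref_dim, sol_dim, hidden=128):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(param_dim + pref_dim, hidden),
            torch.nn.ReLU(),
            torch.nn.Linear(hidden, sol_dim),
        )

    def forward(self, problem_param, preference):
        # Condition on the problem parameter and a trade-off preference vector
        return self.net(torch.cat([problem_param, preference], dim=-1))

model = ParametricParetoModel(param_dim=1, pref_dim=2, sol_dim=10)
t = torch.tensor([[0.3]])          # e.g., the time index of a dynamic MOP
pref = torch.tensor([[0.7, 0.3]])  # desired trade-off between two objectives
solution = model(t, pref)          # candidate Pareto-optimal decision vector
```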
- 🚀 Amortized Optimization: Shifts computational burden from online solving to offline training
- 🧠 Hypernetwork Architecture: Generates PS model parameters conditioned on input parameters
- ⚡ Low-Rank Adaptation (LoRA): Achieves computational efficiency and scalability (a minimal illustrative sketch of the hypernetwork + LoRA idea follows the lists below)
- 🎯 Continuous Pareto Set Learning: Captures the entire Pareto set structure across parameter space
- 🔄 Real-time Inference: Millisecond-level solution generation after training
- Dynamic Multiobjective Optimization Problems (DMOPs): Where objectives change over time
- Multiobjective Optimization with Shared Components: Where design variables must share identical settings for manufacturing efficiency
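To make the hypernetwork and LoRA ideas from the feature list concrete, here is a minimal, hypothetical PyTorch sketch (class names, shapes, and the way factors are generated are illustrative assumptions, not the code in `model.py`): a small hypernetwork predicts low-rank factors that adapt a shared base layer of the Pareto set model, conditioned on the problem parameter, so only a small number of values proportional to the rank is generated per parameter.

```python
import torch
import torch.nn as nn

# Hypothetical sketch: a hypernetwork emits LoRA-style low-rank factors that
# adapt a shared Pareto-set (PS) layer, conditioned on the problem parameter.
class LoRAHyperLayer(nn.Module):
    def __init__(self, param_dim, pref_dim, sol_dim, rank=4, hidden=64):
        super().__init__()
        self.pref_dim, self.sol_dim, self.rank = pref_dim, sol_dim, rank
        # Base weight of the PS layer, shared across all problem parameters
        self.base_weight = nn.Parameter(0.1 * torch.randn(sol_dim, pref_dim))
        # Hypernetwork outputs the factors A (rank x pref_dim) and B (sol_dim x rank)
        self.hyper = nn.Sequential(
            nn.Linear(param_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, rank * pref_dim + sol_dim * rank),
        )

    def forward(self, problem_param, preference):
        factors = self.hyper(problem_param)
        A = factors[: self.rank * self.pref_dim].view(self.rank, self.pref_dim)
        B = factors[self.rank * self.pref_dim :].view(self.sol_dim, self.rank)
        # LoRA-style adaptation: only the low-rank update depends on the parameter
        weight = self.base_weight + B @ A
        return preference @ weight.t()  # adapted PS layer: preference -> solution

layer = LoRAHyperLayer(param_dim=1, pref_dim=2, sol_dim=10)
x = layer(torch.tensor([0.3]), torch.tensor([0.7, 0.3]))  # one query, one solution
```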
- Python ≥ 3.8
- PyTorch ≥ 1.12 (GPU supported; CUDA optional)
- Clone the repository:
```bash
git clone (available upon acceptance)
cd ppsl
```
- Install dependencies:
```bash
pip install -r requirements.txt
```
Key dependencies (`requirements.txt`):
```
torch>=1.12.0
numpy>=1.21.0
pymoo>=0.6.0
matplotlib>=3.5.0
scipy>=1.7.0
```

Explore PPSL through our Jupyter notebook:
```bash
jupyter notebook example.ipynb
```

```
ppsl/
├── experiment_dmop.py      # Experiments for Dynamic MOPs
├── experiment_mopsc.py     # Experiments for MOPs with Shared Components
├── trainer.py              # Training methods (fixed/random/black-box)
├── model.py                # Hypernetwork and PS models (LoRA/non-LoRA)
├── example.ipynb           # Interactive demonstration
├── problems/               # Problem definitions
│   ├── problem_dyn.py      # Dynamic MOP benchmarks
│   └── problem_f_re.py     # RE problems
├── results/                # Experimental results
└── requirements.txt        # Package dependencies
```

If you find this work useful, please cite our paper:
```bibtex
@article{ppsl2025,
  title={Parametric Pareto Set Learning: Amortizing Multi-Objective Optimization with Parameters},
  author={Cheng, Ji and Lin, Xi and Xue, Bo and Zhang, Qingfu},
  journal={IEEE Transactions on Evolutionary Computation},
  year={2025},
  volume={},
  number={},
  pages={1-1},
}
```

- Built on the pymoo framework
- Inspired by recent advances in amortized optimization and Pareto set learning