SSOL

This repository contains our implementation for the paper: [NeurIPS '25] Single-Step Operator Learning for Conditioned Time-Series Diffusion Models.

Introduction

We propose a diffusion model for time series based on operator learning that reduces generation from the usual multi-step denoising procedure to a single step. By working in the frequency domain, our approach mitigates the frequency-dependent smoothing induced by the forward diffusion operator. We learn an inverse semigroup operator that maps directly from high noise levels to clean data in one step, while a frequency-aware block selectively restores the high-frequency components that are systematically suppressed during forward diffusion. The method maintains the mathematical consistency of diffusion processes through semigroup composition properties enforced during training.
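
For intuition, the semigroup composition property can be enforced as a consistency term of roughly the following form. This is a minimal sketch under our own assumptions about the operator's interface: the network G, its arguments, and the sampled noise levels are placeholders rather than the repository's actual training objective.

import torch

def semigroup_consistency_loss(G, x_t, t, s, r):
    # Encourage the direct map G(t -> r) to agree with the composition
    # G(s -> r) o G(t -> s) for noise levels t > s > r, i.e. the semigroup property.
    direct = G(x_t, t, r)            # one-step map from noise level t to level r
    via_s = G(G(x_t, t, s), s, r)    # two-step map through an intermediate level s
    return torch.mean((direct - via_s) ** 2)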

Get Started

1. Download the Data

All datasets have been preprocessed and are ready for use; you can obtain them from their original sources.

For convenience, we also provide a comprehensive package containing all required datasets, available for download from Google Drive. Place it under the ./data folder.
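
As a quick sanity check (assuming the downloaded package has been extracted into ./data; the exact file names depend on the package), you can list the folder contents:

import os
print(sorted(os.listdir("./data")))  # should show the preprocessed dataset files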

2. Setup Your Environment

Create and activate a Python environment using either of the provided configuration files (environment.yml or requirements.txt).

Option 1: create the environment directly from environment.yml:

conda env create -f environment.yml -n SSOL
conda activate SSOL

Option 2: create a fresh environment and install the dependencies from requirements.txt:

conda create -n SSOL python=3.9
conda activate SSOL
pip install -r requirements.txt
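
To verify that the environment resolved correctly, a quick import check such as the following can help. The assumption that PyTorch and NumPy are among the dependencies is ours, so adjust the imports to match requirements.txt:

import numpy, torch  # assumed dependencies; see requirements.txt
print("numpy", numpy.__version__, "| torch", torch.__version__)
print("CUDA available:", torch.cuda.is_available())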

3. Train the Model

Experiment scripts for the various benchmarks are provided in the scripts directory. To reproduce the experimental results using our checkpoints, run:

bash ./scripts/run_ckpt.sh

The checkpoints can be downloaded from our shared folder and placed under ./save.
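
If you would like to inspect a downloaded checkpoint before launching the script, a minimal sketch is shown below. It assumes the files under ./save are standard PyTorch checkpoints; the file name is a hypothetical placeholder:

import torch

ckpt = torch.load("./save/example_checkpoint.pth", map_location="cpu")  # hypothetical file name
print(type(ckpt))                  # typically a dict of training state or a state_dict
if isinstance(ckpt, dict):
    print(list(ckpt.keys())[:10])  # peek at the stored keys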

You can also train an efficient version of our model using a shallower configuration that employs the same semigroup training methodology and frequency-aware block. To run the efficiency experiments:

bash ./scripts/run_efficiency.sh

Acknowledgement

We are grateful to the following GitHub repositories for their valuable code and efforts.

Citation

If you find this repository helpful, please cite our paper:

@inproceedings{chen2025singlestep,
  title={Single-Step Operator Learning for Conditioned Time-Series Diffusion Models},
  author={Hui Chen and Vikas Singh},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
  year={2025},
  url={https://openreview.net/forum?id=ONUZ08OAZL}
}
