
JAXCFM: A JAX Reimplementation of TorchCFM

JAX/Flax Implementation of Conditional Flow Matching

This repository contains a JAX/Flax reimplementation of the TorchCFM library, providing a pure JAX implementation of Conditional Flow Matching methods with full JIT compilation support.


Description

JAXCFM is a JAX/Flax reimplementation of the TorchCFM library, originally developed by Alexander Tong, Kilian Fatras, and collaborators. This implementation provides:

  • Pure JAX implementation with full JIT compilation support
  • No PyTorch dependencies - completely framework-agnostic for the core algorithms
  • Optimized for JAX - uses lax.scan, lax.while_loop, and other JAX primitives for efficient execution
  • All original features - implements all Flow Matching variants from the original TorchCFM

Conditional Flow Matching (CFM) is a fast, simulation-free training objective for continuous normalizing flows (CNFs). It enables conditional generative modeling, speeds up both training and inference, and closes the performance gap between CNFs and diffusion models.
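
To make the objective concrete, here is a minimal, self-contained sketch of the independent-coupling CFM loss in pure JAX. It is an illustration only: velocity_fn and its params are hypothetical placeholders for a Flax model, not the library's own classes, which are described in "The jaxcfm Package" section below.

# Minimal sketch of the independent-coupling CFM objective in pure JAX.
# velocity_fn(params, t, x) is a hypothetical velocity-field model; the actual
# jaxcfm/Flax models live under jaxcfm/models.
import jax
import jax.numpy as jnp


def cfm_loss(params, velocity_fn, x0, x1, key, sigma=0.0):
    # Conditional flow matching loss for the independent coupling q(x0) q(x1),
    # written for batches of shape (batch, dim).
    t_key, eps_key = jax.random.split(key)
    t = jax.random.uniform(t_key, (x0.shape[0],))        # t ~ Uniform[0, 1]
    eps = jax.random.normal(eps_key, x0.shape)           # noise for the sigma-width probability path
    tb = t[:, None]                                      # broadcast t over the feature dimension
    xt = tb * x1 + (1.0 - tb) * x0 + sigma * eps         # point x_t on the conditional path
    ut = x1 - x0                                         # conditional target velocity u_t
    vt = velocity_fn(params, t, xt)                      # model prediction v_theta(t, x_t)
    return jnp.mean(jnp.sum((vt - ut) ** 2, axis=-1))    # flow-matching regression loss


# The whole step JIT-compiles end to end; velocity_fn is a Python callable, so mark it static:
# grad_fn = jax.jit(jax.grad(cfm_loss), static_argnames=("velocity_fn",))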

Acknowledgments and Credits

This project is a JAX reimplementation of the excellent TorchCFM library. We are deeply grateful to the original TorchCFM team for their groundbreaking work.

This JAX implementation maintains the same API and functionality as the original PyTorch version, making it easy to switch between frameworks. All algorithms, architectures, and methodologies are based on the original TorchCFM implementation.

The density, vector field, and trajectories of simulation-free CNF training schemes: mapping 8 Gaussians to two moons (above) and a single Gaussian to two moons (below). Action matching with the same architecture (3x64 MLP with SELU activations) underfits with the ReLU, SiLU, and SELU activations suggested in the example code, but fits better under our training setup with the Swish activation (Action-Matching (Swish)).

The models to produce the GIFs are stored in examples/models and can be visualized with the model comparison plotting notebook in examples/2D_tutorials/model-comparison-plotting.ipynb.

The jaxcfm Package

This JAX reimplementation provides the jaxcfm package, which mirrors the functionality of the original torchcfm package but uses JAX/Flax instead of PyTorch. The package abstracts over the choice of the conditional distribution q(z). jaxcfm supplies the following loss classes (matching the original TorchCFM API); a brief usage sketch follows the list:

  • ConditionalFlowMatcher: $z = (x_0, x_1)$, $q(z) = q(x_0) q(x_1)$
  • ExactOptimalTransportConditionalFlowMatcher: $z = (x_0, x_1)$, $q(z) = \pi(x_0, x_1)$ where $\pi$ is an exact optimal transport joint distribution. This is used in [Tong et al. 2023a] and [Pooladian et al. 2023] as "OT-CFM" and "Multisample FM with Batch OT", respectively.
  • TargetConditionalFlowMatcher: $z = x_1$, $q(z) = q(x_1)$ as defined in Lipman et al. 2023, learns a flow from a standard normal Gaussian to data using conditional flows which optimally transport the Gaussian to the datapoint (Note that this does not result in the marginal flow being optimal transport).
  • SchrodingerBridgeConditionalFlowMatcher: $z = (x_0, x_1)$, $q(z) = \pi_\epsilon(x_0, x_1)$ where $\pi_\epsilon$ is an entropically regularized OT plan, which in practice is often approximated by a minibatch OT plan (see Tong et al. 2023b). The flow-matching variant whose marginals match the Schrödinger bridge marginals is known as SB-CFM [Tong et al. 2023a]. When the score is also learned and the bridge is simulated stochastically, the method is called [SF]2M [Tong et al. 2023b].
  • VariancePreservingConditionalFlowMatcher: $z = (x_0, x_1)$, $q(z) = q(x_0) q(x_1)$, but with conditional Gaussian probability paths that preserve variance over time using a trigonometric interpolation, as presented in [Albergo et al. 2023a].
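
A brief usage sketch, assuming jaxcfm mirrors the TorchCFM class and method names (in TorchCFM each matcher exposes sample_location_and_conditional_flow(x0, x1) returning (t, x_t, u_t)). The import path is inferred from the project structure shown below, and the JAX port may additionally require a PRNG key.

import jax

# Import path assumed from jaxcfm/conditional_flow_matching.py in the project structure.
from jaxcfm.conditional_flow_matching import (
    ConditionalFlowMatcher,
    ExactOptimalTransportConditionalFlowMatcher,
)

key = jax.random.PRNGKey(0)
k0, k1 = jax.random.split(key)
x0 = jax.random.normal(k0, (256, 2))          # source minibatch (e.g. Gaussian noise)
x1 = jax.random.normal(k1, (256, 2)) + 4.0    # target minibatch (stand-in for data)

fm = ConditionalFlowMatcher(sigma=0.0)        # independent coupling q(x0) q(x1)
# fm = ExactOptimalTransportConditionalFlowMatcher(sigma=0.0)   # minibatch OT coupling (OT-CFM)

# In TorchCFM this returns (t, x_t, u_t); the JAX port may also take a PRNG key.
t, xt, ut = fm.sample_location_and_conditional_flow(x0, x1)
# Train a velocity model v_theta(t, x_t) to regress u_t, as in the loss sketch above.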

Citation

Please cite the original TorchCFM papers when using this JAX implementation. This repository is a reimplementation of the code from the original TorchCFM repository, which contains the code to reproduce the main experiments and illustrations of two preprints:

If you find this code useful in your research, please cite the following papers (expand for BibTeX):

A. Tong, N. Malkin, G. Huguet, Y. Zhang, J. Rector-Brooks, K. Fatras, G. Wolf, Y. Bengio. Improving and Generalizing Flow-Based Generative Models with Minibatch Optimal Transport, 2023.
@article{tong2024improving,
  title={Improving and generalizing flow-based generative models with minibatch optimal transport},
  author={Alexander Tong and Kilian Fatras and Nikolay Malkin and Guillaume Huguet and Yanlei Zhang and Jarrid Rector-Brooks and Guy Wolf and Yoshua Bengio},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2024},
  url={https://openreview.net/forum?id=CD9Snc73AW},
  note={Expert Certification}
}
A. Tong, N. Malkin, K. Fatras, L. Atanackovic, Y. Zhang, G. Huguet, G. Wolf, Y. Bengio. Simulation-Free Schrödinger Bridges via Score and Flow Matching, 2023.
@article{tong2023simulation,
  title={Simulation-Free Schr{\"o}dinger Bridges via Score and Flow Matching},
  author={Tong, Alexander and Malkin, Nikolay and Fatras, Kilian and Atanackovic, Lazar and Zhang, Yanlei and Huguet, Guillaume and Wolf, Guy and Bengio, Yoshua},
  year={2023},
  journal={arXiv preprint arXiv:2307.03672}
}

V0 -> V1 (changelog carried over from the original TorchCFM library)

Major Changes:

  • Added CIFAR-10 examples with an FID of 3.5
  • Added code for the new Simulation-free Score and Flow Matching (SF)2M preprint
  • Created torchcfm pip installable package
  • Moved pytorch-lightning implementation and experiments to runner directory
  • Moved notebooks -> examples
  • Added image generation implementation in both lightning and a notebook in examples

Implemented papers

List of implemented papers:

  • Flow Matching for Generative Modeling (Lipman et al. 2023) Paper
  • Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow (Liu et al. 2023) Paper Code
  • Building Normalizing Flows with Stochastic Interpolants (Albergo et al. 2023a) Paper
  • Action Matching: Learning Stochastic Dynamics From Samples (Neklyudov et al. 2022) Paper Code
  • Concurrent work to our OT-CFM method: Multisample Flow Matching: Straightening Flows with Minibatch Couplings (Pooladian et al. 2023) Paper
  • Generating and Imputing Tabular Data via Diffusion and Flow-based Gradient-Boosted Trees (Jolicoeur-Martineau et al.) Paper Code
  • Soon: SE(3)-Stochastic Flow Matching for Protein Backbone Generation (Bose et al.) Paper

How to run

This JAX implementation provides the same functionality as the original TorchCFM, but uses JAX/Flax instead of PyTorch. To install and use:

# clone project
git clone https://github.com/bayesianempirimancer/jaxcfm.git
cd jaxcfm

# [OPTIONAL] create conda environment
conda create -n jaxcfm python=3.10
conda activate jaxcfm

# install JAX according to instructions
# https://jax.readthedocs.io/en/latest/installation.html

# install requirements
pip install -r requirements.txt

# install jaxcfm
pip install -e .

Note: For the original PyTorch implementation, see the official TorchCFM repository and install via pip install torchcfm.

To run our Jupyter notebooks, use the following commands after installing the package.

# install ipykernel
conda install -c anaconda ipykernel

# install conda env in jupyter notebook
python -m ipykernel install --user --name=jaxcfm

# launch our notebooks with the jaxcfm kernel
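
Once a velocity field has been trained (for example in the notebooks above), new samples are generated by integrating it from t = 0 to t = 1. Below is a hedged sketch of a fixed-step Euler integrator written with jax.lax.scan; velocity_fn and params are the hypothetical model from the earlier sketches, and an ODE solver library such as Diffrax can be used instead of the hand-rolled loop.

# Hedged sketch of sample generation: integrate the learned velocity field from
# t = 0 to t = 1 with a fixed-step Euler scheme written with jax.lax.scan.
import jax
import jax.numpy as jnp


def generate(params, velocity_fn, x0, num_steps=100):
    dt = 1.0 / num_steps
    ts = jnp.linspace(0.0, 1.0, num_steps, endpoint=False)

    def euler_step(x, t):
        t_batch = jnp.full((x.shape[0],), t)                # same t for every sample in the batch
        x_next = x + dt * velocity_fn(params, t_batch, x)   # Euler update x <- x + dt * v_theta(t, x)
        return x_next, x_next                               # carry and per-step trajectory point

    x1, trajectory = jax.lax.scan(euler_step, x0, ts)
    return x1, trajectory                                   # final samples and the full trajectory


# samples, traj = jax.jit(generate, static_argnames=("velocity_fn", "num_steps"))(params, velocity_fn, x0)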

Project Structure

The directory structure looks like this:

│
├── examples                  <- Jupyter notebooks (JAX versions)
│   ├── 2D_tutorials          <- 2D flow matching tutorials (JAX)
│   └── images                <- Image generation examples (JAX)
│
├── jaxcfm                    <- JAX/Flax implementation of Flow Matching methods
│   ├── conditional_flow_matching.py   <- CFM classes (JAX version)
│   ├── optimal_transport.py           <- Pure JAX OT implementations
│   └── models                         <- Model architectures (Flax)
│       ├── models                     <- Models for 2D examples
│       └── unet                       <- UNet models for image examples
│
├── torchcfm                  <- Original PyTorch implementation (for reference)
│   ├── conditional_flow_matching.py   <- Original CFM classes (PyTorch)
│   └── models                         <- Original model architectures (PyTorch)
│
├── runner                    <- Everything related to the original version (V0) of the library
│
├── .gitignore                <- List of files ignored by git
├── .pre-commit-config.yaml   <- Configuration of pre-commit hooks for code formatting
├── pyproject.toml            <- Configuration options for testing and linting
├── requirements.txt          <- File for installing python dependencies
├── setup.py                  <- File for installing project as a package
└── README.md

❤️ Credits and Acknowledgments

Original TorchCFM Team

This JAX implementation is based on the excellent work of the original TorchCFM team:

  • Alexander Tong - Original TorchCFM creator and maintainer
  • Kilian Fatras - Original TorchCFM co-creator and maintainer
  • All TorchCFM contributors - For the original PyTorch implementation and research

The original TorchCFM repository can be found at: https://github.com/atong01/conditional-flow-matching

JAX Reimplementation

This JAX/Flax reimplementation maintains the same API and functionality as the original, making it easy to switch between PyTorch and JAX frameworks. All algorithms, methodologies, and research contributions are based on the original TorchCFM work.

Key differences from TorchCFM:

  • Pure JAX/Flax implementation (no PyTorch dependencies)
  • Full JIT compilation support
  • Uses JAX primitives (lax.scan, lax.while_loop) for efficiency
  • Pure JAX implementations of optimal transport algorithms (no NumPy/SciPy dependencies in the core code); an illustrative sketch follows this list
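
As an illustration of that last point (and not the library's actual optimal_transport.py code), here is a sketch of a minibatch entropic OT (Sinkhorn) coupling written entirely with JAX primitives; this is the kind of coupling the OT-based matchers resample (x_0, x_1) pairs from.

# Illustrative sketch only: a minibatch entropic OT (Sinkhorn) plan in pure JAX,
# using jax.lax.scan for the fixed-point iterations.
import jax
import jax.numpy as jnp


def sinkhorn_plan(x0, x1, epsilon=0.1, num_iters=100):
    # Entropically regularized OT plan between two equal-size minibatches.
    n = x0.shape[0]
    a = jnp.full((n,), 1.0 / n)                              # uniform source marginal
    b = jnp.full((n,), 1.0 / n)                              # uniform target marginal
    cost = jnp.sum((x0[:, None, :] - x1[None, :, :]) ** 2, axis=-1)
    k_mat = jnp.exp(-cost / epsilon)                         # Gibbs kernel

    def sinkhorn_step(uv, _):
        u, v = uv
        u = a / (k_mat @ v)                                  # row scaling
        v = b / (k_mat.T @ u)                                # column scaling
        return (u, v), None

    (u, v), _ = jax.lax.scan(sinkhorn_step, (jnp.ones(n), jnp.ones(n)), None, length=num_iters)
    return u[:, None] * k_mat * v[None, :]                   # transport plan pi_eps


# plan = jax.jit(sinkhorn_plan)(x0, x1)   # rows/columns of `plan` give the coupling used to resample pairs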

Contributing

Before making an issue, please verify that:

  • The problem still exists on the current main branch.
  • Your python dependencies are updated to recent versions.
  • For issues related to the original algorithms, consider also checking the original TorchCFM repository.

Suggestions for improvements are always welcome!

License

JAXCFM, like the original Conditional-Flow-Matching (TorchCFM) library, is licensed under the MIT License.

MIT License

Copyright (c) 2023 Alexander Tong (Original TorchCFM)
Copyright (c) 2024 JAXCFM Contributors (JAX Reimplementation)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
