In this repository, you will find multiple files with examples for the most popular models supported by diffusers.
These recipes are designed so you can just copy and paste them into your environment.
If you want, you can also run the scripts directly from this repository.
First, clone the project:
```shell
git clone https://github.com/asomoza/diffusers-recipes.git
```

Then you will need to install uv by following the official instructions for your system.
After installing uv, you can install the basic packages with this command:
```shell
uv sync
```

This will use Python 3.12 and install the recommended packages to run Mellon. Depending on your OS, it may also install torch >2.9 with CUDA >13.0 support.
Note: this command also installs diffusers from the `main` branch.
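If you want to confirm what `uv sync` installed before running a recipe, a quick check like the sketch below prints the relevant versions; the file name `check_env.py` is just an example, save it anywhere and run it with `uv run`.

```python
# Optional sanity check: confirm the versions that `uv sync` installed.
# Save as e.g. check_env.py (example name) and run it with `uv run check_env.py`.
import torch

import diffusers

print(f"diffusers: {diffusers.__version__}")
print(f"torch: {torch.__version__}")
print(f"CUDA available: {torch.cuda.is_available()}")
if torch.cuda.is_available():
    print(f"GPU: {torch.cuda.get_device_name(0)}")
```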
From the console, you can run each script with the following command:
```shell
uv run <script>
```

For example:
```shell
uv run models/z-image/scripts/base_example.py
```
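If you just want a feel for what a base recipe does before opening the script, here is a minimal text-to-image sketch; the model id, prompt, and settings are placeholders, not necessarily what `base_example.py` actually uses.

```python
# Minimal text-to-image sketch. The model id, prompt, and settings are placeholders;
# check the actual script in models/z-image/scripts/ for the real recipe.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "Tongyi-MAI/Z-Image-Turbo",  # placeholder model id, adjust to the recipe you run
    torch_dtype=torch.bfloat16,
)
pipe.to("cuda")

image = pipe(
    prompt="a photo of a red fox in the snow",
    num_inference_steps=8,  # a few steps, assuming a distilled/turbo model
    guidance_scale=1.0,
).images[0]
image.save("output.png")
```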
For scripts that use quantization, you will need to install the quantization extras with this command:

```shell
uv sync --extra quantization
```
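As a rough idea of what the quantized recipes do, here is a sketch of 4-bit (NF4) loading with bitsandbytes through diffusers. It assumes the quantization extra provides bitsandbytes; the model id and settings are examples rather than the repo's actual choices.

```python
# 4-bit (NF4) quantization sketch using bitsandbytes through diffusers.
# Assumes the `quantization` extra installs bitsandbytes; the model id is an example.
import torch
from diffusers import BitsAndBytesConfig, FluxPipeline, FluxTransformer2DModel

model_id = "black-forest-labs/FLUX.1-dev"  # example model id

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Quantize only the transformer, which holds most of the weights.
transformer = FluxTransformer2DModel.from_pretrained(
    model_id,
    subfolder="transformer",
    quantization_config=quant_config,
    torch_dtype=torch.bfloat16,
)

pipe = FluxPipeline.from_pretrained(
    model_id, transformer=transformer, torch_dtype=torch.bfloat16
)
pipe.enable_model_cpu_offload()

image = pipe("a tiny robot watering a bonsai tree", num_inference_steps=28).images[0]
image.save("flux_nf4.png")
```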
For scripts that use a different attention backend, you need to install the attention extras:

```shell
uv sync --extra attentions
```
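For context, a sketch of switching the attention backend is below. It assumes a diffusers build from `main` that exposes `set_attention_backend()` on its models, and that the attentions extra installed the matching kernel; the backend name and model id are assumptions, not the repo's settings.

```python
# Attention backend sketch. Assumes a recent diffusers (installed from main) that
# exposes set_attention_backend() through its attention dispatcher, and that the
# `attentions` extra installed the matching kernel (e.g. flash-attn).
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "Tongyi-MAI/Z-Image-Turbo",  # placeholder model id
    torch_dtype=torch.bfloat16,
).to("cuda")

# Switch the transformer's attention from the default (SDPA) to FlashAttention.
# The backend name is an assumption; valid names depend on your diffusers version.
pipe.transformer.set_attention_backend("flash")

image = pipe("a watercolor painting of a lighthouse", num_inference_steps=8).images[0]
image.save("flash_attention.png")
```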
For scripts that use video, you need to install the video extras:

```shell
uv sync --extra video
```
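As a rough sketch of a video recipe, the example below generates frames and writes them out with `export_to_video` from `diffusers.utils`; the model id, frame count, and fps are placeholder values, not the repo's actual settings.

```python
# Video generation sketch. The model id and settings are examples; the recipes in
# this repo may use different models, resolutions, or schedulers.
import torch
from diffusers import DiffusionPipeline
from diffusers.utils import export_to_video

pipe = DiffusionPipeline.from_pretrained(
    "Wan-AI/Wan2.2-T2V-A14B-Diffusers",  # example model id
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()

frames = pipe(
    prompt="a paper boat drifting down a rainy street",
    num_frames=81,
    num_inference_steps=40,
).frames[0]

export_to_video(frames, "output.mp4", fps=16)
```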
The benchmarks were run on the following systems:

Test Bench #1

- AMD Ryzen 7 9800X3D 8-Core Processor
- 128GB of RAM
- RTX 5090 with 32GB of VRAM (undervolted to 480 W)
- PCIe Gen5 NVMe SSD rated at 14,900 MB/s