Conversation

@starkwj (Contributor) commented Jun 11, 2025

With CFG, the DiT model runs its forward pass twice: once with the condition and once without.
However, the two forward passes can be batched and computed together in a single pass, which is exactly what this PR does.
This can nearly double inference performance.
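For reference, a minimal sketch of the batching idea; the function and the model call signature below are illustrative assumptions, not the actual F5-TTS API:

```python
import torch

def cfg_forward_batched(model, x, t, cond, cfg_strength):
    """Sketch of batched classifier-free guidance.

    Instead of calling the DiT twice (conditional and unconditional),
    the two inputs are concatenated along the batch dimension and run
    through the model once, then split and combined with the usual
    CFG formula. Names here are hypothetical placeholders.
    """
    # Duplicate the noisy input and timestep for the cond / uncond halves.
    x_in = torch.cat([x, x], dim=0)
    t_in = torch.cat([t, t], dim=0)

    # Drop the condition for the second half (zeroing is one common choice).
    cond_in = torch.cat([cond, torch.zeros_like(cond)], dim=0)

    # Single forward pass instead of two.
    out = model(x_in, t_in, cond_in)
    pred_cond, pred_uncond = out.chunk(2, dim=0)

    # Standard CFG combination.
    return pred_uncond + cfg_strength * (pred_cond - pred_uncond)
```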

@SWivid (Owner) commented Jun 11, 2025

Sure, this gives a certain speed-up on HPC devices (if utilization is not already full at a single batch).

Will merge after several fixes for normal batch inference (e.g., setting batch size > 1 in eval_infer_batch.py).

@SWivid merged commit 8975fca into SWivid:main Jun 11, 2025
1 check passed