
Conversation

@ZhikangNiu (Collaborator)

No description provided.

@ZhikangNiu (Collaborator, Author)

1 epoch LibriTTS, F5TTS v1 Small:

| Setup | Max memory allocated | Max memory reserved | Time per epoch | Loss |
| --- | --- | --- | --- | --- |
| flash_attn (with mask) | 30925.52 MB | 64118.00 MB | 25 min | 0.999 |
| standard F5TTS v1 Small (no mask) | 29942.87 MB | 61876.00 MB | 22 min | 0.939 |
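
For reference, here is a minimal sketch (not the PR's actual code) of how a boolean padding mask can be routed through flash-attn 2 via its varlen interface, and how peak-memory numbers like the ones above can be read with `torch.cuda` statistics. The helper name `attn_with_padding_mask`, the tensor shapes, and the toy lengths are illustrative assumptions.

```python
import torch
from flash_attn import flash_attn_varlen_func  # assumes flash-attn 2.x is installed


def attn_with_padding_mask(q, k, v, mask):
    """q, k, v: (batch, seq, heads, head_dim); mask: (batch, seq) bool, True = valid token."""
    b, s, h, d = q.shape
    lengths = mask.sum(dim=1)                                   # valid tokens per sample
    cu_seqlens = torch.zeros(b + 1, dtype=torch.int32, device=q.device)
    cu_seqlens[1:] = torch.cumsum(lengths, dim=0)
    max_seqlen = int(lengths.max())

    # Drop padded positions so flash-attn only sees real tokens.
    q_packed = q[mask]                                          # (total_tokens, heads, head_dim)
    k_packed = k[mask]
    v_packed = v[mask]

    out_packed = flash_attn_varlen_func(
        q_packed, k_packed, v_packed,
        cu_seqlens_q=cu_seqlens, cu_seqlens_k=cu_seqlens,
        max_seqlen_q=max_seqlen, max_seqlen_k=max_seqlen,
    )

    # Scatter back to the padded (batch, seq, heads, head_dim) layout.
    out = torch.zeros_like(q)
    out[mask] = out_packed
    return out


if __name__ == "__main__":
    torch.cuda.reset_peak_memory_stats()
    b, s, h, d = 4, 1024, 16, 64
    lengths = torch.tensor([1024, 900, 512, 700], device="cuda")
    mask = torch.arange(s, device="cuda").unsqueeze(0) < lengths.unsqueeze(1)
    q = torch.randn(b, s, h, d, device="cuda", dtype=torch.float16)
    out = attn_with_padding_mask(q, q.clone(), q.clone(), mask)

    # These are the same counters the MB figures in this comment refer to.
    print(f"Max memory allocated: {torch.cuda.max_memory_allocated() / 2**20:.2f} MB")
    print(f"Max memory reserved:  {torch.cuda.max_memory_reserved() / 2**20:.2f} MB")
```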

ZhikangNiu changed the title from "[WIP] add flash attn2 support attn_mask" to "Add flash attn2 support attn_mask, fix some minor bug in eval" on Jun 11, 2025
SWivid merged commit 0914170 into SWivid:main on Jun 11, 2025. 1 check passed.