Labels: bug (Something isn't working)
Description
Checks
- This template is only for bug reports; usage problems go with 'Help Wanted'.
- I have thoroughly reviewed the project documentation but couldn't find information to solve my problem.
- I have searched for existing issues, including closed ones, and couldn't find a solution.
- I am using English to submit this issue to facilitate community communication.
Environment Details
OS: Arch Linux
GPU: Radeon 8050S
torch==2.5.1+rocm6.2 torchaudio==2.5.1+rocm6.2
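For reference, what the ROCm build of PyTorch actually reports for this GPU can be checked with a quick probe (a sketch; run in the same f5-tts conda env, attribute names per recent ROCm PyTorch builds):

# Diagnostic sketch: confirm which gfx target the ROCm PyTorch wheel sees.
import torch

print("torch:", torch.__version__)            # expected 2.5.1+rocm6.2
print("HIP runtime:", torch.version.hip)      # ROCm/HIP version the wheel was built against
print("GPU visible:", torch.cuda.is_available())
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print("device:", props.name)
    # gcnArchName is exposed on ROCm builds; the wheel must ship kernels for this target
    print("gfx target:", getattr(props, "gcnArchName", "n/a"))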
Steps to Reproduce
1. Followed the installation instructions for AMD GPUs:
pip install torch==2.5.1+rocm6.2 torchaudio==2.5.1+rocm6.2 --extra-index-url https://download.pytorch.org/whl/rocm6.2
pip install -e .
...
Successfully installed f5-tts-1.1.10
(f5-tts) [user@fw F5-TTS]$ f5-tts_infer-gradio
Download Vocos from huggingface charactr/vocos-mel-24khz
vocab : /home/user/F5-TTS/src/f5_tts/infer/examples/vocab.txt
token : custom
model : /home/user/.cache/huggingface/hub/models--SWivid--F5-TTS/snapshots/84e5a410d9cead4de2f847e7c9369a6440bdfaca/F5TTS_v1_Base/model_1250000.safetensors
Traceback (most recent call last):
File "/home/user/.conda/envs/f5-tts/bin/f5-tts_infer-gradio", line 3, in <module>
from f5_tts.infer.infer_gradio import main
File "/home/user/F5-TTS/src/f5_tts/infer/infer_gradio.py", line 90, in <module>
F5TTS_ema_model = load_f5tts()
^^^^^^^^^^^^
File "/home/user/F5-TTS/src/f5_tts/infer/infer_gradio.py", line 68, in load_f5tts
return load_model(DiT, F5TTS_model_cfg, ckpt_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/F5-TTS/src/f5_tts/infer/utils_infer.py", line 272, in load_model
model = load_checkpoint(model, ckpt_path, device, dtype=dtype, use_ema=use_ema)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/user/F5-TTS/src/f5_tts/infer/utils_infer.py", line 197, in load_checkpoint
model = model.to(dtype)
^^^^^^^^^^^^^^^
File "/home/user/.conda/envs/f5-tts/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1340, in to
return self._apply(convert)
^^^^^^^^^^^^^^^^^^^^
File "/home/user/.conda/envs/f5-tts/lib/python3.11/site-packages/torch/nn/modules/module.py", line 900, in _apply
module._apply(fn)
File "/home/user/.conda/envs/f5-tts/lib/python3.11/site-packages/torch/nn/modules/module.py", line 900, in _apply
module._apply(fn)
File "/home/user/.conda/envs/f5-tts/lib/python3.11/site-packages/torch/nn/modules/module.py", line 900, in _apply
module._apply(fn)
[Previous line repeated 1 more time]
File "/home/user/.conda/envs/f5-tts/lib/python3.11/site-packages/torch/nn/modules/module.py", line 927, in _apply
param_applied = fn(param)
^^^^^^^^^
File "/home/user/.conda/envs/f5-tts/lib/python3.11/site-packages/torch/nn/modules/module.py", line 1326, in convert
return t.to(
^^^^^
RuntimeError: HIP error: invalid device function
HIP kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
For debugging consider passing AMD_SERIALIZE_KERNEL=3
Compile with `TORCH_USE_HIP_DSA` to enable device-side assertions.

✔️ Expected Behavior
No response
❌ Actual Behavior
No response
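The failure happens inside a plain torch.nn.Module.to() call, so it can likely be reproduced without F5-TTS at all (a minimal sketch under that assumption; any kernel launch should hit the same HIP error if the wheel has no code objects for this GPU's gfx target):

# Minimal reproduction sketch, independent of F5-TTS.
import torch
import torch.nn as nn

device = "cuda"  # ROCm builds expose HIP GPUs through the torch.cuda API
model = nn.Linear(16, 16).to(device)    # moving parameters to the GPU is a copy, no kernel yet
try:
    model = model.to(torch.float16)     # dtype cast launches a HIP kernel, like load_checkpoint's model.to(dtype)
    out = model(torch.randn(4, 16, device=device, dtype=torch.float16))
    torch.cuda.synchronize()            # force any asynchronously reported HIP error to surface here
    print("OK:", out.shape)
except RuntimeError as err:
    print("HIP failure reproduced outside F5-TTS:", err)

If that snippet fails the same way, re-running it with AMD_SERIALIZE_KERNEL=3 set, as the traceback suggests, should make the failing kernel launch easier to pin down.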