Promotion for Float8 Types is not supported #12

@jmdBB

Description

Hi there! I enabled this line:

torch_dtype=torch.float8_e4m3fn, # You can set torch_dtype=torch.bfloat16 to disable FP8 quantization.

to enable float8, since I have an RTX 3090 with 24 GB of VRAM, but I get this error:

FantasyPortrait model load from checkpoint: ./models/fantasyportrait_model.ckpt
DET stage1: 100%|██████████| 197/197 [00:01<00:00, 114.05it/s]
DET stage2: 100%|██████████| 31/31 [00:00<00:00, 82.81it/s]
  0%|          | 0/197 [00:00<?, ?it/s]
/home/jmacias/fantasy-portrait/diffsynth/models/pdf.py:243: UserWarning: nn.functional.upsample is deprecated. Use nn.functional.interpolate instead.
  up2 = F.upsample(low3, size=rescale_size, mode="bilinear")
PD_FPG_MOTION:  16%|█▌        | 31/197 [00:01<00:05, 28.22it/s]
The height cannot be evenly divided by 16. We round it up to 448.
Traceback (most recent call last):
  File "/home/jmacias/fantasy-portrait/infer.py", line 329, in <module>
    video_audio = pipe(
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 120, in decorate_context
    return func(*args, **kwargs)
  File "/home/jmacias/fantasy-portrait/diffsynth/pipelines/wan_video.py", line 671, in __call__
    prompt_emb_posi = self.encode_prompt(prompt, positive=True)
  File "/home/jmacias/fantasy-portrait/diffsynth/pipelines/wan_video.py", line 526, in encode_prompt
    prompt_emb = self.prompter.encode_prompt(prompt, positive=positive)
  File "/home/jmacias/fantasy-portrait/diffsynth/prompters/wan_prompter.py", line 112, in encode_prompt
    prompt_emb = self.text_encoder(ids, mask)
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/jmacias/fantasy-portrait/diffsynth/models/wan_video_text_encoder.py", line 271, in forward
    x = block(x, mask, pos_bias=e)
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/jmacias/fantasy-portrait/diffsynth/models/wan_video_text_encoder.py", line 147, in forward
    x = fp16_clamp(x + self.attn(self.norm1(x), mask=mask, pos_bias=e))
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/jmacias/fantasy-portrait/diffsynth/vram_management/layers.py", line 61, in forward
    return module(*args, **kwargs)
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1773, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/home/jmacias/miniconda3/envs/fantasy-portrait/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1784, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/jmacias/fantasy-portrait/diffsynth/models/wan_video_text_encoder.py", line 37, in forward
    x = x * torch.rsqrt(x.float().pow(2).mean(dim=-1, keepdim=True) + self.eps)
RuntimeError: Promotion for Float8 Types is not supported, attempted to promote Float8_e4m3fn and Float

Does anyone have any idea? By default, the script infer.py has that line commented out, but it is supposed to work if you uncomment it, right?
Thanks
Jorge
