🐛 Describe the bug
When `torch.xpu.is_available()` returns `False`, calling `torch.xpu.is_bf16_supported()` still returns `True`. This is inconsistent with the expected behavior: with no XPU device available, it should return `False`.
To Reproduce
```python
import torch

def test_bug():
    print('torch.xpu.is_available() =', torch.xpu.is_available())
    if not torch.xpu.is_available():
        result = torch.xpu.is_bf16_supported()
        print('result =', result)

if __name__ == '__main__':
    test_bug()
```
Output
```
torch.xpu.is_available() = False
result = True
```
Versions
PyTorch version: 2.7.0+cu126
Is debug build: False
CUDA used to build PyTorch: 12.6
ROCM used to build PyTorch: N/A