
[Junie]: Junie Unexpected result from torch.xpu.is_bf16_supported() when XPU is unavailable#5

Open
junie-eap[bot] wants to merge 1 commit into main from junie-issue-3

Conversation

@junie-eap junie-eap bot commented Apr 28, 2025

📌 Hey! This PR was made for you by Junie, the coding agent from JetBrains (Early Access Preview).

It's still learning and developing, and it might make mistakes. Please review the changes before you accept them.
We'd love your feedback: join our Discord to share bugs and ideas.

📝 Original Issue Description

Issue task:

### 🐛 Describe the bug

When torch.xpu.is_available() returns False, calling torch.xpu.is_bf16_supported() still returns True, which is inconsistent with the expected behavior (it should return False).

To Reproduce

```python
import torch

def test_bug():
    print('torch.xpu.is_available() =', torch.xpu.is_available())
    if not torch.xpu.is_available():
        result = torch.xpu.is_bf16_supported()
        print('result =', result)

if __name__ == '__main__':
    test_bug()
```
Output

```
torch.xpu.is_available() = False
result = True
```
Versions
PyTorch version: 2.7.0+cu126
Is debug build: False
CUDA used to build PyTorch: 12.6
ROCM used to build PyTorch: N/A


  Issue comments:
  No comments yet

📊 Junie Summary

The function torch.xpu.is_bf16_supported() was modified to return False when torch.xpu.is_available() returns False, addressing the reported inconsistency. The implementation is logically correct, but the test could not be executed due to a module import error, so while the fix is in place, its outcome remains unverified.
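As a minimal sketch of the corrected logic (hypothetical: the names `xpu_available` and `device_supports_bf16` are illustrative parameters, not PyTorch APIs, and the actual patch to torch.xpu may differ):

```python
def is_bf16_supported(xpu_available: bool, device_supports_bf16: bool = True) -> bool:
    """Sketch of the fixed behavior: bf16 support is only reported
    when an XPU device is actually available."""
    if not xpu_available:
        # Previously the check fell through and returned True even with
        # no XPU present; the fix short-circuits to False here.
        return False
    # With a device present, defer to the device's capability check
    # (stubbed out here as a boolean parameter).
    return device_supports_bf16

# Mirrors the repro: no XPU available -> must be False
print(is_bf16_supported(False))  # False
```

In the real PyTorch function the availability check would call torch.xpu.is_available() directly rather than take a parameter; the sketch just isolates the decision logic.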

…d() when XPU is unavailable

Changes from the task: #3
