This discussion was converted from issue #51 on April 06, 2026 20:33.
Your current environment
As of the last time I updated vLLM, be aware that an install for use on a Blackwell GPU (50xx) WILL require the CUDA 12.8 build of Torch:
https://download-r2.pytorch.org/whl/nightly/cu128/torch-2.11.0.dev20260216%2Bcu128-cp312-cp312-win_amd64.whl
The advised version on the releases page simply will not install. (Did the dev test whether it installs on other cards? The GPU architecture shouldn't make Python deny the installation.)
How you are installing vllm
Using the version on releases (0.19)
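For reference, a minimal sketch of the install order being described, assuming a Windows Python 3.12 virtual environment (cp312/win_amd64, matching the wheel above). The exact flags are assumptions, not a verified recipe: the idea is to install the cu128 nightly Torch wheel first so the vLLM install doesn't pull in the default Torch build.

```shell
# Hypothetical install order (assumption: Windows, Python 3.12 venv).
# 1. Install the CUDA 12.8 nightly Torch wheel for Blackwell first:
pip install "https://download-r2.pytorch.org/whl/nightly/cu128/torch-2.11.0.dev20260216%2Bcu128-cp312-cp312-win_amd64.whl"
# 2. Then install the vLLM release mentioned above (0.19):
pip install vllm==0.19
```
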
Before submitting a new issue...
Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.