Conversation
Thanks for making our package. The reason I use the argument: ideally I would like to have different extras for various targets, i.e., have
I didn't know that the defaults come with CUDA; it actually didn't work for me originally, but I suppose it was the other fix that fixed CUDA not working 😅. Given that it works with CUDA by default, this is probably not an issue whatsoever. Nonetheless, here is what I found: for uv, it is possible to make the torch* packages optional dependencies ([cuda], [cpu], [rocm]) and have them select different sources. Unfortunately, however, you lose the "default", i.e. you always have to choose one of them explicitly when installing with uv.
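A sketch of what that setup could look like in pyproject.toml, following uv's documented pattern of per-extra indexes (the index names, CUDA version, and torch version bound here are illustrative, not taken from the thread):

```toml
[project.optional-dependencies]
cpu = ["torch>=2.6"]
cu128 = ["torch>=2.6"]

# The extras are mutually exclusive, so declare them as conflicting.
[tool.uv]
conflicts = [
  [{ extra = "cpu" }, { extra = "cu128" }],
]

# Route torch to a different index depending on which extra is chosen.
[tool.uv.sources]
torch = [
  { index = "pytorch-cpu", extra = "cpu" },
  { index = "pytorch-cu128", extra = "cu128" },
]

[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[[tool.uv.index]]
name = "pytorch-cu128"
url = "https://download.pytorch.org/whl/cu128"
explicit = true
```

With this, one installs via e.g. `uv sync --extra cu128`. This also illustrates the drawback mentioned above: a plain `uv sync` pulls neither variant, so one extra must always be selected explicitly.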
It is unfortunate that
I got uv install working using
What do you think about 314361b?
That's a good find, Patrik. I think the new section looks good. I suppose the reason to not specifically mention that one doesn't have to use the

Also, a bit tangential, but the reason I was doing this in the first place was to make it work on my NixOS. Whilst I would be surprised if anyone else in the class also used NixOS, I could share the "flake" for it so any other NixOS user could use it easily. If you wanted to do that, there are basically two options:
In the end, this would allow anyone on NixOS (or, I believe, anyone using the nix package manager) to then use (Also, the option of doing neither of those is fine.)
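For a rough idea of the kind of thing being offered, a minimal flake providing a dev shell with uv might look like the following (everything here, including the package choices, is an illustrative assumption, not the actual flake from the thread):

```nix
{
  description = "Dev shell with uv (illustrative sketch)";

  inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixos-unstable";

  outputs = { self, nixpkgs }:
    let
      # A single system for brevity; a real flake would
      # typically map over several systems.
      system = "x86_64-linux";
      pkgs = nixpkgs.legacyPackages.${system};
    in {
      devShells.${system}.default = pkgs.mkShell {
        # uv manages the Python environment from pyproject.toml.
        packages = [ pkgs.uv pkgs.python312 ];
      };
    };
}
```

Entering the shell with `nix develop` would then make uv available without any system-wide installation.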
This code makes sure that uv sync pulls all torch packages (except for torchmetrics) from https://download.pytorch.org/whl/cu128
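A configuration achieving that could look roughly like the following (the exact list of torch packages is an assumption; torchmetrics is deliberately left out so it resolves from PyPI as usual):

```toml
# Pin the torch packages to the CUDA 12.8 index; torchmetrics is
# intentionally not listed here, so it still comes from PyPI.
[tool.uv.sources]
torch = { index = "pytorch-cu128" }
torchvision = { index = "pytorch-cu128" }
torchaudio = { index = "pytorch-cu128" }

[[tool.uv.index]]
name = "pytorch-cu128"
url = "https://download.pytorch.org/whl/cu128"
# explicit = true means this index is only used for packages
# that name it in [tool.uv.sources], never for general resolution.
explicit = true
```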