CUDA out of memory!! #415
After multiple image and mask generations, the free GPU memory keeps decreasing until I get a CUDA out of memory error. Is there a way to free memory after each request? @mrhan1993
Replies: 3 comments 1 reply
I'll try to deal with this later.
I have added an endpoint, /v1/engines/clean_vram, which you can access to manually release VRAM usage.
Is this not available in the Docker container yet? My end goal is to release the VRAM so that the GPU drops back to the P8 state (in P8 it consumes 10 W, while with the model loaded it consumes 52 W).

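As a minimal sketch, the clean_vram endpoint could be called after each batch of generations to hand VRAM back to the driver. The base URL (host and port) and the HTTP method are assumptions here; check your server's configuration and route definition for the actual values.

```python
import urllib.request

# Assumed base URL -- adjust host/port to match your deployment.
BASE_URL = "http://127.0.0.1:8888"


def clean_vram_url(base_url: str = BASE_URL) -> str:
    """Build the URL for the manual VRAM-release endpoint from the source thread."""
    return f"{base_url}/v1/engines/clean_vram"


def clean_vram(base_url: str = BASE_URL) -> bool:
    """Request a VRAM release and report whether the server replied with 2xx.

    POST is assumed; switch the method if your server registers the
    route differently.
    """
    req = urllib.request.Request(clean_vram_url(base_url), method="POST")
    with urllib.request.urlopen(req, timeout=30) as resp:
        return 200 <= resp.status < 300
```

Calling `clean_vram()` right after a request finishes (or on an idle timer) should let the GPU idle back down once the model weights are evicted.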