Replies: 3 comments
-
I think it's a great idea, especially because right now the app loads local LLM models into RAM, and to unload them I have to close OpenWhispr completely.
-
@NivOO5 sorry, only seeing this now - looking into it.
-
Implemented - it'll be in a release soon.
-
Can you add an option to unload/offload a downloaded model in the app? It makes things way better, and I love this feature in Handy, another open-source dictation app. It also makes it easy to tell whether bad speed is caused by that or by something else. Thanks for reading :-)