I didn't manage to find anyone who has used LM Studio as an LLM server.
It supports Vulkan, so my iGPU (AMD 680M) is enough to run gpt-oss-20b (in LM Studio).
LM Studio also has a server functionality, but noob as I am, I can't figure out where to plug the URL (1).
Would someone guide me through hacking that, or just add LM Studio support to Newelle?
1: Tried it as an Ollama instance: nothing happens.
Tried it as an OpenAI endpoint: the control panel crashed and vanished forever.