I wonder whether it is possible to use the Ollama local host (http://localhost:11434/v1) instead of the default LLM endpoint (http://localhost:8000/v1). I have tried it, but I got errors and couldn't figure out what makes it unworkable.
Has anyone tried this successfully and can help me out?
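For context, this is roughly the kind of request I am trying to send. Ollama exposes an OpenAI-compatible API under /v1, so the call shape should be the same as for the default server; this sketch only builds the request rather than sending it, and the model name "llama3" is just an example (it must match a model you have pulled with `ollama pull`):

```python
import json

# Assumed endpoints: Ollama's OpenAI-compatible API vs. the default server.
OLLAMA_BASE = "http://localhost:11434/v1"
DEFAULT_BASE = "http://localhost:8000/v1"

def build_chat_request(base_url: str, model: str, prompt: str):
    """Build the URL and JSON body for an OpenAI-style chat completion call."""
    url = f"{base_url}/chat/completions"
    body = json.dumps({
        "model": model,  # must match a locally available model, e.g. from `ollama list`
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# Example model name, purely illustrative; swap in whatever `ollama list` shows.
url, body = build_chat_request(OLLAMA_BASE, "llama3", "Hello")
print(url)
```

Swapping OLLAMA_BASE for DEFAULT_BASE should be the only difference between the two setups, which is why I am puzzled that one works and the other errors out.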
Thanks,