Currently, I am using your openai-compatible-conversation integration, and everything works perfectly. However, after discovering your newer custom integration, I tried it out and found it a bit too complex for me. I have also noticed that the same LLM seems to respond slightly faster through the openai-compatible-conversation integration than through the custom one, though I'm not certain. I would prefer to keep things simple, but it appears that the openai-compatible-conversation integration is no longer maintained. Thank you very much for your efforts.