Labels: backend (Backend Task, python)
Description
Currently, we track LLM usage solely through the OpenAI dashboard (we use OpenAI's top models to deliver the highest value to users, on the assumption that OpenAI is currently the best option for Omi users' use cases).
The problem is that there is no way to monitor usage by feature. If a new feature drives high usage, or an existing feature is abused because of a bad code merge, the issue is hard to detect and fix, and our OpenAI bill could explode.
That's why we need to track LLM usage at the feature level: which features are the top three consumers of LLM usage, and by how much?
Use the keywords `utils llm` and `processing conversation` in the `backend` folder to find the relevant code.
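
For illustration, here is a minimal sketch of one possible approach: a decorator that tags each LLM call with a feature name and accumulates the token usage reported in the OpenAI response object. The names below (`record_llm_usage`, `top_features`, the in-memory counters) are hypothetical and not existing Omi backend APIs; they only show the shape of the solution.

```python
# Hypothetical sketch: per-feature LLM usage tracking via a decorator.
# Assumes the wrapped function returns an OpenAI chat completion whose
# `usage` attribute exposes `prompt_tokens` and `completion_tokens`.
import functools
from collections import defaultdict
from threading import Lock

_usage_by_feature = defaultdict(
    lambda: {"prompt_tokens": 0, "completion_tokens": 0, "calls": 0}
)
_lock = Lock()


def record_llm_usage(feature: str):
    """Accumulate token usage of the wrapped LLM call under `feature`."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            response = fn(*args, **kwargs)
            usage = getattr(response, "usage", None)
            if usage is not None:
                with _lock:
                    stats = _usage_by_feature[feature]
                    stats["prompt_tokens"] += usage.prompt_tokens
                    stats["completion_tokens"] += usage.completion_tokens
                    stats["calls"] += 1
            return response
        return wrapper
    return decorator


def top_features(n: int = 3):
    """Return the top-n features by total tokens consumed."""
    with _lock:
        totals = {
            feature: stats["prompt_tokens"] + stats["completion_tokens"]
            for feature, stats in _usage_by_feature.items()
        }
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)[:n]
```

Usage would look like decorating the existing LLM helpers, e.g. `@record_llm_usage("conversation_processing")` on a function in the LLM utils module, then exposing `top_features()` for monitoring. The in-memory counters are only for illustration; in practice the numbers would need to be logged or exported to a metrics backend so they survive restarts and aggregate across instances.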