Status: Closed · Labels: feature (proposal to add a new feature)
Description
PROBLEM
Currently Sherpa can only interface with the OpenAI API, which is very limiting in terms of development and testing, and also in the range of use cases it can handle (e.g. using local files as a knowledge base without sending information to third-party providers).
SOLUTION
Refactor the LLM-handling components into their own module and, in parallel, research and set up a library that allows running local LLMs behind an API (e.g. https://ollama.ai/).
Challenges:
- How will this impact the default prompts in the system? Do we need to keep track of several sets of prompts?
- For deployed Sherpa we will continue using OpenAI — will this switch create too many logistical and manual steps between development and deployment?
- What tests and evaluation guardrails are necessary to ensure the system doesn't run into integration errors and misbehaviors?
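One way the refactor could look is a small backend-agnostic interface that both the OpenAI path and a local-LLM path implement, with a factory selecting the backend from configuration so the dev/deployment switch is a single setting. The sketch below is illustrative only — the class names (`LLMBackend`, `OpenAIBackend`, `OllamaBackend`, `get_backend`) and the injected `client` object are hypothetical, not Sherpa's actual code:

```python
from abc import ABC, abstractmethod


class LLMBackend(ABC):
    """Hypothetical common interface all LLM providers implement."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        """Return the model's completion for the given prompt."""


class OpenAIBackend(LLMBackend):
    """Wraps a hosted OpenAI-style client (injected, so tests need no network)."""

    def __init__(self, client):
        self.client = client

    def complete(self, prompt: str) -> str:
        return self.client.send(prompt)


class OllamaBackend(LLMBackend):
    """Would talk to a local Ollama server; here the transport is injected
    the same way, and we only tag the prompt with the chosen model name."""

    def __init__(self, client, model: str = "llama2"):
        self.client = client
        self.model = model

    def complete(self, prompt: str) -> str:
        return self.client.send(f"[{self.model}] {prompt}")


def get_backend(name: str, client) -> LLMBackend:
    """Pick a backend from a config value, e.g. 'openai' in prod, 'ollama' in dev."""
    backends = {"openai": OpenAIBackend, "ollama": OllamaBackend}
    if name not in backends:
        raise ValueError(f"unknown backend: {name}")
    return backends[name](client)
```

Because callers only see `LLMBackend`, the default prompts and evaluation tests can run against either backend, which directly bears on the prompt-tracking and guardrail questions above.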
ALTERNATIVES
One idea we considered was finding a way to use OpenAI for free or at lower cost (for example through their research grant programs). This does not solve the latter problem mentioned above (local use), and might also take a long time to acquire.
OTHER INFO
n/a