Anthropic changed their Python SDK, making this code line outdated:
c = anthropic.Client(os.environ["ANTHROPIC_API_KEY"])
Would love to know if this might help: https://github.com/BerriAI/litellm
A simple I/O library that standardizes all LLM API calls to the OpenAI call format.
import os
from litellm import completion

## set ENV variables
# ENV variables can be set in a .env file, too; see .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["ANTHROPIC_API_KEY"] = "anthropic key"

messages = [{"role": "user", "content": "Hello, how are you?"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# anthropic call
response = completion(model="claude-v-2", messages=messages)
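To sketch what "standardizes all LLM API calls to the OpenAI call" means in practice: the snippet below is an illustration only, not litellm's actual internals. The function name `to_openai_format` and the simplified inputs are hypothetical; the point is that a provider-specific completion gets wrapped in the OpenAI-style `choices[0].message.content` response shape, so downstream code reads every provider's reply the same way.

```python
# Toy adapter illustrating response standardization to the OpenAI shape.
# (Hypothetical helper for illustration; litellm handles this internally.)
def to_openai_format(raw_completion: str, model: str) -> dict:
    """Wrap a raw provider completion string in an OpenAI-style response dict."""
    return {
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": raw_completion},
                "finish_reason": "stop",
            }
        ],
    }

# Whatever the provider, callers read the reply the same way:
resp = to_openai_format("I'm doing well, thanks!", "claude-v-2")
print(resp["choices"][0]["message"]["content"])
```

This is why the two `completion(...)` calls above can be consumed identically, even though OpenAI and Anthropic return different native response formats.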
(Outdated line: flacuna/fastchat/llm_judge/common.py, line 440 in 8a712d8)