Feat: Add support for Pydantic's AzureProvider #168
base: main
Conversation
Summary of Changes

Hello @ankkprak, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request significantly enhances the application's flexibility by integrating support for Azure OpenAI models. It allows users to leverage Azure's AI services for both orchestrator and cypher functionalities, providing the necessary configuration options for API keys, endpoints, and API versions. The changes ensure that the application can seamlessly connect to and utilize Azure-hosted large language models.
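As a rough illustration of the configuration mechanism the summary describes, the sketch below shows how a settings field like `AZURE_OPEN_AI_API_VERSION` can be populated from the environment. It is a minimal sketch assuming the project's config is built on pydantic-settings; the `Settings` class here is a stand-in, not the actual `codebase_rag/config.py`, and the example value mirrors this PR's test fixtures.

```python
# Minimal sketch only: a stand-in Settings class, not the project's real config.
import os

from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # Optional, as in this PR: left unset unless the Azure OpenAI provider is used.
    AZURE_OPEN_AI_API_VERSION: str | None = None


os.environ["AZURE_OPEN_AI_API_VERSION"] = "2024-02-15-preview"
print(Settings().AZURE_OPEN_AI_API_VERSION)  # -> 2024-02-15-preview
```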
Code Review
This pull request successfully adds support for Pydantic's AzureProvider, including changes to configuration, the provider implementation, and associated tests. The implementation is solid, but I've identified a few areas for improvement. My feedback includes suggestions to enhance code readability in the configuration, correct misleading error messages for better user experience, and improve test coverage by adding a missing assertion. Overall, these are good changes that extend the application's capabilities.
```python
AZURE_OPEN_AI_API_VERSION: str | None = (
    None  # For models compatible with the OpenAI API, as specified in: https://ai.pydantic.dev/models/overview/#openai-compatible-providers
)
```
The comment for AZURE_OPEN_AI_API_VERSION is placed inside the parentheses of the value assignment. This is an unconventional style and can be confusing to read. For better readability and to adhere to common Python style practices, it's recommended to place the comment on the same line after the assignment or on a separate line before it.
```python
# OpenAI API Version for Azure
AZURE_OPEN_AI_API_VERSION: str | None = None  # For models compatible with the OpenAI API, as specified in: https://ai.pydantic.dev/models/overview/#openai-compatible-providers
```

```python
# Test cypher Azure OpenAI config
cypher_config = config.active_cypher_config
assert cypher_config.provider == "azure_openai"
assert cypher_config.model_id == "DeepSeek-V3.1"
assert cypher_config.api_key == "test-azure-key"
assert cypher_config.endpoint == "https://resource.openai.azure.com/"
```
The test for the cypher Azure OpenAI configuration is missing an assertion for api_version. Since AZURE_OPEN_AI_API_VERSION is a global setting, it should be applied to both the orchestrator and cypher configurations when they use the azure_openai provider. Adding this assertion will make the test more complete and ensure the configuration is applied correctly for both roles.
```diff
 # Test cypher Azure OpenAI config
 cypher_config = config.active_cypher_config
 assert cypher_config.provider == "azure_openai"
 assert cypher_config.model_id == "DeepSeek-V3.1"
 assert cypher_config.api_key == "test-azure-key"
 assert cypher_config.endpoint == "https://resource.openai.azure.com/"
+assert cypher_config.api_version == "2024-02-15-preview"
```
Fixed error messages

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
Thanks for this! PR #161 adds LiteLLM support (coming soon), which handles Azure OpenAI along with 100+ other providers - see https://docs.litellm.ai/docs/providers/azure/.

We'd suggest closing this PR to avoid maintaining separate provider classes for each cloud service. If you think there's value in having a dedicated Azure provider that LiteLLM doesn't cover, let us know - happy to hear your thoughts.
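For context, here is a hedged sketch of how Azure OpenAI is typically reached through LiteLLM's `azure/<deployment-name>` routing that the linked docs describe. The deployment name, key, endpoint, and API version below are placeholders, and this is not necessarily the exact wiring PR #161 uses.

```python
# Hedged sketch: Azure OpenAI via LiteLLM's provider routing.
# Deployment name, key, endpoint, and API version are placeholders.
import os

from litellm import completion

os.environ["AZURE_API_KEY"] = "your-azure-key"
os.environ["AZURE_API_BASE"] = "https://resource.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2024-02-15-preview"

response = completion(
    model="azure/my-deployment-name",  # "azure/<deployment>" routes to Azure OpenAI
    messages=[{"role": "user", "content": "Hello from the codebase RAG agent"}],
)
print(response.choices[0].message.content)
```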
This PR adds support for Pydantic's AzureProvider.
Files changed:
- `codebase_rag/config.py` - Added support for reading the `AZURE_OPEN_AI_API_VERSION` environment variable
- `codebase_rag/main.py` - Passing in the `api_version` variable
- `codebase_rag/services/llm.py` - Passing in the `api_version` variable
- `codebase_rag/providers/base.py` - Adding support for AzureProvider, based on existing implementations. The `create_model` function returns `OpenAIModel` because most Azure AI Foundry models (that is to say, models like Grok and DeepSeek) don't support the Responses API yet; see the sketch after this list. (See: )
- `codebase_rag/tests/test_provider_classes.py` and `codebase_rag/tests/test_provider_configuration.py` - Adds tests for the changes
- `.env.example` - Added an example for usage

The changes were tested with a private Azure AI Foundry deployment with the GPT-5 and DeepSeek-V3.1 models.
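The sketch below illustrates the wiring the `codebase_rag/providers/base.py` bullet describes: pydantic-ai's `AzureProvider` passed into the chat-completions `OpenAIModel` rather than a Responses-API model. It is a minimal standalone example that reuses the values from this PR's tests, not the actual `create_model` implementation.

```python
# Minimal sketch of AzureProvider + OpenAIModel, following the pydantic-ai docs;
# the model name, endpoint, key, and api_version mirror this PR's test fixtures.
from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.azure import AzureProvider

model = OpenAIModel(
    "DeepSeek-V3.1",
    provider=AzureProvider(
        azure_endpoint="https://resource.openai.azure.com/",
        api_version="2024-02-15-preview",
        api_key="test-azure-key",
    ),
)
agent = Agent(model)  # the agent now talks to the Azure-hosted deployment
```

Returning the chat-completions model class (rather than a Responses-API one) is what, per the PR description, keeps non-OpenAI Foundry deployments such as DeepSeek working.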