
Conversation

@ankkprak ankkprak commented Nov 24, 2025

This PR adds support for Pydantic's AzureProvider.

Files changed:

  • codebase_rag/config.py - Added support for reading the AZURE_OPEN_AI_API_VERSION environment variable
  • codebase_rag/main.py - Passes in the api_version variable
  • codebase_rag/services/llm.py - Passes in the api_version variable
  • codebase_rag/providers/base.py - Adds support for AzureProvider, based on the existing implementations. The create_model function returns an OpenAIModel because most Azure AI Foundry models (that is to say, models like Grok and DeepSeek) don't support the Responses API yet. (See: )
  • codebase_rag/tests/test_provider_classes.py and codebase_rag/tests/test_provider_configuration.py - Adds tests for the changes
  • .env.example - Added a usage example

The changes were tested against a private Azure AI Foundry deployment with the GPT-5 and DeepSeek-V3.1 models.

@gemini-code-assist (Contributor) commented

Summary of Changes

Hello @ankkprak, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request significantly enhances the application's flexibility by integrating support for Azure OpenAI models. It allows users to leverage Azure's AI services for both orchestrator and cypher functionalities, providing necessary configuration options for API keys, endpoints, and API versions. The changes ensure that the application can seamlessly connect to and utilize Azure-hosted large language models.

Highlights

  • Azure OpenAI Provider Integration: Introduced a new AzureOpenAIProvider class to support Pydantic's AzureProvider, enabling the use of Azure OpenAI models within the application.
  • API Version Configuration: Added an api_version field to ModelConfig and AZURE_OPEN_AI_API_VERSION to AppConfig to properly configure Azure OpenAI models, ensuring compatibility with specific API versions.
  • Configuration and Validation Updates: Modified configuration loading and validation logic across config.py, main.py, and llm.py to correctly handle the new Azure OpenAI provider and its associated api_version.
  • Expanded Provider Registry: The central PROVIDERS dictionary in base.py now includes azure_openai, making it discoverable and usable by the system.
  • Comprehensive Testing: Added new unit tests and updated existing ones in test_provider_classes.py and test_provider_configuration.py to ensure the correct functionality and configuration of the Azure OpenAI provider.
  • Environment Variable Examples: Updated .env.example with commented-out examples for configuring Azure OpenAI for both orchestrator and cypher roles, including API key, endpoint, and API version.

@gemini-code-assist bot left a comment

Code Review

This pull request successfully adds support for Pydantic's AzureProvider, including changes to configuration, the provider implementation, and associated tests. The implementation is solid, but I've identified a few areas for improvement. My feedback includes suggestions to enhance code readability in the configuration, correct misleading error messages for better user experience, and improve test coverage by adding a missing assertion. Overall, these are good changes that extend the application's capabilities.

Comment on lines +72 to +74
    AZURE_OPEN_AI_API_VERSION: str | None = (
        None  # For models compatible with the OpenAI API, as specified in: https://ai.pydantic.dev/models/overview/#openai-compatible-providers
    )

medium

The comment for AZURE_OPEN_AI_API_VERSION is placed inside the parentheses of the value assignment. This is an unconventional style and can be confusing to read. For better readability and to adhere to common Python style practices, it's recommended to place the comment on the same line after the assignment or on a separate line before it.

    # OpenAI API Version for Azure
    AZURE_OPEN_AI_API_VERSION: str | None = None  # For models compatible with the OpenAI API, as specified in: https://ai.pydantic.dev/models/overview/#openai-compatible-providers

Comment on lines +283 to +288
# Test cypher Azure OpenAI config
cypher_config = config.active_cypher_config
assert cypher_config.provider == "azure_openai"
assert cypher_config.model_id == "DeepSeek-V3.1"
assert cypher_config.api_key == "test-azure-key"
assert cypher_config.endpoint == "https://resource.openai.azure.com/"

medium

The test for the cypher Azure OpenAI configuration is missing an assertion for api_version. Since AZURE_OPEN_AI_API_VERSION is a global setting, it should be applied to both the orchestrator and cypher configurations when they use the azure_openai provider. Adding this assertion will make the test more complete and ensure the configuration is applied correctly for both roles.

Suggested change (the same block with the missing api_version assertion added):

# Test cypher Azure OpenAI config
cypher_config = config.active_cypher_config
assert cypher_config.provider == "azure_openai"
assert cypher_config.model_id == "DeepSeek-V3.1"
assert cypher_config.api_key == "test-azure-key"
assert cypher_config.endpoint == "https://resource.openai.azure.com/"
assert cypher_config.api_version == "2024-02-15-preview"

Fixed error messages

Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
@vitali87 (Owner) commented Dec 5, 2025

Thanks for this!

PR #161 adds LiteLLM support (coming soon), which handles Azure OpenAI along with 100+ other providers - see https://docs.litellm.ai/docs/providers/azure/

We'd suggest closing this PR to avoid maintaining separate provider classes for each cloud service.

If you think there's value in having a dedicated Azure provider that LiteLLM doesn't cover, let us know - happy to hear your thoughts.

@vitali87 vitali87 linked an issue Dec 5, 2025 that may be closed by this pull request

Development

Successfully merging this pull request may close these issues.

Enhancement: Add Azure OpenAI Support
