Conversation
Co-authored-by: rrusson <653188+rrusson@users.noreply.github.com>
rrusson
left a comment
The work on SemanticKernelClient is brilliant--you even nailed the tone of the chatbot. Just one suggestion. Can we remove global private variables that are only needed in the CTOR?
Removed the three private variables.
Pull request overview
This PR adds API key configuration support to enable remote AI model services (OpenAI, Azure, etc.) and implements chat context persistence so conversations maintain history across interactions. The implementation refactors SemanticKernelClient to initialize the kernel and chat service once in the constructor rather than per-request, and adds an exchange counter that triggers a fatigue response after 10 interactions.
Changes:
- Added optional `ApiKey` configuration parameter in `appsettings.json` (defaults to empty string for local Ollama)
- Refactored `SemanticKernelClient` to move kernel/service initialization and system prompt to constructor, enabling chat history persistence
- Added exchange counter with fatigue response after 10 turns
- Updated dependency injection in `Program.cs` to pass API key from configuration
- Added 6 unit tests covering constructor variants and edge cases
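The refactor described above can be sketched roughly as below. This is a hedged reconstruction, not the PR's actual code: the method name `StreamChatAsync`, the system-prompt text, and the fatigue message are assumptions; `_chatHistory`, `_exchangeCount`, the `apiKey` parameter, and the 10-turn threshold come from the PR description. It assumes Semantic Kernel's OpenAI connector overload that accepts a custom endpoint (used to point at Ollama's OpenAI-compatible API).

```csharp
using System.Text;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

public class SemanticKernelClient
{
    private const int MaxExchanges = 10; // fatigue threshold from the PR description

    private readonly IChatCompletionService _chatService;
    private readonly ChatHistory _chatHistory;
    private int _exchangeCount;

    public SemanticKernelClient(string serviceUrl, string model, string apiKey = "")
    {
        // "Comprehensive validation" per the PR; exact checks are an assumption.
        if (string.IsNullOrWhiteSpace(serviceUrl))
            throw new ArgumentException("Service URL is required.", nameof(serviceUrl));
        if (string.IsNullOrWhiteSpace(model))
            throw new ArgumentException("Model name is required.", nameof(model));

        // Build the kernel ONCE in the constructor (not per request).
        // An empty apiKey preserves the local-Ollama behavior.
        var builder = Kernel.CreateBuilder();
        builder.AddOpenAIChatCompletion(model, new Uri(serviceUrl), apiKey);
        var kernel = builder.Build();

        _chatService = kernel.GetRequiredService<IChatCompletionService>();
        _chatHistory = new ChatHistory("You are Clippy..."); // system prompt set once (text assumed)
    }

    public async IAsyncEnumerable<string> StreamChatAsync(string message)
    {
        if (++_exchangeCount > MaxExchanges)
        {
            yield return "I'm getting tired. Maybe we should take a break?"; // fatigue text assumed
            yield break;
        }

        _chatHistory.AddUserMessage(message);

        var full = new StringBuilder();
        await foreach (var chunk in _chatService
            .GetStreamingChatMessageContentsAsync(_chatHistory)
            .ConfigureAwait(false))
        {
            full.Append(chunk.Content);
            yield return chunk.Content ?? string.Empty;
        }

        // Append the assistant's reply so the next turn sees the full conversation.
        _chatHistory.AddAssistantMessage(full.ToString());
    }
}
```

Keeping `ChatHistory` as an instance field is what makes conversations coherent across turns; the earlier per-request kernel build discarded it after every call.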
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 10 comments.
| File | Description |
|---|---|
| SemanticKernelHelper/SemanticKernelClient.cs | Refactored to move kernel initialization to constructor, added chat history persistence, exchange counter, and API key support |
| ClippyWeb/appsettings.json | Added optional ApiKey configuration parameter |
| ClippyWeb/Program.cs | Updated dependency injection to pass API key to SemanticKernelClient |
| ClippyWeb.Tests/SemanticKernelClientTests.cs | Added unit tests for constructor variants and empty message handling |
| ClippyWeb.Tests/ClippyWeb.Tests.csproj | Added project reference to SemanticKernelHelper for testing |
rrusson
left a comment
Missed a spot with _kernel, but let's get this merged. Great work!
@Copilot, please do your best to address all unresolved comments on this PR. See my replies and ask for guidance if necessary.

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
YAML is an awful format; this should be XML or JSON.



Adds optional API key configuration to support remote model services (OpenAI, Azure, etc.), fixes chat context persistence so conversations maintain coherent history across interactions, and isolates chat sessions per user IP address to prevent cross-contamination.
Changes
- **Configuration**: Added optional `ApiKey` parameter to `appsettings.json` (defaults to empty string for local Ollama)
- **SemanticKernelClient refactor**:
  - Added `apiKey` parameter with comprehensive validation
  - Added `_chatHistory` and `_exchangeCount` for per-conversation state
  - Added `.ConfigureAwait(false)` to async foreach for consistency with codebase
- **Session Isolation**:
  - Added `IChatClientFactory` interface and `ChatClientFactory` implementation
  - Each session gets its own `SemanticKernelClient` instance with isolated chat history and exchange counter
  - Updated `Program.cs` to register factory instead of singleton client
  - Updated `ChatController` to use factory and retrieve client by IP address
- **Tests**: Added comprehensive unit tests (53 total)
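A minimal sketch of the per-IP session isolation described above. The interface and class names `IChatClientFactory` and `ChatClientFactory` come from the PR; the method name `GetClient` and the constructor parameters are assumptions.

```csharp
using System.Collections.Concurrent;

// One SemanticKernelClient per caller IP, so each session keeps its own
// chat history and exchange counter instead of sharing a singleton.
public interface IChatClientFactory
{
    SemanticKernelClient GetClient(string ipAddress);
}

public class ChatClientFactory : IChatClientFactory
{
    private readonly ConcurrentDictionary<string, SemanticKernelClient> _clients = new();
    private readonly string _serviceUrl;
    private readonly string _model;
    private readonly string _apiKey;

    public ChatClientFactory(string serviceUrl, string model, string apiKey)
    {
        _serviceUrl = serviceUrl;
        _model = model;
        _apiKey = apiKey;
    }

    public SemanticKernelClient GetClient(string ipAddress) =>
        // GetOrAdd makes the lookup thread-safe across concurrent requests.
        _clients.GetOrAdd(ipAddress,
            _ => new SemanticKernelClient(_serviceUrl, _model, _apiKey));
}
```

A controller would typically key the lookup on `HttpContext.Connection.RemoteIpAddress?.ToString()`. Note that IP-based isolation is a heuristic: users behind the same NAT share a session, which may be acceptable for this project's scope.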
Usage
For remote models, configure the API key:
```json
{
  "ServiceUrl": "https://api.openai.com/v1",
  "Model": "gpt-4",
  "ApiKey": "sk-..."
}
```

For local Ollama (existing behavior):

```json
{
  "ServiceUrl": "http://localhost:11434/v1",
  "Model": "neuraldaredevil-abliterated",
  "ApiKey": ""
}
```

Security
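To keep real keys out of source control, the `ApiKey` value can be supplied through user secrets or environment variables, which ASP.NET Core's default configuration builder already merges over `appsettings.json`. A sketch of the `Program.cs` wiring (the exact registration code is not shown here; key names match the `appsettings.json` example above):

```csharp
// Program.cs sketch — the ChatClientFactory registration replaces the old
// singleton SemanticKernelClient, per the PR's session-isolation change.
var builder = WebApplication.CreateBuilder(args);

string serviceUrl = builder.Configuration["ServiceUrl"] ?? "http://localhost:11434/v1";
string model = builder.Configuration["Model"] ?? "neuraldaredevil-abliterated";
string apiKey = builder.Configuration["ApiKey"] ?? ""; // empty => local Ollama, no key sent

builder.Services.AddSingleton<IChatClientFactory>(
    _ => new ChatClientFactory(serviceUrl, model, apiKey));

var app = builder.Build();
```

With this shape, `dotnet user-secrets set "ApiKey" "sk-..."` (or an `ApiKey` environment variable) overrides the empty default without touching the committed JSON.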