Refactor default system prompt to be closer to LLM client creation #106
jgieringer wants to merge 3 commits into main from
Conversation
Pull request overview
This PR standardizes the default system prompt across all LLM clients by moving the default into `LLMInterface`, so callers that omit `system_prompt` now get "You are a helpful AI assistant." automatically (unless they explicitly pass an empty string to disable it).
Changes:
- Introduces `DEFAULT_SYSTEM_PROMPT` in `LLMInterface` and applies it when `system_prompt is None` (preserving `""` as "no prompt").
- Updates `ConversationRunner` to pass through `system_prompt` from config without injecting its own fallback string.
- Updates unit/integration tests and docs to reflect the new default behavior and the "explicit empty string disables prompt" semantics.
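The defaulting described in the first bullet can be sketched roughly as follows. This is a minimal stand-in, not the repo's actual class: only `DEFAULT_SYSTEM_PROMPT` and `LLMInterface` come from the PR, and the constructor signature is an assumption.

```python
# Hedged sketch of the new LLMInterface defaulting semantics.
# Only DEFAULT_SYSTEM_PROMPT and the class name come from the PR;
# the constructor signature is an assumption.
DEFAULT_SYSTEM_PROMPT = "You are a helpful AI assistant."


class LLMInterface:
    def __init__(self, system_prompt=None):
        # Default only when the caller passes None; an explicit "" is
        # preserved and means "no system prompt" (default-on-None,
        # not default-on-falsy).
        if system_prompt is None:
            system_prompt = DEFAULT_SYSTEM_PROMPT
        self.system_prompt = system_prompt
```

The key design choice is testing `is None` rather than truthiness, so `""` survives as a deliberate opt-out.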
Reviewed changes
Copilot reviewed 10 out of 10 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `llm_clients/llm_interface.py` | Adds `DEFAULT_SYSTEM_PROMPT` and changes initialization semantics to default on `None` (not on falsy). |
| `generate_conversations/runner.py` | Removes runner-level default prompt injection and relies on `LLMInterface` defaulting. |
| `tests/integration/test_conversation_runner.py` | Adjusts integration expectations: runner passes `None`, base class applies default. |
| `tests/unit/llm_clients/test_llm_interface.py` | Adds coverage for default-on-`None` and preserving explicit empty string. |
| `tests/unit/llm_clients/test_openai_llm.py` | Updates "no system prompt" test to pass `system_prompt=""`. |
| `tests/unit/llm_clients/test_claude_llm.py` | Updates "no system prompt" test to pass `system_prompt=""`. |
| `tests/unit/llm_clients/test_gemini_llm.py` | Updates "no system prompt" test to pass `system_prompt=""`. |
| `tests/unit/llm_clients/test_azure_llm.py` | Updates "no system prompt" test to pass `system_prompt=""`. |
| `tests/unit/llm_clients/test_ollama_llm.py` | Updates several tests to explicitly disable system prompts where needed; asserts default prompt when `None`. |
| `docs/evaluating.md` | Clarifies conversation history flow and refines `conversation_id` wording. |
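The runner-side change in `generate_conversations/runner.py` amounts to forwarding the configured value unchanged instead of substituting a fallback. A hedged sketch — the function name, config shape, and `client_cls` parameter are all assumptions, not the repo's actual code:

```python
# Hypothetical sketch of the runner change: forward system_prompt as-is
# (possibly None) and let the LLM client's base class apply the default.
# Function name, config dict shape, and client_cls are assumptions.
def build_client(config, client_cls):
    # Before this PR, a runner-level fallback string would be injected here,
    # e.g. config.get("system_prompt") or SOME_FALLBACK.
    # After: pass the value through untouched, including None and "".
    return client_cls(system_prompt=config.get("system_prompt"))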
Pull request overview
Copilot reviewed 11 out of 11 changed files in this pull request and generated no new comments.
So... if I want to run an API endpoint without a system prompt... do I just leave that out of my generate response?
Yes! Since all of our work up to now has been on raw LLM calls, the system prompt has been required. I did something similar with start_prompt in the custom API LLM client. I could add another level of abstraction that is more for raw LLM clients that require system messages and such, but I wonder if it's too much?
I might not be understanding what's happening, but if you don't include one, will it revert to the default?
Description
This PR un-buries the default system prompt: all LLM clients now default to "You are a helpful AI assistant." unless the caller overrides it (or passes an empty string to disable it).
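The caller-facing semantics can be pinned down with a couple of pytest-style checks. The stub below is an assumption — `LLMInterface`'s real constructor may differ — but the asserted behavior matches what this PR describes:

```python
# Pytest-style sketch of the behavior described in this PR. The stub
# class is a stand-in; only the default string and the default-on-None /
# empty-string-disables semantics come from the PR itself.
DEFAULT_SYSTEM_PROMPT = "You are a helpful AI assistant."


class LLMInterface:  # minimal stand-in for the real base class
    def __init__(self, system_prompt=None):
        self.system_prompt = (
            DEFAULT_SYSTEM_PROMPT if system_prompt is None else system_prompt
        )


def test_default_applied_when_omitted():
    assert LLMInterface().system_prompt == DEFAULT_SYSTEM_PROMPT


def test_explicit_empty_string_disables_prompt():
    assert LLMInterface(system_prompt="").system_prompt == ""
```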
Issue
Resolves comment: https://github.com/SpringCare/VERA-MH/pull/104/changes/BASE..d4debf20595196156b6bc49c7608675379b5dd85#r2796492477