
fix: show warnings when no LLM provider configured for prompt actions #12

Merged
SeoFood merged 1 commit into main from issue-8 on Apr 5, 2026

Conversation

@SeoFood (Contributor) commented Apr 3, 2026

Summary

  • Show a MessageBox warning when executing a prompt action with no LLM provider available, instead of silently doing nothing (fixes #8: Prompt actions require manual LLM provider configuration per action)
  • Add per-action provider override dropdown to the prompt editor UI (the data model already supported it but the UI was missing)
  • Make the "(Default)" provider label context-aware: it now shows which provider the default resolves to, or "(Default - none configured)" when there is none
  • Show overlay feedback in dictation mode when a profile's prompt action is skipped due to missing provider
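The resolution order described above (per-action override first, then the global default, then a warning instead of a silent no-op) can be sketched roughly as follows. This is an illustrative Python sketch, not the project's actual code: the `Provider`, `PromptAction`, `resolve_provider`, and `execute_prompt_action` names are hypothetical, and the real UI shows a MessageBox rather than returning a string.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical data model; field names are illustrative only.
@dataclass
class Provider:
    name: str
    model: str

@dataclass
class PromptAction:
    title: str
    provider_override: Optional[Provider] = None  # per-action override

def resolve_provider(action: PromptAction,
                     global_default: Optional[Provider]) -> Optional[Provider]:
    """Per-action override wins; otherwise fall back to the global default."""
    return action.provider_override or global_default

def execute_prompt_action(action: PromptAction,
                          global_default: Optional[Provider]) -> str:
    provider = resolve_provider(action, global_default)
    if provider is None:
        # Previously this path silently did nothing; the fix surfaces a
        # warning (a MessageBox in the real app, a string here).
        return f"Warning: no LLM provider configured for action '{action.title}'"
    return f"Running '{action.title}' with {provider.name} / {provider.model}"
```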

Test plan

  • Create a prompt action with no provider override and no global default - executing should show a warning MessageBox
  • Verify the default provider dropdown shows the resolved provider name (e.g. "(Default - OpenAI / gpt-4o-mini)")
  • Verify the prompt editor now has a "Provider" dropdown to set per-action overrides
  • Verify per-action provider override persists after save and reload
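The context-aware dropdown label checked in the second test step above might be built like this. A minimal sketch under assumptions: the function name and the `(name, model)` tuple shape are invented for illustration; only the label strings come from the PR description.

```python
from typing import Optional, Tuple

def default_provider_label(global_default: Optional[Tuple[str, str]]) -> str:
    """Render the "(Default)" dropdown entry so it shows what it resolves to.

    global_default is a hypothetical (provider_name, model) pair, or None
    when no global default provider is configured.
    """
    if global_default is None:
        return "(Default - none configured)"
    name, model = global_default
    return f"(Default - {name} / {model})"
```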

@SeoFood SeoFood merged commit e9268a9 into main Apr 5, 2026


Development

Successfully merging this pull request may close these issues.

Prompt actions require manual LLM provider configuration per action
