
Feature/fixing llm long prompt and model settings #2

Open

ayessipovskiyDAG wants to merge 2 commits into bigx333:main from ayessipovskiyDAG:feature/fixing-llm-long-prompt-and-model-settings

Conversation


@ayessipovskiyDAG ayessipovskiyDAG commented Feb 22, 2026

feat:

  • Handle null/undefined facet properties in mergeCountMaps to prevent crashes
  • Pass prompts via stdin instead of command-line arguments to avoid OS argument-length limits
  • Make ANALYSIS_MODEL respect the CODEX_INSIGHTS_MODEL environment variable
  • Remove unused imports
  • Replace the OPENAI_API_KEY hard gate with a non-dry-run auth preflight
  • Fail fast with clear errors when the Codex CLI is missing or not logged in
  • Keep --dry-run behavior unchanged (no auth check, no LLM calls)
  • Document full-run vs dry-run auth behavior in the README
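A minimal sketch of the null-safe merge described in the first bullet, assuming a simple string-to-number `CountMap` shape (the type name and exact signature are illustrative, not the PR's actual code):

```typescript
// Hypothetical sketch of a null-safe count-map merge.
// A facet property that is null or undefined contributes nothing,
// instead of crashing on Object.entries(undefined).
type CountMap = Record<string, number>;

function mergeCountMaps(
  target: CountMap,
  source: CountMap | null | undefined
): CountMap {
  if (!source) return target; // guard: a missing facet property is a no-op
  for (const [key, count] of Object.entries(source)) {
    target[key] = (target[key] ?? 0) + count;
  }
  return target;
}
```

The guard clause is the whole fix: callers can pass whatever a facet object happens to contain without pre-checking each property.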

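The stdin-based prompt passing in the second bullet sidesteps OS argument-length limits (ARG_MAX on Linux, a much smaller per-process limit on Windows). A minimal Node.js sketch, assuming a hypothetical `runWithStdinPrompt` helper rather than the PR's actual code:

```typescript
import { spawn } from "node:child_process";

// Hypothetical helper: run a CLI and deliver the prompt on stdin
// instead of as an argv entry, so prompt length is not bounded by ARG_MAX.
function runWithStdinPrompt(
  command: string,
  args: string[],
  prompt: string
): Promise<string> {
  return new Promise((resolve, reject) => {
    const child = spawn(command, args, { stdio: ["pipe", "pipe", "inherit"] });
    let output = "";
    child.stdout?.on("data", (chunk) => (output += chunk));
    child.on("error", reject); // e.g. the CLI binary is missing entirely
    child.on("close", (code) =>
      code === 0
        ? resolve(output)
        : reject(new Error(`${command} exited with code ${code}`))
    );
    child.stdin?.write(prompt); // no argv length limit applies on a pipe
    child.stdin?.end();
  });
}
```

Because the prompt travels over a pipe, even multi-megabyte prompts work where `spawn(command, [prompt])` would fail with E2BIG (or silently truncate on some platforms).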
@ayessipovskiyDAG (Author) commented:

I like this section 😂
[image attachment]

