Conversation
Codex Review: Here are some suggestions.
Reply with "@codex fix comments" to fix any unresolved comments.
# Fetch the list of currently installed models and show them in a select box
available_models = [m["name"] for m in ollama.list().get("models", [])]

gen_model = st.selectbox("Generation model", options=available_models) if available_models else st.text_input(
    "Generation model name"
)
embed_models = [m for m in available_models if "embed" in m]
[P1] Guard ollama.list() when the daemon is unavailable
The script calls ollama.list() during app initialization and immediately subscripts its result. When the Ollama daemon is not installed or running (a common scenario on first launch), this call raises a connection error before Streamlit can render anything, leaving users with a crash instead of a helpful message or a fallback. Wrapping the call in a try/except and surfacing a warning would make the app usable even when models are not yet available.
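The suggested guard can be sketched by wrapping the model lookup in a try/except that falls back to an empty list. To keep the sketch runnable without a live daemon, `ollama.list()` is abstracted as a callable here; `safe_list_models` and `daemon_down` are hypothetical names, not part of the reviewed script.

```python
def safe_list_models(list_fn):
    """Return installed model names, or [] if the Ollama daemon is unreachable."""
    try:
        # Same subscripting as the original code, now guarded
        return [m["name"] for m in list_fn().get("models", [])]
    except Exception:
        # Daemon missing or not running: fall back instead of crashing
        return []

# Simulate the daemon being down on first launch
def daemon_down():
    raise ConnectionError("Ollama daemon not running")

print(safe_list_models(daemon_down))                               # []
print(safe_list_models(lambda: {"models": [{"name": "llama3"}]}))  # ['llama3']
```

In the Streamlit app, an empty result would trigger the existing `st.text_input` fallback, and a `st.warning` call could surface the connection problem to the user.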
Summary
- Added streamlit_multi_pdf_rag.py demonstrating multi-PDF retrieval-augmented QA using OpenAI models
- ollama pull

Testing
- python -m py_compile apps/openai_pdf_rag/streamlit_multi_pdf_rag.py
- pytest (fails: missing modules and Python 2 syntax in existing tests)

https://chatgpt.com/codex/tasks/task_e_68b5320a3cd48331ac4f053b8be70da4