@shiverin shiverin commented Feb 4, 2026

Description

Refactors the SICP chatbot backend to enforce one conversation per user and moves context building from init-time to chat-time for fresh, relevant responses.

Motivation: Previously, the system prompt (section summary + visible text) was built and stored permanently at conversation creation. This caused issues when users scrolled to different paragraphs: the chatbot's context became stale and irrelevant.

Changes:

  Before                                  After
  Multiple conversations per user         One conversation per user
  Context stored at init_chat             Context built fresh at each chat call
  POST /chats/:conversationId/message     POST /chats/message (no ID needed)

Files changed:

  • lib/cadet/chatbot/llm_conversations.ex - Added get_or_create_conversation/1, removed section/text params from conversation creation
  • lib/cadet_web/controllers/chat_controller.ex - init_chat now gets/creates conversation, chat receives section + context dynamically
  • lib/cadet_web/router.ex - Changed route from /chats/:conversationId/message to /chats/message
  • lib/cadet_web/views/chat_view.ex - Returns full messages array instead of last_message
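
The new chat-time flow can be sketched roughly as follows. The system prompt is rebuilt on every request from the client-supplied section and visible text, so it can never go stale. `build_prompt/2` and `generate_payload/2` appear in this PR's diff; the rest (the `current_user` assign, the reply rendering) is assumed plumbing, not the PR's literal code:

```elixir
# Sketch only: rebuild the system prompt per chat call instead of at init-time.
def chat(conn, %{
      "message" => user_message,
      "section" => section,
      "initialContext" => visible_text
    }) do
  # Assumes the authenticated user is in conn.assigns (typical Phoenix setup)
  user = conn.assigns.current_user

  with {:ok, conversation} <- get_or_create_conversation(user.id),
       # Context is derived from *this* request's section and visible text
       system_prompt <- Cadet.Chatbot.PromptBuilder.build_prompt(section, visible_text),
       payload <- generate_payload(conversation, system_prompt) do
    # ... send `payload` to the LLM, persist the reply, render the messages ...
  end
end
```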

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update
  • Code quality improvements

How to test

  1. Copy config/dev.secrets.exs.example to config/dev.secrets.exs
  2. Add the OpenAI configuration:

     config :openai,
       encryption_key: "<your_encryption_key>",
       api_key: "<your_openai_api_key>",
       http_options: [recv_timeout: 30_000, timeout: 30_000]

  3. Authenticate as a user
  4. POST /v2/chats - should return the existing conversation or create a new one
  5. POST /v2/chats/message with {message, section, initialContext} - should return the GPT response
  6. POST /v2/chats again - should return the same conversation with full history

Checklist

  • I have tested this code
  • I have updated the documentation

Comment on lines 63 to 67
def chat(conn, %{
      "message" => user_message,
      "section" => section,
      "initialContext" => visible_text
    }) do

Bug: The chat function crashes with a 500 error if required parameters are missing, instead of returning a 400 Bad Request as documented.
Severity: HIGH

Suggested Fix

Add a fallback function clause for the chat/2 function that catches any parameter mismatches. This clause should send a 400 Bad Request response with an informative error message, similar to the pattern used in answer_controller.ex:

def chat(conn, _params) do
  send_resp(conn, :bad_request, "Missing or invalid parameter(s)")
end
Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI agent. Verify if this is a real issue. If it is, propose a fix; if not, explain why it's not valid.

Location: lib/cadet_web/controllers/chat_controller.ex#L63-L67

Potential issue: The `chat` function in `ChatController` uses strict pattern matching
for the `message`, `section`, and `initialContext` parameters. If a request is sent
without one of these required parameters, the function clause will not match, raising a
`FunctionClauseError`. This results in a 500 Internal Server Error. The intended
behavior, as documented in the API specification and demonstrated in other controllers
within the codebase, is to return a 400 Bad Request for missing or invalid parameters.
The current implementation lacks a fallback function clause to handle such cases
gracefully.


Comment on lines 47 to +57

-  Conversation
-  |> where([c], c.user_id == ^user_id)
-  |> Repo.all()
-end
+  case get_conversation_for_user(user_id) do
+    {:ok, conversation} ->
+      Logger.info("User #{user_id} already has conversation #{conversation.id}")
+      {:ok, conversation}
-@spec create_conversation(binary() | integer(), binary(), binary()) ::
-        {:error, binary()} | {:ok, Conversation.t()}
-def create_conversation(user_id, section, visible_paragraph_texts)
-    when is_ecto_id(user_id) and is_binary(section) and is_binary(visible_paragraph_texts) do
-  Logger.info("Creating a new conversation for user #{user_id} in section #{section}")
+
+    {:error, {:not_found, _}} ->
+      Logger.info("Creating new conversation for user #{user_id}")
+      create_new_conversation(user_id)
+  end
+end

Bug: A race condition in get_or_create_conversation can create duplicate conversations for the same user because there is no unique database constraint on user_id.
Severity: HIGH

Suggested Fix

Add a unique index on the user_id column in the llm_chats table via a new migration. Then, update the Conversation changeset to use unique_constraint/2 to handle potential constraint violations gracefully. This will enforce the one-conversation-per-user invariant at the database level, preventing race conditions.
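
A minimal sketch of the suggested fix, assuming the table is named llm_chats (per the discussion below) and a typical Ecto setup; the migration module name is invented for illustration:

```elixir
# Hypothetical migration: enforce one conversation per user at the database level.
defmodule Cadet.Repo.Migrations.AddUniqueUserIndexToLlmChats do
  use Ecto.Migration

  def change do
    # Concurrent inserts for the same user now fail instead of duplicating rows
    create unique_index(:llm_chats, [:user_id])
  end
end
```

In the Conversation changeset, pair this with `unique_constraint(:user_id)` so a violation surfaces as a changeset error ({:error, changeset}) rather than a raised Postgres exception, letting `get_or_create_conversation` retry the lookup on conflict.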

Prompt for AI Agent

Location: lib/cadet/chatbot/llm_conversations.ex#L45-L57

Potential issue: The `get_or_create_conversation` function implements a check-then-act
pattern to find or create a user's conversation. It first queries for an existing
conversation and, if none is found, proceeds to insert a new one. However, the `user_id`
column in the `llm_chats` table does not have a unique constraint. This creates a race
condition where two simultaneous requests for the same new user can both determine that
no conversation exists and then both successfully insert a new record. This violates the
intended invariant of one conversation per user, leading to orphaned data in the
database.


Contributor

The race condition is because user_id in llm_chats isn't unique (as per 20240714233659_create_llm_chats_table.exs). If two requests come in, both could create a new chat at the same time.

We can run a migration to enforce uniqueness of user_id, which allows the database to block the second request from initialising a new chat. But the migration may be risky because some users may already have multiple conversations in the llm_chats table, and we will need to clean them up first. :(
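
The cleanup step could look something like this: keep each user's most recent conversation and delete the rest, run before the unique index is added. This is purely illustrative and assumes an integer primary key where a larger id means a newer row; verify against the real schema before running:

```elixir
import Ecto.Query

# For each user, find the id of the conversation to keep (the newest one)
keep_ids =
  from(c in "llm_chats",
    group_by: c.user_id,
    select: max(c.id)
  )

# Delete every conversation that is not the one we keep
from(c in "llm_chats", where: c.id not in subquery(keep_ids))
|> Cadet.Repo.delete_all()
```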

Comment on lines +79 to +80
system_prompt <- Cadet.Chatbot.PromptBuilder.build_prompt(section, visible_text),
payload <- generate_payload(updated_conversation, system_prompt) do

Bug: The chat function crashes if the initialContext parameter is null, as it's passed directly to a string concatenation operation without validation.
Severity: MEDIUM

Suggested Fix

Add validation within the chat controller to ensure the initialContext parameter is a string before it is passed to the build_prompt function. A guard clause or a case statement can be used to check the type of visible_text and return a 400 Bad Request if it is nil or not a binary.
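
One way to reject a null initialContext before it reaches build_prompt; a sketch, reusing the chat/2 head shown earlier in this review and assuming no other clauses intervene:

```elixir
# Guards ensure all three parameters are present AND are strings,
# so `nil` values fall through instead of crashing in build_prompt/2.
def chat(conn, %{
      "message" => user_message,
      "section" => section,
      "initialContext" => visible_text
    })
    when is_binary(user_message) and is_binary(section) and is_binary(visible_text) do
  # ... existing happy path ...
end

# Fallback clause: anything that doesn't match above gets a 400, not a 500
def chat(conn, _params) do
  send_resp(conn, :bad_request, "Missing or invalid parameter(s)")
end
```

This fallback also subsumes the missing-parameter bug reported earlier, since both failure modes now land in the same 400 clause.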

Prompt for AI Agent

Location: lib/cadet_web/controllers/chat_controller.ex#L79-L80

Potential issue: The `chat` function passes the `initialContext` parameter to
`Cadet.Chatbot.PromptBuilder.build_prompt`, where it is used in a string concatenation
operation. If a client sends a request where the `initialContext` key is present but its
value is `null`, the pattern match in the controller will succeed, but the
`build_prompt` function will attempt to concatenate a string with `nil`. This will raise
a runtime error (`ArgumentError: expected binary, got: nil`), causing an unhandled crash
and a 500 Internal Server Error. The function should validate that `initialContext` is a
non-nil string before use.


@coveralls

coveralls commented Feb 5, 2026

Coverage Status

coverage: 89.178% (+0.4%) from 88.739% when pulling fe1009e on shiverin:feature/chatbot-one-conversation-per-user into 90d6931 on source-academy:master.

@RichDom2185 RichDom2185 changed the title Merge chatbot feature: one-conversation-per-user logic Chatbot feature: one-conversation-per-user logic Feb 12, 2026
Member

@RichDom2185 RichDom2185 left a comment


LGTM, thanks!

@RichDom2185 RichDom2185 merged commit f6849af into source-academy:master Feb 12, 2026
2 checks passed