fix: prompt formation bug #5

Open
kumaranant1 wants to merge 1 commit into aleflabo:main from kumaranant1:bug-fix

Conversation


@kumaranant1 kumaranant1 commented Apr 3, 2026

```python
# ! Don't know why it was inside the loop
# >>>
prompt_builder = load_data(f"{CONTEXT_PROMPT_PATH}/context_prompt.json")
init = prompt_builder[prompt_context]["init"]
if remove_toySequence:
    prompt_ = f"{prompt}{init} {toy_class}\n"
else:
    prompt_ = f"{prompt}{init} {toy}\n"
# <<<
```

Right now, each newly formed prompt is appended to the previous ones, leading to this behaviour:

For the anticipation of the first symbol, the LLM gets this prompt:

```
<in-context-prompts>
Sequence type: crane
Input Sequence:
 -1
Next Symbol:
```
But for the anticipation of the second symbol, the LLM gets this prompt:

```
<in-context-prompts>
Sequence type: crane
Input Sequence:
 -1
Next Symbol:
Input Sequence:
 -1, attach-wheel
Next Symbol:
```

However, the expected prompt while anticipating the second symbol is:

```
<in-context-prompts>
Sequence type: crane
Input Sequence:
 -1, attach-wheel
Next Symbol:
```

Subsequently, while anticipating the third symbol, the LLM gets this prompt:

```
<in-context-prompts>
Sequence type: crane
Input Sequence:
 -1
Next Symbol:
Input Sequence:
 -1, attach-wheel
Next Symbol:
Input Sequence:
 -1, attach-wheel, attach-grill
Next Symbol:
```

However, the expected prompt while anticipating the third symbol is:

```
<in-context-prompts>
Sequence type: crane
Input Sequence:
 -1, attach-wheel, attach-grill
Next Symbol:
```
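The accumulation above can be reproduced in isolation. The sketch below is illustrative only (`CONTEXT`, `buggy_prompts`, and `fixed_prompts` are hypothetical names, not the repository's actual functions): the buggy version keeps mutating one prompt string across loop iterations, while the fix rebuilds each prompt from the fixed in-context prefix.

```python
# Minimal reproduction of the prompt-accumulation bug and its fix.
# All names here are illustrative, not the repository's actual code.

CONTEXT = "<in-context-prompts>\nSequence type: crane\n"

def buggy_prompts(symbols):
    """Prompt text carries over between iterations, so each new
    'Input Sequence' block is appended after all previous ones."""
    prompt = CONTEXT
    outputs = []
    for i in range(len(symbols)):
        seq = ", ".join(symbols[: i + 1])
        prompt += f"Input Sequence:\n {seq}\nNext Symbol:\n"
        outputs.append(prompt)
    return outputs

def fixed_prompts(symbols):
    """Rebuild the prompt from the fixed context on every iteration,
    so only the current sequence appears."""
    outputs = []
    for i in range(len(symbols)):
        seq = ", ".join(symbols[: i + 1])
        outputs.append(CONTEXT + f"Input Sequence:\n {seq}\nNext Symbol:\n")
    return outputs

symbols = ["-1", "attach-wheel", "attach-grill"]
buggy = buggy_prompts(symbols)
fixed = fixed_prompts(symbols)
```

Here `buggy[2]` contains three `Input Sequence:` blocks, matching the observed behaviour, while `fixed[2]` contains only the latest one, matching the expected behaviour.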
