
@kazord (Contributor) commented Apr 20, 2023

It was hard for me to merge, but I got it working. I had to revert the vec on the interactive stop, but the model seems to handle it well (at least in my tests).

For testing, I used something like this:

```
.\rllama.exe --f16 --param-path C:\LLaMA\vicuna-7b\config.json --model-path C:\LLaMA\vicuna-7b\ --tokenizer-path C:\LLaMA\tokenizer.model --start-interactive --interactive-system-prompt "A chat between a curious human and an artificial intelligence assistant.
The assistant gives helpful, detailed, and polite answers to the human's questions.
###Human:" --interactive-prompt-postfix "
###Assistant:" --interactive-stop "
###Human: "
```

I also added `--show-interactions` to display token generation for user interactions.

Hope you like it :)
(As a next step, I wanted to store/restore the chat session, but I need to figure out which data has to be saved/loaded; any help on that would be appreciated.)
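For the session save/restore idea, here is a minimal sketch of one possible approach, assuming the state worth persisting is just the token-id history of the conversation (which could be re-fed to the model on load to rebuild its context). The function names and the binary on-disk layout are my own invention, not existing rllama APIs:

```rust
use std::fs;
use std::io;

// Hypothetical helper: persist the conversation's token ids as
// little-endian u32 values, one after another.
fn save_session(path: &str, token_ids: &[u32]) -> io::Result<()> {
    let bytes: Vec<u8> = token_ids
        .iter()
        .flat_map(|t| t.to_le_bytes())
        .collect();
    fs::write(path, bytes)
}

// Hypothetical helper: read the file back and rebuild the token-id
// history so it can be replayed through the model.
fn load_session(path: &str) -> io::Result<Vec<u32>> {
    let bytes = fs::read(path)?;
    Ok(bytes
        .chunks_exact(4)
        .map(|c| u32::from_le_bytes([c[0], c[1], c[2], c[3]]))
        .collect())
}

fn main() -> io::Result<()> {
    // Dummy token ids standing in for a real tokenized chat history.
    let session = vec![1u32, 3290, 12, 42];
    save_session("session.bin", &session)?;
    let restored = load_session("session.bin")?;
    assert_eq!(session, restored);
    println!("restored {} tokens", restored.len());
    Ok(())
}
```

Replaying the saved tokens on load would rebuild the attention cache from scratch; persisting the cache itself would be faster but ties the file format to the model internals.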
