Bug: --chat-template seems to be broken now, no way to truly chat from the llama-cli #8053
Comments
I'm configuring the prompt as suggested by this comment: #6747 (comment), and it's worked pretty well.
@Deputation I agree; I see the same issue on my end with llama3-instruct 8b. I have been told to use the "right" prompt style, but even with the Llama3 prompt style it gives an OK response maybe once and then just produces random garbage in llama-cli. The same issue occurs with llama-server. Version b3080 works fine; after that, no luck. Also, when the context size gets exceeded over multiple questions, the model just stops generating altogether. I am still waiting on @ggerganov and the team for this issue. (Attachment: Screencast from 2024-06-21 11:10:00 AM.webm)
@Deputation try running llama-cli like this:
And then chat like so:
<|im_start|>user
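For context, the ChatML markers shown above come from a fixed turn layout: each message is wrapped in `<|im_start|>role ... <|im_end|>`, and the prompt ends with an open assistant header so the model generates the reply. A minimal sketch of building such a prompt (plain Python, independent of llama.cpp's own template code):

```python
def chatml_prompt(messages):
    """Render a list of {role, content} dicts in the ChatML layout.

    Each turn is wrapped in <|im_start|>role ... <|im_end|> markers;
    a trailing open assistant header cues the model to respond.
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

If the CLI is emitting special tokens correctly, markers of exactly this shape should be visible in its output.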
I do not know why it works, but it does; setting --chat-template to llama3 and using the correct reverse prompt for it does not work as expected in my hands. :( You can try this out with the llama.cui test branch: https://github.com/dspasyuk/llama.cui/tree/test
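The "reverse prompt" mentioned here is the string that, when it appears at the end of the generated text, hands control back to the user (llama-cli's `-r` flag). Conceptually the check is just a suffix match; a sketch of that idea (illustrative only, not llama.cpp's actual implementation, which matches at the token level):

```python
def hit_reverse_prompt(generated: str, antiprompts: list[str]) -> bool:
    """Return True if the generated text ends with any antiprompt,
    signalling that generation should stop and return to the user."""
    return any(generated.endswith(ap) for ap in antiprompts)

# Example: stop when the model emits the ChatML end-of-turn marker.
print(hit_reverse_prompt("Hi there!<|im_end|>", ["<|im_end|>"]))
```

If the template and the reverse prompt disagree (e.g. a llama3 template with a ChatML stop string), the suffix never matches and the model rambles on, which matches the behaviour described in this thread.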
What happened?
As per discussions:
#7837
#8009
It seems to be impossible to chat with llama3 8b properly. I have not tested this on 70b models, but even in the server UI the model just starts making notes to itself and outputs garbage / training data about how it should converse instead of actually conversing. Has something happened to the --chat-template chatml parameter? Even when the CLI is set to output special tokens, I do not see the ChatML tokens coming out.
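One likely failure mode worth noting: Llama 3 instruct models expect a turn layout that is entirely different from ChatML, so a mismatched --chat-template leaves the model conversing "out of format". A rough sketch of the Llama 3 layout as documented by Meta (illustrative, not llama.cpp's template engine):

```python
def llama3_prompt(messages):
    """Render messages in the Llama 3 instruct layout.

    Each turn: <|start_header_id|>role<|end_header_id|>\n\ncontent<|eot_id|>
    The prompt begins with <|begin_of_text|> and ends with an open
    assistant header so the model generates the reply.
    """
    out = ["<|begin_of_text|>"]
    for msg in messages:
        out.append(
            f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
            f"{msg['content']}<|eot_id|>"
        )
    out.append("<|start_header_id|>assistant<|end_header_id|>\n\n")
    return "".join(out)

print(llama3_prompt([{"role": "user", "content": "Hi"}]))
```

Comparing this against the ChatML markers makes it easy to see why feeding one model the other format yields self-notes and garbage rather than a conversation.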
Name and Version
version: 3158 (5239925)
What operating system are you seeing the problem on?
Linux
Relevant log output
No response