
Warn user if a context size greater than 2048 tokens is specified #274

Merged · 1 commit · Mar 19, 2023

Conversation

@Ronsor (Contributor) commented Mar 18, 2023

LLaMA doesn't support context sizes greater than 2048 tokens, and going above that produces terrible results. Larger sizes are still allowed, but a warning is printed so users don't mistake the degraded output for a bug in the software.
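For illustration, a minimal sketch of the kind of check this PR describes. The struct and field names (`gpt_params`, `n_ctx`) are assumptions modeled on llama.cpp conventions, not necessarily the exact code merged here:

```cpp
#include <cstdio>

// Hypothetical parameter struct; llama.cpp keeps the requested
// context size in a field like this (name assumed for illustration).
struct gpt_params {
    int n_ctx = 512; // context size in tokens
};

int main() {
    gpt_params params;
    params.n_ctx = 4096; // e.g. the user passed a context size of 4096

    // LLaMA was trained with a 2048-token context window. Larger values
    // are still accepted, but warn that output quality will degrade.
    if (params.n_ctx > 2048) {
        fprintf(stderr,
                "warning: model does not support context sizes greater than "
                "2048 tokens (%d specified); expect poor results\n",
                params.n_ctx);
    }
    return 0;
}
```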

@j-f1 merged commit d7def1a into ggerganov:master on Mar 19, 2023.
@j-f1 (Collaborator) commented Mar 19, 2023

Thank you!

Deadsg pushed a commit to Deadsg/llama.cpp that referenced this pull request on Dec 19, 2023: "low_level_api_chat_cpp.py: Fix missing antiprompt output in chat."