bug: Model failed to load with status code: 500, could not load engine llamacpp #1422
Comments
v154 - also getting this error. Workaround: ... which should have been pre-installed. cc @hiento09 - discord discussion
Model failed to load with status code: 500, could not load engine llamacpp - should be solved by #1369?
Copying Hien's post from #1396
Proposal: Investigating -> Feature (high pain) -> Eng Planning
Decision from Dan / Nicole:
Closing this bug as resolved.
Goal
Current State
Running llama3.1 failed with model status code 500:
The real error message is in the logs:
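The 500 status on its own is not very informative; the actual cause (here, the llamacpp engine failing to load) only shows up in the response body or the server logs. Below is a minimal sketch of surfacing that detail from a client. The base URL, port, endpoint path, and request payload are assumptions for illustration, not the confirmed API; adjust them to your install.

```python
# Minimal sketch (assumed local server at http://127.0.0.1:39281 with an
# assumed POST /v1/models/start endpoint taking {"model": ...}).
import json
import urllib.error
import urllib.request

BASE_URL = "http://127.0.0.1:39281"  # assumed default local API address
MODEL = "llama3.1"

req = urllib.request.Request(
    f"{BASE_URL}/v1/models/start",
    data=json.dumps({"model": MODEL}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

try:
    with urllib.request.urlopen(req) as resp:
        print("Model loaded:", resp.status, resp.read().decode("utf-8"))
except urllib.error.HTTPError as err:
    # A bare "status code: 500" hides the cause; the response body or the
    # server logs usually carry the real message, e.g. the engine that
    # failed to load.
    print("Model failed to load:", err.code)
    print("Server response:", err.read().decode("utf-8", errors="replace"))
```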