having this error #1019

Having this error, how do I resolve it?

(Screenshot from 2023-09-07 02-04-59: attached screenshot showing the error.)

Comments
Hi @hiqsociety, thanks for your feedback. Could you show us more information, such as which version of LocalAI you are using? And could you please change the title to something more descriptive of the issue?
Hi @Aisuko, I'm encountering the same issue. I'm not quite sure where to check the version, but I did a git clone recently and followed your "Example: Use GPT4ALL-J model" guide. In the logs from the docker container, the connection error shows up after sending the request.
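If you also need to report which build you are on, one generic way is to read the image tag of the running container; a minimal sketch, assuming the container is named localai (adjust the name to whatever `docker ps` shows):

```sh
# Show the image (and its tag) that each matching container is running.
# The "localai" name filter is an assumption; adjust to your setup.
docker ps --filter "name=localai" --format "{{.Names}}: {{.Image}}"

# Or inspect a specific container for the image it was created from.
docker inspect --format '{{.Config.Image}}' localai
```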
I'm getting the same issue. The lunademo works, but I get these responses almost all the time with other models (even low-RAM ones).
I am also getting those errors; the version from a month ago did not show them. It is as if some kind of RPC server were running by default, even when the gRPC-specific line was commented out. Maybe a code error?
I've been dabbling with this a little, and my still-uneducated guess is that you haven't set the THREADS count or haven't specified your GPU/backend composition (SINGLE_ACTIVE_BACKEND=true if only one is available). Setting THREADS to the vCPU count was what I needed in my case. It's intuitive in hindsight, but the defaults aren't very verbose, so it took a bit of trial and error to figure that out. See the .env sketch below.
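A minimal .env sketch of that suggestion; the value 4 is only an illustration for a 4-vCPU host, not a recommended default:

```sh
# .env — match THREADS to the machine's vCPU count (4 is an assumed example).
THREADS=4

# Keep only one backend loaded at a time, e.g. when a single GPU is available.
SINGLE_ACTIVE_BACKEND=true
```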
As for me, changing the port in the .env/Dockerfile solved the problem. Still wondering why...
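For reference, a sketch of that kind of change, assuming the default setup where LocalAI listens on 8080 inside the container; the host port 9095 is an arbitrary, illustrative choice:

```sh
# Map an alternative host port (9095, arbitrary) to the container's 8080.
# The image reference is the project's published image; the tag is an assumption.
docker run -p 9095:8080 quay.io/go-skynet/local-ai:latest
```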