
[BUG] Can't connect to LM Studio on https://get.big-agi.com/ #467

Closed
rikyotei opened this issue Mar 28, 2024 · 3 comments
Labels
type: bug Something isn't working

Comments

@rikyotei

Description

[screenshot]
I got this error on https://get.big-agi.com/. Can you please take a look at this?

This problem does not exist when I run the project locally. By the way, I have already enabled CORS, so I don't think it is a CORS problem.

I have also debugged in the browser: no request is made to http://localhost:1234 when I click the Models button.

Device and browser

Windows 11, Chrome

Screenshots and more

No response

Willingness to Contribute

  • 🙋‍♂️ Yes, I would like to contribute a fix.
@rikyotei rikyotei added the type: bug Something isn't working label Mar 28, 2024
@enricoros
Owner

Yes, the issue here (fully expected) is that it's the backend of Big-AGI that is trying to connect to LM Studio.

When you run locally, the frontend and backend are both running on localhost. But when you use the official website, the backend runs on Vercel's servers, while the frontend runs in your browser (localhost).

So the backend, which is running on some server somewhere, tries to connect to a port on its own "localhost", and it fails because LM Studio is running on your computer.

Please take a look at #276 (comment) for the same issue, with a description and a diagram.
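The key point above, that "localhost" always resolves relative to the machine issuing the request, can be sketched with a small (hypothetical, not Big-AGI) snippet:

```python
import socket

# "localhost" resolves to the loopback address of whichever machine runs
# this code. On a Vercel server it names the *server's own* loopback, so a
# backend request to http://localhost:1234 can never reach the LM Studio
# server listening on the user's computer.
print(socket.gethostbyname("localhost"))  # 127.0.0.1 (the local loopback)
```

Running this on the hosted backend and on the user's machine prints the same loopback address, but each one points at a different physical host, which is exactly why the hosted deployment cannot see a locally running LM Studio.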

@rikyotei
Author


@enricoros Thank you for your help.

@PylotLight


By the way, if you exposed your LM Studio API endpoint publicly via ngrok, you might be able to get it to work.
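A rough sketch of that workaround (assuming LM Studio's local server listens on its default port 1234 and the ngrok client is installed; the forwarding hostname below is a placeholder, random per session):

```shell
# Start LM Studio's local server first, then open a public tunnel to it.
# ngrok prints a public forwarding URL that proxies to localhost:1234.
ngrok http 1234

# Example forwarding line from ngrok's console output:
#   Forwarding  https://<random>.ngrok-free.app -> http://localhost:1234
# Paste that https URL (instead of http://localhost:1234) into Big-AGI's
# LM Studio settings, so the Vercel-hosted backend can reach your machine.
```

Note that this makes your local API reachable from the public internet, so it is only a stopgap for testing, not a recommended permanent setup.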
