
chatqna-xeon-backend-server crashes due to ports not defined #652

Closed
liuyingshanx opened this issue Aug 22, 2024 · 3 comments

@liuyingshanx

What I did:
I followed https://github.com/opea-project/GenAIExamples/blob/main/ChatQnA/docker/xeon/README_qdrant.md and completed all the steps to set up ChatQnA via docker compose, using Qdrant as the vector DB. I rebuilt all images from the latest main branch (Aug. 22, ac324a9 is the latest commit id), including opea/chatqna:latest, and applied the docker compose yaml file, roughly as sketched below.
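
For reference, the setup boils down to something like the following (a sketch only; the clone path, environment variables, and compose file name are assumptions based on the README, not copied from it):

# clone the repo and move to the Xeon docker directory (path assumed)
git clone https://github.com/opea-project/GenAIExamples.git
cd GenAIExamples/ChatQnA/docker/xeon

# environment described in the README (token value is a placeholder)
export host_ip=$(hostname -I | awk '{print $1}')
export HUGGINGFACEHUB_API_TOKEN="your-hf-token"

# bring up the Qdrant variant of the stack (compose file name assumed)
docker compose -f compose_qdrant.yaml up -d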

What happens:
The mega-service container chatqna-xeon-backend-server crashes with the following error:
(screenshot of the error log omitted)

How I worked around it:
Before applying the docker compose yaml file, set these environment variables:

export EMBEDDING_SERVICE_PORT=6044
export RETRIEVER_SERVICE_PORT=6045
export RERANK_SERVICE_PORT=6046
export LLM_SERVICE_PORT=6047
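
After exporting them, re-apply the compose file and confirm the backend container stays up (a sketch; the container name is taken from the issue title, the compose invocation is assumed):

docker compose up -d
# the backend should now stay in the "Up" state instead of restarting
docker ps --filter name=chatqna-xeon-backend-server
docker logs chatqna-xeon-backend-server --tail 20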

@louie-tsai (Collaborator) commented Sep 3, 2024

@liuyingshanx
There is a fix for exporting the ports into the docker instances in the PR below:
#569
(screenshot of the PR diff omitted)

Could you try again with the latest code?

Thanks.
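
If it helps, one way to verify that the port variables actually reach the backend after pulling the latest code (a sketch; the container name comes from the issue title and the variable names from the workaround above):

# the four *_SERVICE_PORT variables should show up in the container environment
docker exec chatqna-xeon-backend-server env | grep SERVICE_PORT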

@louie-tsai (Collaborator)

@liuyingshanx
If there is no more issue, we will close the ticket accordingly.

@louie-tsai (Collaborator)

Feel free to reopen it if the issue still exists.
