Multilingual-support microservice #259

Open

siddhivelankar23 wants to merge 18 commits into main

Conversation

siddhivelankar23

No description provided.

```python
@traceable(run_type="multilingual-support")
@register_statistics(names=["opea_service@multilingual_support"])
def multilingual(input: LLMParamsDoc) -> Union[LLMParamsDoc, ChatCompletionRequest, SearchedDoc]:
```
Collaborator

LLMParamsDoc only contains the user's query; where do you plan to store the RAG's response?

Author

The input should be GeneratedDoc (so it contains both the input query and the LLM response); I have updated the RFC accordingly.
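For illustration only (this is not code from the PR): a minimal sketch of what the updated entry point could look like, assuming GeneratedDoc exposes the original query as `prompt` and the LLM response as `text`, and that the import paths follow the usual GenAIComps conventions.

```python
# Illustrative sketch only; field names and import paths are assumptions
# and should be checked against the GenAIComps repo.
from langsmith import traceable

from comps import GeneratedDoc, register_statistics


@traceable(run_type="multilingual-support")
@register_statistics(names=["opea_service@multilingual_support"])
def multilingual(input: GeneratedDoc) -> GeneratedDoc:
    # input.prompt is assumed to hold the user's original query and
    # input.text the LLM's response, so the service can translate the
    # generated answer back into the language of the query.
    translated_text = input.text  # placeholder for the actual translation call
    return GeneratedDoc(text=translated_text, prompt=input.prompt)
```

In that shape the microservice maps GeneratedDoc to GeneratedDoc, which keeps it composable with the downstream stages of the RAG pipeline.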

Collaborator

Why not ensure the LLM generates output in the same language as the query during the first LLM inference, instead of introducing an additional microservice? Adding another service significantly increases deployment costs. It would be more efficient to select an LLM within the RAG pipeline that supports both input and output languages. Are there specific application scenarios that justify this approach?

Author

We tested many models that support multiple languages but noticed a significant tradeoff between answer accuracy and multilingual support. Most of these models would not respond accurately in languages other than English in all cases, so we set up this microservice for our use case and now want to upstream it to OPEA.
