Multilingual-support microservice #259

Open · wants to merge 18 commits into base: main
Binary file added community/rfcs/assets/multilingual-support.png
40 changes: 40 additions & 0 deletions community/rfcs/multilingual-support.md
Collaborator:
Why not ensure the LLM generates output in the same language as the query during the first LLM inference, instead of introducing an additional microservice? Adding another service significantly increases deployment costs. It would be more efficient to select an LLM within the RAG pipeline that supports both input and output languages. Are there specific application scenarios that justify this approach?

Author:
We tested many models that support multiple languages but observed a significant tradeoff between answer accuracy and multilingual support. Most of these models did not respond accurately in languages other than English in all cases, so we built this microservice for our own use and now want to upstream it to OPEA.

@@ -0,0 +1,40 @@
# Multilingual-support microservice

## Author

[Siddhi Velankar](https://github.com/siddhivelankar23)

## Status

Under review

## Objective

The objective of this RFC is to propose a separate multilingual-support microservice that handles translation and can be used seamlessly with other microservices. Currently, translation exists only as a separate megaservice, which makes it difficult to integrate with other standalone microservices or applications. This proposal aims to simplify and standardize multilingual support through a dedicated, scalable, and easily integrable microservice that interacts cleanly with other services within the system architecture. The goal is to provide a flexible, centralized approach to translation and to enhance the user experience across languages.

## Motivation

In the current system, translation is managed by a monolithic megaservice, which is challenging to integrate with other microservices.
By introducing multilingual support as a standalone microservice, we aim to decouple the translation functionality from other services, making it easier to scale and manage.


## Design Proposal

![Multilingual support microservice](./assets/multilingual-support-diagram.png)


The proposed architecture introduces a new microservice called multilingual-support.
This microservice does the following:
1. Detects the language of the user's query as well as the language of the response from the first LLM microservice.
2. Configures a translation prompt to convert the answer from the response language to the query language.
3. Sends this prompt to the second LLM microservice to generate the final answer.

This ensures seamless, accurate communication across different languages in real time.

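To make steps 1 and 2 concrete, here is a minimal illustrative sketch, assuming the third-party `langdetect` package for language identification; the helper name and prompt wording are placeholders, not part of the proposal:

```python
# Illustrative sketch only; `build_translation_prompt` and the prompt text are assumptions.
from langdetect import detect  # third-party language-identification package


def build_translation_prompt(query: str, answer: str) -> str:
    """Detect the query and answer languages and build a prompt for the second LLM."""
    query_lang = detect(query)    # e.g. "de" for a German query
    answer_lang = detect(answer)  # e.g. "en" for an English answer

    if query_lang == answer_lang:
        # Languages already match; no translation prompt is needed.
        return answer

    return (
        f"Translate the following text from {answer_lang} to {query_lang}. "
        f"Return only the translated text.\n\n{answer}"
    )
```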

Signature of the multilingual-support microservice:
```python
@traceable(run_type="multilingual-support")
@register_statistics(names=["opea_service@multilingual_support"])
def multilingual(input: GeneratedDoc) -> Union[LLMParamsDoc, ChatCompletionRequest, SearchedDoc]:
```
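For context, a downstream component might call the proposed service over HTTP roughly as follows. The port, route, and payload field names are assumptions for illustration and would be fixed when the service is registered:

```python
# Hypothetical invocation; the port, route, and payload fields are assumptions.
import requests

payload = {
    "prompt": "Wie hoch ist der Eiffelturm?",              # original user query (German)
    "text": "The Eiffel Tower is about 330 metres tall.",  # answer from the first LLM microservice
}

response = requests.post("http://localhost:9000/v1/multilingual_support", json=payload)
print(response.json())  # translation request forwarded to the second LLM microservice
```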