Query and display multiple LLM responses simultaneously - backend
The user wants to compare the answers from multiple LLMs. To facilitate this, we could send each query to more than one LLM simultaneously. We can do that by allowing the user to select which of the LLMs available in the app their query should be sent to. Task #204, the frontend counterpart of this task, will implement a checkbox inside the dropdown that lets the user select the LLMs. After the user has selected the active LLMs and makes a query, the query should be fed to all of the selected LLMs, and their answers should be parsed and presented in parallel to the frontend chat bubble components built in #204.
This task requires collaboration with the person building task #204.
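A minimal sketch of how the backend fan-out might look, assuming an async Python backend; query_llm is a hypothetical placeholder for the app's real LLM client, and the model names are illustrative, not the models actually available in the app:

```python
import asyncio

async def query_llm(model: str, prompt: str) -> str:
    """Placeholder for the app's real LLM client call (hypothetical helper)."""
    await asyncio.sleep(0)  # stand-in for the real network call
    return f"[{model}] answer to: {prompt}"

async def query_selected_llms(prompt: str, selected_models: list[str]) -> dict[str, str]:
    """Fan the same prompt out to all selected LLMs concurrently and
    collect their answers keyed by model name."""
    tasks = [query_llm(model, prompt) for model in selected_models]
    answers = await asyncio.gather(*tasks, return_exceptions=True)
    return {
        # Surface per-model failures instead of failing the whole request.
        model: (f"error: {answer}" if isinstance(answer, BaseException) else answer)
        for model, answer in zip(selected_models, answers)
    }

if __name__ == "__main__":
    result = asyncio.run(query_selected_llms("example query", ["model-a", "model-b", "model-c"]))
    print(result)
```

Using asyncio.gather keeps the latency close to that of the slowest selected model rather than the sum of all of them, which matters once several LLMs are selected at once.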
User Story
As a user,
I want to select multiple LLMs and receive their responses in parallel chat bubbles,
so that I can analyze the differences in their answers.
Acceptance Criteria
After the user has selected the LLMs they want answers from, the query is sent to all of them.
The reply from each LLM is presented in its own subsection of the chat bubble component, visually distinct from the replies of the other LLMs (see the illustrative payload below).
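One possible shape for the aggregated response the backend could return per query, so that the frontend in #204 can render one chat-bubble subsection per model. The field names are purely illustrative and would need to be agreed with the author of #204:

```python
# Illustrative only: an aggregated payload with one entry per selected LLM,
# which #204 could map onto distinct subsections of the chat bubble component.
example_response = {
    "query": "example query",
    "answers": [
        {"model": "model-a", "text": "First model's parsed answer..."},
        {"model": "model-b", "text": "Second model's parsed answer..."},
    ],
}
```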
Definition of Done
The feature has been fully implemented.
The feature has been manually tested and works as expected without critical bugs.
The feature code is documented with clear explanations of its functionality and usage.
The feature code has been reviewed and approved by at least one team member.
The feature branches have been merged into the main branch and closed.
The feature's utility, function, and usage have been documented in the respective project wiki on GitHub.
Dependencies
This is the backend counterpart of the frontend task:
Both of them also rely on:
Domain
app backend