
Update LLM microservice output to binary #123

Merged: 1 commit into opea-project:main from llm/output_binary on May 31, 2024

Conversation

letonghan
Collaborator

Description

Update the LLM microservice output to binary so that it matches the format expected by the frontend.
The output format of all related examples that use the LLM microservice will change accordingly.
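For context, the change amounts to encoding each streamed chunk to bytes before it leaves the microservice. The sketch below is a minimal illustration, not the actual diff: the endpoint path, generator name, and placeholder token stream are assumptions, and only the text-to-binary conversion reflects what this PR describes.

```python
# Minimal sketch of converting streamed LLM output from text to binary.
# The endpoint, generator name, and fake token stream are assumptions for
# illustration; only the str -> bytes conversion reflects this PR.
from fastapi import FastAPI
from fastapi.responses import StreamingResponse

app = FastAPI()

def stream_generator(query: str):
    # Placeholder for the real token stream produced by the LLM backend.
    for chunk in ["Hello", ", ", "world", "!"]:
        # The frontend expects a binary stream, so each text chunk is
        # encoded to bytes before being yielded.
        yield f"data: {chunk}\n\n".encode("utf-8")

@app.post("/v1/chat/completions")
async def chat_completions(request: dict):
    # StreamingResponse now emits bytes chunks instead of str chunks.
    return StreamingResponse(stream_generator(request.get("query", "")),
                             media_type="text/event-stream")
```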

Issues

n/a

Type of change

List the type of change below. Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds new functionality)
  • Breaking change (fix or feature that would break existing design and interface)

Dependencies

None

Tests

Tested locally and integration tested with the frontend. A hypothetical way to inspect the streamed binary output locally is sketched below.
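The host, port, endpoint, and payload in this check are assumptions based on typical OPEA LLM microservice setups, not taken from this PR:

```python
# Hypothetical local verification; host, port, endpoint, and payload are
# assumptions and may differ from the actual microservice configuration.
import requests

resp = requests.post(
    "http://localhost:9000/v1/chat/completions",
    json={"query": "What is OPEA?"},
    stream=True,
)
for chunk in resp.iter_content(chunk_size=None):
    # Each streamed chunk now arrives as raw bytes rather than decoded text.
    print(chunk)
```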

@lvliang-intel lvliang-intel merged commit 3448b6f into opea-project:main May 31, 2024
6 checks passed
ganesanintel pushed a commit to ganesanintel/GenAIComps that referenced this pull request Jun 3, 2024
XinyuYe-Intel pushed a commit that referenced this pull request Jun 17, 2024
lkk12014402 pushed a commit that referenced this pull request Aug 8, 2024
@letonghan letonghan deleted the llm/output_binary branch September 23, 2024 03:21