
Refactor LLM text generation native comps #1151

Merged — 1 commit merged into opea-project:main on Jan 16, 2025

Conversation

@XinyaoWa (Collaborator) commented Jan 15, 2025

Description

This PR is part of the LLM text generation code refactor.
It removes the duplicated native langchain and llama_index folders and consolidates the optimum-habana implementation into a native integration, OPEATextGen_Native.

Not included in this PR and to be refined later: README, docker_compose, OpenAI-compatible API.
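To make the shape of the change concrete, below is a minimal, hypothetical sketch of what a single native text-generation component might look like once the langchain and llama_index variants are folded into one backend. Only the name OPEATextGen_Native comes from this PR's description; the class layout, method names, and the use of plain `transformers` generation are illustrative assumptions, not the actual GenAIComps code.

```python
# Hypothetical sketch of a consolidated native text-generation backend.
# On Gaudi hardware, the real component would route model loading and
# generation through optimum-habana (the listed dependency); plain
# transformers is used here only to keep the sketch self-contained.
from transformers import AutoModelForCausalLM, AutoTokenizer


class OPEATextGen_Native:
    """One native backend in place of the separate langchain / llama_index folders."""

    def __init__(self, model_name: str):
        self.tokenizer = AutoTokenizer.from_pretrained(model_name)
        self.model = AutoModelForCausalLM.from_pretrained(model_name)

    def generate(self, prompt: str, max_new_tokens: int = 128) -> str:
        # Tokenize, generate, and decode a single completion.
        inputs = self.tokenizer(prompt, return_tensors="pt")
        output_ids = self.model.generate(**inputs, max_new_tokens=max_new_tokens)
        return self.tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Model name is arbitrary; any causal LM checkpoint would do for the sketch.
    comp = OPEATextGen_Native("Qwen/Qwen2-0.5B-Instruct")
    print(comp.generate("What is OPEA?"))
```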

Issues

#998

Type of change


  • Others (enhancement, documentation, validation, etc.)

Dependencies

optimum-habana

Tests

tests/llms/test_llms_text-generation_native_on_intel_hpu.sh

@XinyaoWa XinyaoWa force-pushed the refactor_llm_native branch 2 times, most recently from ae56ff7 to 0c74bf5 on January 16, 2025 03:15
Part of the code refactor to combine different text generation backends: remove the duplicated native langchain and llama_index folders and consolidate the optimum-habana implementation into a native integration, OPEATextGen_Native.

Add feature for issue opea-project#998

Signed-off-by: Xinyao Wang <[email protected]>
@XinyaoWa XinyaoWa force-pushed the refactor_llm_native branch from 4f70d92 to e9ea0ae on January 16, 2025 05:21
@XinyaoWa XinyaoWa merged commit 6d07a06 into opea-project:main Jan 16, 2025
15 of 16 checks passed
smguggen pushed a commit to opea-aws-proserve/GenAIComps that referenced this pull request Jan 23, 2025