
[Feature] Hope to support the embedding and reranker interfaces of the text embedding inference framework #2101

Open
linfengjjj opened this issue Jan 26, 2025 · 2 comments

@linfengjjj

MaxKB Version

v1.9.1

Please describe your needs or suggestions for improvements

We hope model management can support connecting to the embedding and reranker interfaces of the text-embeddings-inference framework.
1. Framework repository: https://github.com/huggingface/text-embeddings-inference
2. Examples of the embedding and reranker interfaces:

[Screenshots: example requests to the embedding and reranker endpoints]
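For reference, a minimal sketch of what the two endpoints look like from a client's perspective, assuming a text-embeddings-inference server running locally on port 8080 (the URL, model, and sample texts are illustrative; see the TEI README for the exact request/response schema):

```python
import requests

# Hypothetical local TEI endpoint; adjust host/port to your deployment.
TEI_URL = "http://localhost:8080"

# Embedding request: POST /embed with an "inputs" field (a string or a list of strings);
# the response is a list of embedding vectors.
emb = requests.post(f"{TEI_URL}/embed", json={"inputs": ["What is MaxKB?"]})
emb.raise_for_status()
print(len(emb.json()[0]))  # dimension of the first embedding vector

# Rerank request (requires serving a reranker model such as BAAI/bge-reranker-base):
# POST /rerank with a query and candidate texts; the response is a list of index/score pairs.
rr = requests.post(
    f"{TEI_URL}/rerank",
    json={
        "query": "What is MaxKB?",
        "texts": ["MaxKB is a knowledge base QA system.", "Unrelated text."],
    },
)
rr.raise_for_status()
print(rr.json())  # e.g. [{"index": 0, "score": ...}, {"index": 1, "score": ...}]
```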

Please describe the solution you suggest

No response

Additional Information

No response

@shaohuzhang1 shaohuzhang1 changed the title from the original Chinese to [Feature] Hope to support the embedding and reranker interfaces of the text embedding inference framework Jan 26, 2025
@liuruibin
Member

You can add it to the system after deploying vLLM or Xinference.

@linfengjjj
Author

Thanks for your reply.
Compared with vLLM and Xinference, text-embeddings-inference focuses specifically on text embedding workloads: it deeply optimizes the inference pipeline of embedding models, supports high-concurrency, low-latency request handling, and offers more mature hardware acceleration for embedding tasks.
This is why our potential customers specifically ask for it. We sincerely hope you can consider this requirement.

3 participants