
Function call issue #389

Closed
Yuang-Deng opened this issue Dec 12, 2024 · 1 comment

Comments

@Yuang-Deng

As I understand it, when using TrainableModule, tool selection is handled internally by the framework, whereas with OnlineChatModel the tool selection is delegated to the API.
When I deploy a model myself and expose it as an API, using OnlineChatModel raises an error:
(screenshot of the error omitted)
It looks like this is caused by the vLLM deployment.

In practice, the common setup is to privately deploy an LLM behind an API and then use OnlineChatModel. If the deployed API does not support tool selection, function calling becomes impossible. Would it be possible to control how tools are handled via a parameter, without changing the already deployed LLM API?

@wzh1994
Contributor

wzh1994 commented Dec 12, 2024

We suggest deploying the LLM service with TrainableModule, and then connecting to it via TrainableModule().deploy_method(xx, url=xx).
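A minimal sketch of this suggestion, assuming LazyLLM's `TrainableModule` with a vLLM deploy backend; the model name, backend identifier, and URL below are placeholders for illustration, not a verified configuration:

```python
import lazyllm

# Sketch based on the maintainer's suggestion: connect to an already deployed
# model service through TrainableModule instead of OnlineChatModel, so that
# tool selection is handled on the framework side rather than by the API.
#
# The model name, deploy backend, and URL are assumptions -- adjust them to
# match your own deployment.
llm = lazyllm.TrainableModule('your-model-name').deploy_method(
    lazyllm.deploy.vllm,            # assumed backend identifier
    url='http://localhost:8000/',   # URL of the already running service
)

# With TrainableModule, function-call / tools handling is done internally,
# so downstream agents can use `llm` directly for function calling.
```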

@wzh1994 wzh1994 closed this as completed Dec 24, 2024