A question about inconsistent answers between open-api-server and inference-hf #49

Open
ibmxiang opened this issue Aug 8, 2023 · 0 comments

Comments


ibmxiang commented Aug 8, 2023

The corresponding scripts are:
python310 scripts/inference/inference_hf.py \
    --base_model /root/chinese-alpaca-2-7b \
    --with_prompt \
    --interactive

python310 scripts/openai_server_demo/openai_api_server.py --base_model /root/chinese-alpaca-2-7b --gpus 0

The answers from the two feel completely different: one reads like GPT, while the other reads like some of the domestic large models.
When calling through the API interface (a sketch of such a request, with the decoding parameters made explicit, follows after the two outputs below), the generated content is:
'text': '1. 九绵高速气势雄伟,如巨龙盘旋山间。\n2. 收尾阶段,2024年全线通车,将带动沿线发展。\n3. 九绵高速施工难度大,桥隧比高。\n4. 绵阳广播电视台融媒体中心报道,深入了解九绵高速建设。\n5. 冯梦晗、廖琪平、廖琪平、廖琪平、廖琪平、廖琪平、廖琪平、廖琪平、廖琪平、廖琪平、廖琪平、廖琪平、廖琪平

When running inference locally, the generated output is:

Response:

  1. 九绵高速气势雄伟!
  2. 收尾阶段,期待通车!
  3. 助力沿线发展,意义重大!
  4. 建筑奇迹,令人惊叹!
  5. 工程之美,值得铭记!
  6. 交通便利化,带来新机遇!
  7. 绵阳人民翘首期盼!
  8. 宏伟规划,未来可期!
  9. 技术创新,成就辉煌!
  10. 建设成果,值得骄傲!
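
Below is a minimal sketch of how the API request could pin the decoding parameters explicitly, so they can be compared against whatever settings inference_hf.py uses; differing default sampling settings (temperature, top_p, repetition penalty, max tokens) between the two entry points are a plausible cause of this kind of divergence. The port, prompt, parameter names, and values are assumptions for illustration, not taken from this issue.

    # Sketch only: assumes openai_api_server.py exposes an OpenAI-style
    # /v1/completions endpoint (the port below is an assumption; check the
    # server's startup log) and accepts these decoding parameters.
    import requests

    payload = {
        "prompt": "为九绵高速写5条宣传标语",  # hypothetical prompt, for illustration only
        "max_tokens": 400,          # assumed values; align them with the
        "temperature": 0.2,         # generation config inference_hf.py uses
        "top_p": 0.9,
        "repetition_penalty": 1.1,  # if supported, may curb the repeated names seen above
    }

    resp = requests.post("http://localhost:19327/v1/completions", json=payload)
    resp.raise_for_status()
    print(resp.json()["choices"][0]["text"])  # assumes an OpenAI-compatible response schema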