ggerganov/llama.cpp#904
Interactively export the loaded model to a binary file #904
Open
jon-chuang opened this issue on Apr 12 · 2 comments
@jon-chuang
Contributor
jon-chuang commented on Apr 12 •
For https://github.com/ggerganov/llama.cpp/pull/820, the loaded model may differ from the base model. It would make sense to be able to interactively export the currently loaded model to a bin file.
In particular if an option is allowed to linearly interpolate multiple LoRA files — i.e. LoRA mixology, to obtain unique LLM personalities.
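The linear interpolation idea can be sketched as follows. This is a minimal NumPy illustration, not llama.cpp's actual data structures or API: each LoRA adapter contributes a low-rank update B·A, and the adapters are blended with per-adapter weights before merging into the base weight matrix. The function name `merge_loras` and all shapes here are hypothetical.

```python
import numpy as np

def merge_loras(base_weight, loras, alphas):
    """Merge several LoRA adapters into one weight matrix by
    linear interpolation of their low-rank updates.

    base_weight: (out, in) base model weight matrix
    loras: list of (A, B) pairs, A: (r, in), B: (out, r)
    alphas: per-adapter blend weights (e.g. summing to 1)
    """
    merged = base_weight.copy()
    for (A, B), alpha in zip(loras, alphas):
        # Each adapter's delta is the low-rank product B @ A,
        # scaled by its interpolation coefficient.
        merged += alpha * (B @ A)
    return merged

# Toy usage: blend two rank-2 adapters 70/30 into an 8x8 weight.
rng = np.random.default_rng(0)
W = rng.standard_normal((8, 8))
lora1 = (rng.standard_normal((2, 8)), rng.standard_normal((8, 2)))
lora2 = (rng.standard_normal((2, 8)), rng.standard_normal((8, 2)))
W_mixed = merge_loras(W, [lora1, lora2], alphas=[0.7, 0.3])
```

In practice this blending would be done per weight tensor across the model, and the merged tensors would then be written out in the export format — which is exactly why an export-to-binfile step is needed after loading adapters.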
@MillionthOdin16
MillionthOdin16 commented on Apr 12
If you're familiar with mixing LoRAs, I think it would help a lot of people here if you could link some resources above. I've heard you can do some cool things with it, but I'm not very familiar.