Replies: 2 comments
- https://tabby.tabbyml.com/docs/references/models-http-api/ollama/ should be a good place to put such information.
- Indeed, thank you. Now, where would I find the authoritative information in the specific case of the template I got from here? I'm unsure of its original provenance.
-
Please describe the feature you want
When offloading the LLM inference to `ollama` for Fill-In-the-Middle (FIM) tasks, we have to add an option like: (I discovered this line, for the deepseek-coder-v2 model, from this issue).
But as I also want to try out `codestral`-like models, I don't know what that prompt would be in that case. Since there are not many architectures, it would be nice to have the `prompt_template`s for FIM for the major architectures listed in the readme.
Please reply with a 👍 if you want this feature.
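
For reference, a minimal sketch of what such an option might look like in Tabby's `~/.tabby/config.toml`, assuming the `ollama/completion` HTTP backend documented at the link above. The `prompt_template` strings are assumptions based on the FIM token formats each model family publishes, so verify them against the model's own documentation before use:

```toml
# Sketch: Tabby completion offloaded to a local Ollama server.
# The prompt_template values below are assumptions taken from the FIM
# token formats published for each model family -- verify against the
# respective model card before relying on them.

[model.completion.http]
kind = "ollama/completion"
api_endpoint = "http://localhost:11434"

# deepseek-coder-v2 (DeepSeek FIM tokens):
model_name = "deepseek-coder-v2"
prompt_template = "<｜fim▁begin｜>{prefix}<｜fim▁hole｜>{suffix}<｜fim▁end｜>"

# A codestral-like model would instead use Mistral's [SUFFIX]/[PREFIX]
# convention, e.g. (hypothetical, unverified):
# model_name = "codestral"
# prompt_template = "<s>[SUFFIX]{suffix}[PREFIX] {prefix}"
```

Collecting per-architecture templates like these in one place in the readme would avoid having to dig them out of individual issues.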