'prompt_template' does not work when using openai/completion #3323
-
When using 'openai/completion' for FIM, I found that prompt_template does not work. My ~/.tabby/config.toml is
But the server got
when I use IDE code completion. It seems that prompt_template has no effect, since the prompt only contains the prefix. Meanwhile, the "suffix" param causes a 400 Bad Request. I have no idea how to fix this.
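For reference, a minimal sketch of the kind of FIM backend configuration involved, assuming Tabby's HTTP model config with `{prefix}`/`{suffix}` placeholders in `prompt_template` (the endpoint, model name, and template tokens below are illustrative placeholders, not confirmed values from this thread):

```toml
# ~/.tabby/config.toml — sketch of an HTTP completion backend for FIM.
# api_endpoint, model_name, and the template tokens are placeholder values.
[model.completion.http]
kind = "openai/completion"
api_endpoint = "http://localhost:8000/v1"
model_name = "my-fim-model"
# FIM template; {prefix} and {suffix} are expected to be substituted by Tabby.
prompt_template = "<PRE> {prefix} <SUF>{suffix} <MID>"
```

The reported symptom is that the rendered prompt reaches the server containing only the prefix, i.e. as if the template above were ignored.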
-
I've tried sglang as a replacement for the OpenAI server, but it does not work either; it gets an error before sending the request:
-
The llama.cpp server works. However, for greater efficiency in our system, do you have plans to:
Any progress in these areas would be highly beneficial to us. Looking forward to your reply. Thank you.
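For context, a llama.cpp-backed setup of the sort that worked can be sketched as below, assuming Tabby supports a `llama.cpp/completion` kind for HTTP completion backends (the endpoint and template tokens are illustrative, not taken from this thread):

```toml
# Sketch: pointing Tabby at a llama.cpp server over HTTP.
# The endpoint and FIM template tokens are assumed placeholder values.
[model.completion.http]
kind = "llama.cpp/completion"
api_endpoint = "http://localhost:8080"
prompt_template = "<PRE> {prefix} <SUF>{suffix} <MID>"
```

This path avoids the OpenAI-compatible `suffix` parameter entirely, which is presumably why it sidesteps the 400 Bad Request seen with the openai/completion backend.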
Thank you for exploring the various backends for testing with Tabby.
Regarding your questions:
Feel free to ask if you need any further modifications!