
Workaround for litellm proxy -> ollama issue #377

Open: clevcode wants to merge 1 commit into main

Conversation

clevcode

When streaming responses from a model served by ollama through a litellm proxy that provides an OpenAI compatible API, the content of the last chunk (i.e. choice["delta"]["content"]) in the streaming response is set to None.

This causes an exception to be thrown on the following line, since None can't be concatenated with a string:
content += choice["delta"]["content"]

To avoid this, I simply added a check so that choice["delta"]["content"] is only appended to content if it's not None.
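
A minimal sketch of the kind of check described above (illustrative only, assuming the dict-style streaming chunks of the pre-1.0 openai library; the exact code in llm's streaming loop may differ):

```python
# Illustrative sketch of the defensive check described above; not the exact
# diff in this PR. Assumes chunks are plain dicts, as with the pre-1.0
# openai library.
content = ""
for chunk in response:
    choice = chunk["choices"][0]
    delta_content = choice.get("delta", {}).get("content")
    # LiteLLM's final chunk sets the delta content to None, so guard before appending
    if delta_content is not None:
        content += delta_content
```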

This works around an issue encountered when using litellm to set
up an OpenAI API compatible proxy to ollama. In this case, the last
chunk was set to None, which throws an exception when we try
appending it to the content string.
@clevcode
Author

I've researched the issue further, and it's the very last chunk, the one containing the finish_reason field, that causes the problem: in LiteLLM's implementation of the OpenAI API, at least, that chunk also sets the 'delta' field to {content: null, role: null} rather than just leaving the delta field unset in this case.
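
For illustration, the final chunk from LiteLLM would look roughly like this (shape reconstructed from the description above, not captured from a real response):

```python
# Approximate shape of LiteLLM's final streaming chunk, reconstructed from
# the description above (extra fields and ordering may differ in practice):
final_chunk = {
    "choices": [
        {
            "delta": {"content": None, "role": None},  # real OpenAI reportedly leaves delta unset here
            "finish_reason": "stop",
        }
    ]
}
```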

I think it's a good idea for this project to be resilient to that kind of slight deviation from how the real OpenAI endpoints respond, but I also think it should be fixed upstream in the litellm repo. I've already submitted pull requests there for a couple of other issues I ran into when using litellm with ollama.

@cmungall
Contributor

@simonw - any chance of getting this PR merged? I think it makes sense for llm to be defensive in this way. I'm currently vendoring my llm as I really need this change.

@clevcode - do you have plans to add a fix that eliminates the Nones to litellm directly? Lmk if I can help with this

@simonw
Owner

simonw commented Jan 27, 2024

Sorry for not getting this into 0.13 - this will need some tweaks now that I've upgraded the OpenAI Python library.
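
For reference, with the 1.x openai client the same guard would look roughly like this (a sketch only; attribute access replaces the dict lookups, and delta.content can still be None):

```python
# Sketch of the equivalent guard against openai>=1.0 streaming objects, where
# chunks are typed objects rather than dicts; illustrative, not llm's actual code.
content = ""
for chunk in stream:
    delta_content = chunk.choices[0].delta.content
    if delta_content is not None:
        content += delta_content
```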

@krrishdholakia

Hey @clevcode, I'm the maintainer of litellm.

Tracking this issue on our end as well - BerriAI/litellm#2010

Thanks for raising it.
