[Bug]: reasoning_content missing from completion's response #8193

Open · V4G4X opened this issue Feb 2, 2025 · 9 comments · May be fixed by #8431
Labels: bug (Something isn't working)


V4G4X commented Feb 2, 2025

What happened?

I see chat UIs (like the OpenRouter Chatroom) showing what reasoning models like R1 are "thinking" before they give their final output, and I want to bring that into Aider.

Under verbose logging I could see each reasoning token printed for many minutes before the final output tokens came in. I used those logs as a reference to write a basic script to get a feel for the response structure (I'm new to dynamically typed development).

from litellm import completion
import os

os.environ['OPENROUTER_API_KEY'] = "sk-xyz"
resp = completion(
    model="openrouter/deepseek/deepseek-r1",
    messages=[{"role": "user", "content": "Tell me a joke."}],
)

print(resp.choices[0].message.reasoning_content)

Do different providers return different response shapes? And does LiteLLM not support reasoning_content for OpenRouter the way it does when calling the DeepSeek API directly?

My final goal is to get reasoning tokens and streaming working together, so I can see what Aider is "thinking" while I wait.
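Roughly, I want something like this to work (a sketch; whether the streamed delta exposes the field as reasoning_content is an assumption on my part):

from litellm import completion

resp = completion(
    model="openrouter/deepseek/deepseek-r1",
    messages=[{"role": "user", "content": "Tell me a joke."}],
    stream=True,
)
for chunk in resp:
    delta = chunk.choices[0].delta
    reasoning = getattr(delta, "reasoning_content", None)  # assumed field name
    if reasoning:
        print(reasoning, end="", flush=True)  # show the "thinking" live
    elif delta.content:
        print(delta.content, end="", flush=True)  # then the final answer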

But I get the following error when running the simple non-streaming script at the top:

Relevant log output

❯ python test_reason.py
/Users/varungawande/playground/aider/.aider-dev-py/lib/python3.12/site-packages/pydantic/_internal/_config.py:345: UserWarning: Valid config keys have changed in V2:
* 'fields' has been removed
  warnings.warn(message, UserWarning)
Traceback (most recent call last):
  File "/Users/varungawande/playground/aider/.aider-dev-py/lib/python3.12/site-packages/pydantic/main.py", line 883, in __getattr__
    return pydantic_extra[item]
           ~~~~~~~~~~~~~~^^^^^^
KeyError: 'reasoning_content'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/Users/varungawande/playground/aider/test_reason.py", line 10, in <module>
    print(resp.choices[0].message.reasoning_content)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/varungawande/playground/aider/.aider-dev-py/lib/python3.12/site-packages/pydantic/main.py", line 885, in __getattr__
    raise AttributeError(f'{type(self).__name__!r} object has no attribute {item!r}') from exc
AttributeError: 'Message' object has no attribute 'reasoning_content'

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

litellm==1.60.0

Twitter / LinkedIn details

https://www.linkedin.com/in/varungawande/

V4G4X added the bug (Something isn't working) label on Feb 2, 2025
krrishdholakia (Contributor)

@V4G4X I don't think OpenRouter returns the value as reasoning_content.

@jamesbraza this seems similar to your issue re: include_reasoning

How would you want us to handle this?

V4G4X (Author) commented Feb 2, 2025

@krrishdholakia

I believe LiteLLM already has the functionality to return reasoning tokens for OpenRouter + R1, because it is logging them properly. I just need help figuring out how to extract them from the completion response.

In OpenRouter's chatroom, the reasoning text is shown in a separate window from the output.

[Screenshot: OpenRouter chatroom rendering the reasoning text in a separate pane from the final output]

That makes sense, because when I enable LiteLLM logging inside Aider, the reasoning tokens come back separately from the content tokens.

This is my LiteLLM log from Aider. If you piece together the reasoning tokens in the dump below, you get the string "Okay, the user is asking".

❯ cat dump | grep "new delta.*reasoning='"
new delta: Delta(provider_specific_fields={}, refusal=None, reasoning='Okay', content=None, role='assistant', function_call=None, tool_calls=None, audio=None)
new delta: Delta(provider_specific_fields={}, refusal=None, reasoning=',', content=None, role='assistant', function_call=None, tool_calls=None, audio=None)
new delta: Delta(provider_specific_fields={}, refusal=None, reasoning=' the', content=None, role='assistant', function_call=None, tool_calls=None, audio=None)
new delta: Delta(provider_specific_fields={}, refusal=None, reasoning=' user', content=None, role='assistant', function_call=None, tool_calls=None, audio=None)
new delta: Delta(provider_specific_fields={}, refusal=None, reasoning=' is', content=None, role='assistant', function_call=None, tool_calls=None, audio=None)
new delta: Delta(provider_specific_fields={}, refusal=None, reasoning=' asking', content=None, role='assistant', function_call=None, tool_calls=None, audio=None)
Raw OpenAI Chunk
ChatCompletionChunk(id='gen-1738520971-svZjag48VXWwp0RfBi2z', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role='assistant', tool_calls=None, reasoning='Okay'),
finish_reason=None, index=0, logprobs=None, native_finish_reason=None)], created=1738520971, model='deepseek/deepseek-r1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None,
usage=None, provider='Nebius')

completion obj content:
model_response finish reason 3: None; response_obj={'text': '', 'is_finished': False, 'finish_reason': None, 'logprobs': None, 'original_chunk': ChatCompletionChunk(id='gen-1738520971-svZjag48VXWwp0RfBi2z',
choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role='assistant', tool_calls=None, reasoning='Okay'), finish_reason=None, index=0, logprobs=None, native_finish_reason=None)],
created=1738520971, model='deepseek/deepseek-r1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None, usage=None, provider='Nebius'), 'usage': None}
original delta: {'content': None, 'function_call': None, 'refusal': None, 'role': 'assistant', 'tool_calls': None, 'reasoning': 'Okay'}
new delta: Delta(provider_specific_fields={}, refusal=None, reasoning='Okay', content=None, role='assistant', function_call=None, tool_calls=None, audio=None)
model_response.choices[0].delta: Delta(provider_specific_fields={}, refusal=None, reasoning='Okay', content=None, role='assistant', function_call=None, tool_calls=None, audio=None); completion_obj: {'content':
''}
self.sent_first_chunk: False
completion_obj: {'content': ''}, model_response.choices[0]: StreamingChoices(finish_reason=None, index=0, delta=Delta(provider_specific_fields={}, refusal=None, reasoning='Okay', content=None,
role='assistant', function_call=None, tool_calls=None, audio=None), logprobs=None), response_obj: {'text': '', 'is_finished': False, 'finish_reason': None, 'logprobs': None, 'original_chunk':
ChatCompletionChunk(id='gen-1738520971-svZjag48VXWwp0RfBi2z', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role='assistant', tool_calls=None, reasoning='Okay'),
finish_reason=None, index=0, logprobs=None, native_finish_reason=None)], created=1738520971, model='deepseek/deepseek-r1', object='chat.completion.chunk', service_tier=None, system_fingerprint=None,
usage=None, provider='Nebius'), 'usage': None}
PROCESSED CHUNK POST CHUNK CREATOR: None
PROCESSED CHUNK PRE CHUNK CREATOR: ChatCompletionChunk(id='gen-1738520971-svZjag48VXWwp0RfBi2z', choices=[Choice(delta=ChoiceDelta(content=None, function_call=None, refusal=None, role='assistant',
tool_calls=None, reasoning=','), finish_reason=None, index=0, logprobs=None, native_finish_reason=None)], created=1738520971, model='deepseek/deepseek-r1', object='chat.completion.chunk', service_tier=None,
system_fingerprint=None, usage=None, provider='Nebius'); custom_llm_provider: openai

(The same "Raw OpenAI Chunk" / "completion obj content" / "new delta" trace repeats for each subsequent reasoning token: ',', ' the', ' user', ' is', ' asking', ' if', …. In every chunk the token arrives in the delta's reasoning field while content stays None and completion_obj stays {'content': ''}.)


kuloud commented Feb 4, 2025

I'm also having this problem. Do you have any ideas on how to deal with this issue?

krrishdholakia self-assigned this on Feb 4, 2025
krrishdholakia (Contributor)

I'll pick this up today - thanks for the work on this @V4G4X

jamesbraza (Contributor) commented Feb 4, 2025

Hi @V4G4X do you know about https://openrouter.ai/docs/api-reference/parameters#include-reasoning? This was added in #8184.

I think you need to run pip install "litellm>=1.60.2" (quoted so the shell doesn't treat >= as a redirect), then:

from litellm import completion

response = completion(
    model="openrouter/deepseek/deepseek-r1",
    messages=messages,  # e.g. [{"role": "user", "content": "Tell me a joke."}]
    include_reasoning=True,  # passed through to OpenRouter
)
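If that works, the reasoning should then be readable the way you originally tried (assuming #8184 surfaces it as reasoning_content on the message):

print(response.choices[0].message.reasoning_content)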

Does this resolve your issue?


kuloud commented Feb 5, 2025

response = await acompletion(  # acompletion is async and must be awaited
    model="openrouter/deepseek/deepseek-r1",
    messages=messages,
    include_reasoning=True,
)

returns reasoning in the Delta, while LiteLLM only processes the reasoning_content field:

# LiteLLM only maps the reasoning_content key here, so OpenRouter's
# `reasoning` key never reaches reasoning_content:
provider_specific_fields: Dict[str, Any] = {}
if "reasoning_content" in params:
    provider_specific_fields["reasoning_content"] = params["reasoning_content"]
    setattr(self, "reasoning_content", params["reasoning_content"])

How should I handle this?
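In the meantime, this workaround seems to reach the tokens (a sketch; it relies on the raw `reasoning` attribute surviving on the streamed Delta, as the logs above suggest):

from litellm import completion

resp = completion(
    model="openrouter/deepseek/deepseek-r1",
    messages=[{"role": "user", "content": "Tell me a joke."}],
    include_reasoning=True,
    stream=True,
)
for chunk in resp:
    # OpenRouter sends the key as `reasoning`, not `reasoning_content`
    text = getattr(chunk.choices[0].delta, "reasoning", None)
    if text:
        print(text, end="", flush=True)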

krrishdholakia (Contributor)

Hey @kuloud, can you file a PR to add any unmapped params to the root? That should address the concern and keep us consistent with the OpenAI SDK, correct?
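Something along these lines is what I mean (a rough sketch of the idea against the snippet you quoted, not the actual patch):

provider_specific_fields: Dict[str, Any] = {}
if "reasoning_content" in params:
    provider_specific_fields["reasoning_content"] = params["reasoning_content"]
    setattr(self, "reasoning_content", params["reasoning_content"])
# sketch: also copy any unmapped provider params (e.g. OpenRouter's
# `reasoning`) onto the object root, mirroring how the OpenAI SDK
# exposes extra response fields
for key, value in params.items():
    if not hasattr(self, key):
        provider_specific_fields[key] = value
        setattr(self, key, value)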


kuloud commented Feb 5, 2025

> Hey @kuloud, can you file a PR to add any unmapped params to the root? That should address the concern and keep us consistent with the OpenAI SDK, correct?

OK, I'll make a PR for this.

krrishdholakia (Contributor)

cc: @vibhavbhat
