What happened?
Surprisingly, Azure OpenAI added support for the o3-mini model on the same day!

However, when a request is sent with `stream` set to `true`, the fake-streaming behavior that works with o1 models (acting as if `stream` were `false`, but returning the full response over SSE in a single message) does not work here. It fails even with `enable_preview_features` set to `true`, so the workaround was likely hardcoded for o1 models only.

To reproduce this, simply send a request with `stream` set to `true`.
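For illustration, a minimal reproduction sketch using the LiteLLM Python SDK follows; the deployment name `o3-mini` and the environment setup are assumptions, not the exact payload from the request below.

```python
# Minimal reproduction sketch (assumptions: an Azure deployment named
# "o3-mini", and AZURE_API_KEY / AZURE_API_BASE / AZURE_API_VERSION set
# in the environment).
import litellm

response = litellm.completion(
    model="azure/o3-mini",  # hypothetical deployment name
    messages=[{"role": "user", "content": "Hello"}],
    stream=True,            # triggers the failure described above
)

# With o1 models, the fake stream yields the full answer in a single SSE
# chunk; with o3-mini, the same streaming request fails instead.
for chunk in response:
    print(chunk)
```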
Request:
Response:
Relevant log output
Are you a ML Ops Team?
No
What LiteLLM version are you on ?
1.59.9
Twitter / LinkedIn details
x.com/yigitkonur