openai[patch]: default to invoke on o1 stream() #27983
Conversation
```python
) -> bool:
    if isinstance(response_format, type) and is_basemodel_subclass(response_format):
        # TODO: Add support for streaming with Pydantic response_format.
        warnings.warn("Streaming with Pydantic response_format not yet supported.")
```
Should we also emit a warning in the case of o1, for clarity about why results aren't streamed?
Maybe. It just seems like the model is something you're more likely to let end users configure (e.g. the way the LangSmith playground does), in which case warnings would be annoying and unhelpful.
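The trade-off discussed above (silent fallback for o1 vs. an explicit warning for Pydantic `response_format`) can be sketched roughly as follows. This is a hedged illustration, not the actual langchain-openai implementation: the function name `should_fall_back_to_invoke` and its exact conditions are assumptions for demonstration only.

```python
import warnings
from typing import Optional


def should_fall_back_to_invoke(
    model: str, response_format: Optional[type] = None
) -> bool:
    """Hypothetical sketch: decide whether stream() should fall back
    to a single non-streamed invoke() call."""
    if model.startswith("o1"):
        # o1 models don't support streaming; fall back silently (no
        # warning), since `model` is often end-user configurable.
        return True
    if response_format is not None:
        # A Pydantic response_format also blocks streaming, but here a
        # warning makes sense: it reflects a code-level choice.
        warnings.warn(
            "Streaming with Pydantic response_format not yet supported."
        )
        return True
    return False


print(should_fall_back_to_invoke("o1-preview"))  # True
print(should_fall_back_to_invoke("gpt-4o"))      # False
```

The design choice here is that a warning is only useful when the person seeing it can act on it: a developer who set `response_format` can change their code, while an end user who picked "o1" from a dropdown cannot.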
```python
) -> bool:
    if isinstance(response_format, type) and is_basemodel_subclass(response_format):
        # TODO: Add support for streaming with Pydantic response_format.
        warnings.warn("Streaming with Pydantic response_format not yet supported.")
```
FYI this warning is raised even when a non-streaming API is used (e.g. `ainvoke`). This is because `_generate_with_cache` (called by `invoke`) calls `_should_stream` internally:

```python
# If stream is not explicitly set, check if implicitly requested by
```
I don't think it has any bad side effect, besides that the warning used to be raised for a real issue, and it may now be a false positive. However, it is possible that `invoke` used to stream beforehand and won't anymore 🤔

@baskaryan I'd love to know if we can safely bump if we are using `ainvoke` with structured_outputs + pydantic.
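The false-positive path described above can be reproduced with a minimal stand-in. This is a sketch under assumptions: `_should_stream` and `invoke` here are simplified stand-ins for the real `BaseChatModel` internals, showing only why the warning fires on a non-streaming call.

```python
import warnings


def _should_stream(response_format=None) -> bool:
    # Hypothetical stand-in for the real helper: warns when a Pydantic
    # response_format would block streaming.
    if response_format is not None:
        warnings.warn(
            "Streaming with Pydantic response_format not yet supported."
        )
        return False
    return True


def invoke(response_format=None) -> str:
    # Even a plain (non-streaming) invoke consults _should_stream to see
    # whether streaming was implicitly requested, so the warning fires
    # here too -- despite nothing being streamed.
    _should_stream(response_format=response_format)
    return "non-streamed result"


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    invoke(response_format=dict)  # dict stands in for a Pydantic model
print(len(caught))  # 1: the warning fired even though nothing streamed
```

This is the "false positive" in the comment above: the warning text suggests streaming is being degraded, but on the `invoke`/`ainvoke` path no streaming was requested in the first place.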
No description provided.