Replying to an async conversation does not work #632
Surprisingly this new test passes: Lines 20 to 30 in 157b29d

To paste into an async-aware shell (e.g. `python -m asyncio`):

```python
import llm

model = llm.get_async_model("gpt-4o-mini")
c = model.conversation()
print(await c.prompt("two jokes about a duck"))
print(await c.prompt("walrus"))
```
I dropped into the debugger and found the problem:
It's here:

`llm/default_plugins/openai_models.py`, lines 352 to 355 in f90f29d, and lines 372 to 378 in f90f29d.

I'm calling … So that … But it has a bunch of logic that I'd rather not duplicate.

Could we assume that …
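To see why the sync call path breaks, here is a minimal sketch (a hypothetical stand-in class, not llm's real `AsyncResponse`): calling an `async def` method without `await` returns a coroutine object, not the string the message-building code expects.

```python
import asyncio

class FakeAsyncResponse:
    # Hypothetical stand-in for an async response whose text()
    # must be awaited before the chunks are available.
    def __init__(self, chunks):
        self._chunks = chunks

    async def text(self) -> str:
        return "".join(self._chunks)

resp = FakeAsyncResponse(["two jokes ", "about a duck"])
unawaited = resp.text()          # no await: this is a coroutine, not a str
print(type(unawaited).__name__)  # coroutine
joined = asyncio.run(unawaited)  # only awaiting it yields the string
print(joined)                    # two jokes about a duck
```

So when synchronous message-building code calls `prev_response.text()` on an async response, it ends up putting a coroutine object where a string belongs.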
```diff
diff --git a/llm/default_plugins/openai_models.py b/llm/default_plugins/openai_models.py
index 82f737c..3d9816e 100644
--- a/llm/default_plugins/openai_models.py
+++ b/llm/default_plugins/openai_models.py
@@ -375,7 +375,7 @@ class _Shared:
             messages.append(
                 {"role": "user", "content": prev_response.prompt.prompt}
             )
-            messages.append({"role": "assistant", "content": prev_response.text()})
+            messages.append({"role": "assistant", "content": prev_response.text_or_raise()})
         if prompt.system and prompt.system != current_system:
             messages.append({"role": "system", "content": prompt.system})
         if not prompt.attachments:
diff --git a/llm/models.py b/llm/models.py
index cb9c7ab..7b61411 100644
--- a/llm/models.py
+++ b/llm/models.py
@@ -393,6 +393,11 @@ class AsyncResponse(_BaseResponse):
             pass
         return self

+    def text_or_raise(self) -> str:
+        if not self._done:
+            raise ValueError("Response not yet awaited")
+        return "".join(self._chunks)
+
     async def text(self) -> str:
         await self._force()
         return "".join(self._chunks)
```

That does seem to fix it.
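The pattern in the diff can be sketched in isolation (a hypothetical `SketchAsyncResponse`, not llm's actual class): a synchronous accessor that returns the accumulated chunks only if the response has already been awaited, and raises otherwise, so sync code can never silently receive a coroutine.

```python
import asyncio

class SketchAsyncResponse:
    # Hypothetical sketch of the text_or_raise pattern from the diff.
    def __init__(self):
        self._done = False
        self._chunks = []

    async def _force(self):
        # Stand-in for actually consuming the model's streamed output.
        self._chunks = ["Hello", ", ", "world"]
        self._done = True

    async def text(self) -> str:
        await self._force()
        return "".join(self._chunks)

    def text_or_raise(self) -> str:
        # Sync accessor: only valid once the response has been awaited.
        if not self._done:
            raise ValueError("Response not yet awaited")
        return "".join(self._chunks)

resp = SketchAsyncResponse()
try:
    resp.text_or_raise()             # not awaited yet: raises
except ValueError as e:
    print(e)                         # Response not yet awaited
print(asyncio.run(resp.text()))      # Hello, world
print(resp.text_or_raise())          # now safe: Hello, world
```

In the conversation case this works because every previous response in the conversation has necessarily been awaited before the next prompt is built.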