
avoid crash when response has no attachments #595

Closed

Conversation

@erikdw erikdw commented Oct 29, 2024

Description

Symptoms

I noticed `llm chat` was crashing a lot.

Tracebacks looked like this:

Traceback (most recent call last):
  File "/opt/homebrew/bin/llm", line 8, in <module>
    sys.exit(cli())
             ~~~^^
  File "/opt/homebrew/Cellar/llm/0.17/libexec/lib/python3.13/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ~~~~~~~~~^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.17/libexec/lib/python3.13/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/opt/homebrew/Cellar/llm/0.17/libexec/lib/python3.13/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.17/libexec/lib/python3.13/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.17/libexec/lib/python3.13/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/opt/homebrew/Cellar/llm/0.17/libexec/lib/python3.13/site-packages/llm/cli.py", line 535, in chat
    for chunk in response:
                 ^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.17/libexec/lib/python3.13/site-packages/llm/models.py", line 169, in __iter__
    for chunk in self.model.execute(
                 ~~~~~~~~~~~~~~~~~~^
        self.prompt,
        ^^^^^^^^^^^^
    ...<2 lines>...
        conversation=self.conversation,
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ):
    ^
  File "/opt/homebrew/Cellar/llm/0.17/libexec/lib/python3.13/site-packages/llm/default_plugins/openai_models.py", line 315, in execute
    if prev_response.attachments:
       ^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'Response' object has no attribute 'attachments'

Problem

The Python code does not check whether the `attachments` attribute exists before accessing it.

Introduction of Issue

This was introduced by this change: #590

Solution

Check for the existence of the `attachments` attribute before accessing it.
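
A minimal sketch of the guard, based on the failing line shown in the traceback (`if prev_response.attachments:` in `openai_models.py`). The `Response` stub here is hypothetical, standing in for an older `Response` object that lacks the attribute; the real fix in the codebase may differ:

```python
class Response:
    """Hypothetical stand-in for a Response created before the
    attachments attribute was added in #590."""
    pass

prev_response = Response()

# Before: `prev_response.attachments` raises AttributeError.
# After: getattr() falls back to an empty list when the
# attribute is missing, so the check is safe either way.
attachments = getattr(prev_response, "attachments", [])
if attachments:
    print("handle attachments")
else:
    print("no attachments")
```

`hasattr(prev_response, "attachments")` would work equally well; `getattr` with a default keeps the check and the access in one expression.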

Testing

I made this change to my local copy of the code in my Homebrew install path, and the crashes went away.

@simonw
Owner

simonw commented Nov 6, 2024

Thanks for this - I landed a fix already in:

@simonw simonw closed this Nov 6, 2024