Provide initial values to exitcode, exitcode2str and logs #1268

Open · wants to merge 2 commits into main

Conversation

@marcos-venicius commented Jan 19, 2024

When using `gpt-3.5-turbo-1106` I got the error below:

```console
User_Proxy (to chat_manager):

Find a latest paper about gpt-4 on arxiv and find its potential applications in software

--------------------------------------------------------------------------------

>>>>>>>> USING AUTO REPLY...
Traceback (most recent call last):
  File "/home/marcos_souza/Projects/auto-gen/2/./main.py", line 46, in <module>
    user_proxy.initiate_chat(
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 521, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 324, in send
    recipient.receive(message, self, request_reply, silent)
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 452, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 767, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/groupchat.py", line 118, in run_chat
    reply = speaker.generate_reply(sender=self)
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 767, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 635, in generate_code_execution_reply
    return True, f"exitcode: {exitcode} ({exitcode2str})\nCode output: {logs}"
UnboundLocalError: local variable 'exitcode' referenced before assignment
```

This happens because `exitcode`, `exitcode2str`, and `logs` have no initial values: when the incoming messages contain no code to execute, the variables are never assigned, and the return statement that references them raises the `UnboundLocalError` shown above.
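
For illustration, here is a minimal, self-contained sketch of the failure mode and of the fix this PR describes (giving the three variables initial values). The message handling and the default values are invented for the example and are not the FLAML implementation:

```python
# Minimal sketch only: the variable names mirror generate_code_execution_reply,
# but the message handling below is a stand-in, not FLAML code.

def reply_without_defaults(messages):
    for message in messages:
        if message.startswith("run:"):            # stand-in for "message contains a code block"
            exitcode, logs = 0, "hello world\n"   # stand-in for executing that code
            exitcode2str = "execution succeeded"
    # If no message contained a code block, none of the variables were assigned,
    # so this line raises UnboundLocalError, exactly as in the traceback above.
    return f"exitcode: {exitcode} ({exitcode2str})\nCode output: {logs}"


def reply_with_defaults(messages):
    # The fix: initialize the variables up front so the return statement
    # is always well-defined, even when no code is executed.
    exitcode, exitcode2str, logs = 0, "no code executed", ""
    for message in messages:
        if message.startswith("run:"):
            exitcode, logs = 0, "hello world\n"
            exitcode2str = "execution succeeded"
    return f"exitcode: {exitcode} ({exitcode2str})\nCode output: {logs}"


print(reply_with_defaults(["Find a latest paper about gpt-4 on arxiv"]))
# exitcode: 0 (no code executed)
# Code output:
```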

marcos-venicius and others added 2 commits January 19, 2024 13:51
@marcos-venicius (Author)

@marcos-venicius please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.

@microsoft-github-policy-service agree [company="{your company}"]

Options:

  • (default - no company specified) I have sole ownership of intellectual property rights to my Submissions and I am not making Submissions in the course of work for my employer.
@microsoft-github-policy-service agree
  • (when company given) I am making Submissions in the course of work for my employer (or my employer has intellectual property rights in my Submissions by contract or applicable law). I have permission from my employer to make Submissions and enter into this Agreement on behalf of my employer. By signing below, the defined term “You” includes me and my employer.
@microsoft-github-policy-service agree company="Microsoft"

Contributor License Agreement

@microsoft-github-policy-service agree

@sonichi (Contributor) commented Jan 22, 2024

Could you move the discussion to https://github.com/microsoft/autogen? AutoGen has been developed in its own repo since October.
And please try the latest version of pyautogen; in my understanding, this error shouldn't happen there.
