
MemGPT: AssertionError when using Airoboros wrapper with tool messages #952

Closed
fukuro-kun opened this issue Feb 2, 2024 · 1 comment · Fixed by #993
Labels
bug Something isn't working

Comments


fukuro-kun commented Feb 2, 2024

Describe the bug
After I ran memgpt configure, I tried to run memgpt:

memgpt run

🧬 Creating new agent...
->  🤖 Using persona profile 'anna_pa'
->  🧑 Using human profile 'edo'
Downloading config.json: 100%|█████████████████████...████████| 743/743 [00:00<00:00, 1.49MB/s]
Downloading model.safetensors: 100%|████████████████...████████| 133M/133M [00:07<00:00, 18.3MB/s]
Downloading tokenizer_config.json: 100%|██████████████...████████| 366/366 [00:00<00:00, 671kB/s]
Downloading vocab.txt: 100%|██████████████████████...████████| 232k/232k [00:00<00:00, 2.04MB/s]
Downloading tokenizer.json: 100%|███████████████████...████████| 711k/711k [00:00<00:00, 1.91MB/s]
Downloading (…)cial_tokens_map.json: 100%|████████████...████████| 125/125 [00:00<00:00, 315kB/s]
🎉 Created new agent 'DedicatedGoblin' (id=cf9f9576-ee61-4fe0-9e1b-a514772c9833)

Then the prompt "Hit enter to begin (will request first MemGPT message)" appeared. After hitting enter, I got the following error and was unable to start the chat:

An exception occurred when running agent.step(): 
Traceback (most recent call last):
  File "/media/fukuro/raid5/MemGPT/memgpt/local_llm/chat_completion_proxy.py", line 135, in get_chat_completion
    prompt = llm_wrapper.chat_completion_to_prompt(messages, functions, function_documentation=documentation)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/media/fukuro/raid5/MemGPT/memgpt/local_llm/llm_chat_completion_wrappers/airoboros.py", line 339, in chat_completion_to_prompt
    assert message["role"] in ["user", "assistant", "function"], message
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError: {'content': '{"status": "OK", "message": null, "time": "2024-02-02 12:44:34 AM CET+0100"}', 'role': 'tool', 'tool_call_id': '3e5675fc-64eb-4211-b685-df4a00fe777f'}

This error indicates that one of the messages passed to the Airoboros wrapper's chat_completion_to_prompt() method has an unexpected role: the wrapper only accepts the roles "user", "assistant", and "function", but one of the messages has the role "tool", which triggers the AssertionError.
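
For illustration, here is a minimal sketch (Python) that reproduces the failing check. The "tool" message is copied from the traceback above; the legacy "function" message and its "name" field are hypothetical examples of the shape the wrapper was written for.

# Sketch only: reproduces the role check from airoboros.py (line 339 in the traceback).
tool_message = {
    "role": "tool",  # newer OpenAI-style tool result (copied from the traceback above)
    "tool_call_id": "3e5675fc-64eb-4211-b685-df4a00fe777f",
    "content": '{"status": "OK", "message": null, "time": "2024-02-02 12:44:34 AM CET+0100"}',
}

legacy_function_message = {
    "role": "function",       # legacy function-call result the wrapper expects
    "name": "send_message",   # hypothetical function name, for illustration only
    "content": '{"status": "OK", "message": null}',
}

try:
    # The wrapper only allows three roles, so the "tool" message trips the assertion:
    assert tool_message["role"] in ["user", "assistant", "function"], tool_message
except AssertionError as err:
    print(f"AssertionError: {err}")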

Please describe your setup

  • MemGPT version: 0.3.0
  • How did you install memgpt? I cloned the GitHub repo and installed it from source:

git clone git@github.com:cpacker/MemGPT.git
cd MemGPT
pip install -e '.[local]'

Describe your setup

  • Ubuntu Mate 23.10

How are you running memgpt?

  • Anaconda shell environment
  • Python version: 3.11.6

Screenshots
Not needed; all relevant console output is shown above.

Additional context
My inputs for setting up the agent:

memgpt configure 
? Select LLM inference provider: local
? Select LLM backend (select 'openai' if you have an OpenAI compatible proxy): webui
? Enter default endpoint: http://localhost:5000
? Select default model wrapper (recommended: chatml): airoboros-l2-70b-2.1   # (<- as recommended by the model maker)
? Is your LLM endpoint authenticated? (default no) No
? Select your model's context window (for Mistral 7B models, this is probably 8k / 8192): 16384
? Select embedding provider: local
? Select default preset: memgpt_chat
? Select default persona: anna_pa
? Select default human: edo
? Select storage backend for archival data: chroma
? Select chroma backend: persistent
? Select storage backend for recall data: sqlite

Plus, I use a local LLM (see below).

  • Choosing the chatml wrapper lets me start the conversation as expected, though I am not sure how stable the conversation will be; I do want to use the airoboros wrapper.

If you're not using OpenAI, please provide additional information on your local LLM setup:

Local LLM details

The exact model you're trying to use:

  • memgpt-q8_0.gguf from Hugging Face: starsnatched/MemGPT-GGUF

The local LLM backend you are using

  • a recent version of the oobabooga web UI

Your hardware for the local LLM backend

  • local computer, same operating system, Ubuntu Mate 23.10
sarahwooders (Collaborator) commented

@cpacker we should support both "tool" and "function" as roles, right?
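
One way to do that (a minimal sketch of the idea, not necessarily how the linked fix implements it) is to normalize the newer "tool" role to the legacy "function" role before the wrapper's role check:

# Sketch only: map the OpenAI-style "tool" role onto the legacy "function" role
# that the Airoboros wrapper already understands. Illustrative, not the actual patch.
def normalize_message_role(message: dict) -> dict:
    if message.get("role") == "tool":
        normalized = dict(message)
        normalized["role"] = "function"
        return normalized
    return message

# Inside chat_completion_to_prompt(), the existing assertion would then pass:
# message = normalize_message_role(message)
# assert message["role"] in ["user", "assistant", "function"], message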

@cpacker cpacker moved this from To triage to Ready in 🐛 MemGPT issue tracker Feb 8, 2024
@cpacker cpacker self-assigned this Feb 8, 2024
@cpacker cpacker moved this from Ready to In progress in 🐛 MemGPT issue tracker Feb 11, 2024
@cpacker cpacker linked a pull request Feb 12, 2024 that will close this issue
@github-project-automation github-project-automation bot moved this from In progress to Done in 🐛 MemGPT issue tracker Feb 12, 2024