
bug: Unexpected responses on regeneration #4423

Closed
1 of 3 tasks
phansel opened this issue Jan 9, 2025 · 3 comments · Fixed by #4454
Labels: category: model support (Support new model, or fix broken model) · type: bug (Something isn't working)


phansel commented Jan 9, 2025

Jan version

v0.5.13

Describe the Bug

Llama 3.1 70B Instruct Q4 responds in an unexpected way after a response is removed and regenerated. It seems to be following up on the message that was regenerated, rather than regenerating from scratch.

It seems like this would happen if the deleted (regenerated) message is left in the chat log until the LM finishes sending its latest reply, rather than being removed before it starts. Any follow-up questions posed after that last message get replies like "Hmm, it looks like I didn't answer your question originally."
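To illustrate the suspected mechanism, here is a minimal sketch (hypothetical names; not Jan's actual code): if the assistant message being regenerated is still in the history when the request payload is built, the model sees its own previous answer and treats the regeneration as a follow-up turn.

```typescript
// Hypothetical illustration of the suspected bug: the assistant message
// being regenerated is still present when the request payload is built.
type Role = "user" | "assistant";
interface Msg { role: Role; content: string; }

// Buggy: sends the full history, including the reply we meant to replace.
function buildPayloadBuggy(history: Msg[]): Msg[] {
  return [...history];
}

const history: Msg[] = [
  { role: "user", content: "Give me an airline's name." },
  { role: "assistant", content: "Here is the name of an airline:\nLufthansa" },
];

// The model now sees its earlier "Lufthansa" answer in context, so it
// replies as if continuing the conversation ("Would you like more
// information about Lufthansa...") instead of answering fresh.
const payload = buildPayloadBuggy(history);
```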

Steps to Reproduce


  1. Ask a question of the model
  2. Observe reply
  3. Regenerate the answer
  4. Observe that the regenerated reply is not an answer to the question

Screenshots / Logs

Example

User 08:39:12 PM
Give me an airline's name.

Assistant 08:42:00 PM
Here is the name of an airline:
Lufthansa

User clicks "Regenerate" button on last message. The last message is replaced with this:

Assistant 08:47:35 PM
Would you like more information about Lufthansa or would you like another airline's name?

Pressing Regenerate on the last message again yields this:

Assistant 08:48:45 PM
Would you like another one?

What is your OS?

  • MacOS
  • Windows
  • Linux
@phansel phansel added the type: bug Something isn't working label Jan 9, 2025
@github-project-automation github-project-automation bot moved this to Investigating in Menlo Jan 9, 2025
@imtuyethan
Contributor

This is a legacy issue; I've hit it many times across many versions, but since it seemed model-related I didn't report it.

@imtuyethan imtuyethan added the category: model support Support new model, or fix broken model label Jan 9, 2025

phansel commented Jan 9, 2025

This is a legacy issue; I've hit it many times across many versions, but since it seemed model-related I didn't report it.

Which models see it? The only other model I've tried (Llama 3.2 3B Instruct Q8) also shows it.

@louis-menlo
Contributor

@phansel @imtuyethan I also encountered an issue where regenerate sends the latest assistant message, which it shouldn't, resulting in incorrect question context. Additionally, regenerating with Anthropic models can sometimes lead to an empty response. These issues will be fixed in the next release.
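The fix described above can be sketched roughly like this (an assumed implementation with hypothetical names, not the actual patch in #4454): before regenerating, truncate the history so the request excludes the assistant message being replaced and anything after it.

```typescript
// Minimal sketch of the described fix (assumed implementation; the actual
// Jan code differs): build the regeneration context by cutting the history
// off just before the assistant message being regenerated.
type Role = "user" | "assistant";
interface Msg { id: string; role: Role; content: string; }

function buildRegenerationContext(messages: Msg[], regenerateId: string): Msg[] {
  const idx = messages.findIndex((m) => m.id === regenerateId);
  // Unknown id: fall back to the full history unchanged.
  if (idx === -1) return [...messages];
  // Drop the message being regenerated and everything after it.
  return messages.slice(0, idx);
}

const msgs: Msg[] = [
  { id: "m1", role: "user", content: "Give me an airline's name." },
  { id: "m2", role: "assistant", content: "Lufthansa" },
];
// The regeneration request now ends at the user's question, so the model
// answers it from scratch instead of following up on its own reply.
const ctx = buildRegenerationContext(msgs, "m2");
```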

@louis-menlo louis-menlo moved this from Investigating to Scheduled in Menlo Jan 14, 2025
@louis-menlo louis-menlo moved this from Scheduled to In Progress in Menlo Jan 14, 2025
@louis-menlo louis-menlo moved this from In Progress to Eng Review in Menlo Jan 15, 2025
@github-project-automation github-project-automation bot moved this from Eng Review to QA in Menlo Jan 15, 2025
@imtuyethan imtuyethan added this to the v0.5.14 milestone Jan 20, 2025
@imtuyethan imtuyethan moved this from QA to Completed in Menlo Jan 21, 2025
@imtuyethan imtuyethan modified the milestones: v0.5.14, v0.5.15 Jan 21, 2025