feat(chat): Include history from messages to docs chatbot VSCODE-632 #871

Merged
merged 65 commits into main from gagik/add-docs-history on Nov 19, 2024

Conversation

gagik
Contributor

@gagik gagik commented Nov 12, 2024

Include history from messages to docs chatbot

Open Questions

Some things may be worth discussing here. cc: @GaurabAryal

Limitations

  • There's currently no way (at least not yet) to pass previous messages as standalone context through the app. We could replay each previous message as its own request and wait for a full response, but that would take quite a long time.
  • So, instead, this implementation includes previous messages by prepending them all to the prompt, with the actual user request at the end (see the sketch below).
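
A minimal sketch of what this could look like, assuming the VS Code chat participant API (vscode.ChatContext, ChatRequestTurn, ChatResponseTurn); the function name and formatting are illustrative and not the actual code in this PR:

```typescript
import * as vscode from 'vscode';

// Flatten prior chat turns into plain text that can be prepended to the
// docs chatbot prompt. Only markdown response parts are kept.
function historyAsText(context: vscode.ChatContext): string {
  return context.history
    .map((turn) => {
      if (turn instanceof vscode.ChatRequestTurn) {
        return `User: ${turn.prompt}`;
      }
      // ChatResponseTurn: concatenate the markdown parts of the response.
      const responseText = turn.response
        .map((part) =>
          part instanceof vscode.ChatResponseMarkdownPart ? part.value.value : ''
        )
        .join('');
      return `Assistant: ${responseText}`;
    })
    .join('\n');
}
```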

Risks

  1. Prompts become unnecessarily long and we quickly hit the maximum prompt length.
  2. The earlier messages / prompts overshadow the actual question, or are drastically different (e.g. the user asks a docs question that has nothing to do with what they were doing before, the docs chatbot keeps referring to all that earlier context, and the answer ends up less useful).

Observations

  • I think 1 is definitely a real risk, especially if the user has been using the chat for a while and then switches to the docs chatbot.
  • In my testing, I don't actually see much of 2 happening, even when I deliberately try to trigger it; the end of the prompt seems to take a lot of precedence, so that's good.

Mitigation Strategies

  • We can deal with 1 and 2 by introducing a small cap on how many previous messages we include, possibly as low as 2-3 (see the sketch after this list).
  • We could also make the structure of the prompt more explicit to the LLM, e.g. start with a header such as "For context, previous prompts were: ... \n\n The prompt is: ...", or add a `------` separator after the context. So far, the lack of a clear separation has not been a problem in my testing.
  • We can also ask for the API to add some context-passing endpoint, but this will probably involve a lot of waiting.
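
A minimal sketch of the cap and separator mitigations; MAX_HISTORY_MESSAGES and the header wording are hypothetical values for illustration, not decisions made in this PR:

```typescript
// Hypothetical cap on how many previous messages get prepended.
const MAX_HISTORY_MESSAGES = 3;

// Build the docs chatbot prompt: label the (capped) history explicitly and
// separate it from the actual question so the LLM can tell them apart.
function buildDocsPrompt(historyLines: string[], userPrompt: string): string {
  const recent = historyLines.slice(-MAX_HISTORY_MESSAGES);
  if (recent.length === 0) {
    return userPrompt;
  }
  return [
    'For context, previous messages in this conversation were:',
    ...recent,
    '------',
    `The prompt is: ${userPrompt}`,
  ].join('\n');
}
```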

Dependents

Types of changes

  • Backport Needed
  • Patch (non-breaking change which fixes an issue)
  • Minor (non-breaking change which adds functionality)
  • Major (fix or feature that would cause existing functionality to change)

@gagik gagik marked this pull request as ready for review November 12, 2024 19:45
@gagik gagik requested review from Anemy and alenakhineika November 12, 2024 19:47
Member

@Anemy Anemy left a comment


I like the approach of condensing the history into one message given our constraints. It might be worth coming up with a couple of example chats with some history and then seeing if some prompt changes help, like stating in the prompt that part of it is chat history and that the LLM should mostly pay attention to the last message.

Left one code suggestion

src/participant/prompts/promptBase.ts (review thread, outdated, resolved)
@gagik gagik requested review from alenakhineika and Anemy November 19, 2024 07:44
@gagik gagik merged commit 8a20e2a into main Nov 19, 2024
6 checks passed
@gagik gagik deleted the gagik/add-docs-history branch November 19, 2024 23:07