Make chat memory available to the system message template #887
Conversation
geoand commented Sep 11, 2024 (edited)
- Closes: Inject ChatMemory into the Prompt #881
@geoand, the
I will change this behaviour as well.
I'm looking at the implementation and I think there are some things that limit the use of the `{chat_memory}` placeholder. Let me try to explain with an example.

```java
@ApplicationScoped
@RegisterAiService(chatMemoryProviderSupplier = NoChatMemoryProviderSupplier.class)
public interface RephraseService {

    @SystemMessage("""
            Given the following conversation and a follow-up question, rephrase the follow-up question to be a
            standalone question, in its original language. Return the follow-up question VERBATIM if the
            question is changing the topic with respect to the conversation. It is **VERY IMPORTANT** to only
            output the rephrased standalone question; do not add notes or comments in the output.

            Chat History:
            ------
            {chat_memory}
            ------
            """)
    String rephrase(@MemoryId String id, @UserMessage String question);
}
```

```java
@ApplicationScoped
@RegisterAiService
public interface AiService {

    @SystemMessage("You are a helpful assistant")
    String answer(@MemoryId String id, @UserMessage String question);
}
```

```java
String question = rephraseService.rephrase("chatMemoryId", message);
return aiService.answer("chatMemoryId", question);
```

The goal of this chain is to take the conversation history and the last question asked by the user. Based on that input, it rephrases the question and passes the result to the second prompt. The result of the first prompt must not be saved, otherwise there would be duplicates in the conversation, but the chat memory stores it anyway.

Is there a simple way to get around this behaviour? Otherwise, I don't know how useful this new placeholder can be. (Maybe there's some scenario I'm missing.) At this point, I'd just leave the option of using Qute, which doesn't cause any problems. WDYT?
Maybe I am missing something, but how would the option of using chat memory yourself with Qute differ?
The problem is in the Ai service class that is annotated with
Okay, I would like to see a sample application of what you are trying to do, when you have time to add it.
Yes, when I pass the ChatMessage(s) manually everything works.
Sorry, I don't understand. The history of the messages always contains what I'm expecting.
Okay, I guess I need to see an example in action of what you are trying to achieve :)
I can write a test and commit it to the branch with the results I expect. In this case, you can run it and see the result. Or would you prefer a new application?
That's perfectly fine
I don't have the grant to commit to this PR, but you can see the test scenario at this link
Thanks, I'll have a look soon
Can you accept the collaboration invitation? After that, you should be able to push to my branch
To run the test:
🙏🏽
So the first thing I notice is that I didn't expect
Also, when there is no memory associated with the AiService, for whatever reason, the variable is an empty list, which seems reasonable, no?
In my case, what I have done is to format the list of messages in a "default" format, but this is something that can be changed.
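For illustration only, here is a minimal sketch of what such a "default" rendering of the message list might look like. The `Msg` record and `formatChatMemory` helper are hypothetical stand-ins, not the PR's actual code (the real implementation works on langchain4j's `ChatMessage` types):

```java
import java.util.List;
import java.util.stream.Collectors;

public class ChatMemoryFormat {

    // Hypothetical stand-in for langchain4j's ChatMessage hierarchy.
    record Msg(String role, String text) {}

    // Renders each message as "ROLE: text", one per line -- one plausible
    // "default" format for a {chat_memory} placeholder.
    static String formatChatMemory(List<Msg> messages) {
        return messages.stream()
                .map(m -> m.role().toUpperCase() + ": " + m.text())
                .collect(Collectors.joining("\n"));
    }

    public static void main(String[] args) {
        List<Msg> memory = List.of(
                new Msg("user", "What is Quarkus?"),
                new Msg("ai", "A Kubernetes-native Java stack."));
        System.out.println(formatChatMemory(memory));
    }
}
```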
This is exactly the scenario I'm trying to solve (and which works if I do all the steps "manually"). I have two AiServices; one of them doesn't need to store the messages it produces, but it should still be able to read them! At this point I was wondering if it makes sense to add this new placeholder anyway, or just do all the steps manually and finally use Qute (like here).
I think that
At this point I was wondering if it makes sense to add this new placeholder anyway or just do all the steps manually and finally use Qute (like #881 (comment)).

Okay, I see. In that case there is no way we can know what you are using as memory, and you would have to specify the items manually.
Ok, now I understand (Qute is something new to me). At this point Template Extension Methods are a good way to proceed.
👍🏽 |
So would you like to update the PR or should I? |
I can do that. I'm thinking about what kind of methods to create in the template extension.
Other methods could be created to minimize typing the qute template into the prompt. Sounds good? |
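To make the idea concrete, a sketch of the kind of helpers such an extension could expose. This is not the PR's code: in a real Quarkus application these methods would be annotated with `io.quarkus.qute.TemplateExtension` and operate on langchain4j's `ChatMessage` list, so a prompt template could write e.g. `{chat_memory.last(5)}`. The `Msg` record and method names here are hypothetical:

```java
import java.util.List;

public class ChatMemoryExtensions {

    // Hypothetical stand-in for langchain4j's ChatMessage.
    record Msg(String role, String text) {}

    // Keep only the last n messages -- a common way to bound the prompt size.
    static List<Msg> last(List<Msg> messages, int n) {
        int from = Math.max(0, messages.size() - n);
        return messages.subList(from, messages.size());
    }

    // Keep only the messages sent by the user.
    static List<Msg> userMessages(List<Msg> messages) {
        return messages.stream()
                .filter(m -> m.role().equals("user"))
                .toList();
    }

    public static void main(String[] args) {
        List<Msg> memory = List.of(
                new Msg("user", "hi"),
                new Msg("ai", "hello"),
                new Msg("user", "bye"));
        System.out.println(last(memory, 2));
        System.out.println(userMessages(memory));
    }
}
```

Methods like these would minimize the amount of Qute the user has to type directly into the prompt.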
Sounds good to me! Feel free to add whatever methods you feel make sense. |
I have made some changes to the code to achieve what we have in mind. If there's nothing to change, the code part can be considered closed. What is missing is the documentation. I'm thinking of creating a new section under the AI Services menu to talk about this new feature. WDYT? |
Agreed on both counts :) |
Thanks! I am marking the PR as ready for review. @cescoffier mind taking a quick look? |
```
@@ -428,6 +430,7 @@ private static Optional<SystemMessage> prepareSystemMessage(AiServiceMethodCreat
}

templateParams.put(ResponseSchemaUtil.templateParam(), createInfo.getResponseSchemaInfo().outputFormatInstructions());
templateParams.put("chat_memory", previousChatMessages);
```
Not related to this PR, but it would be great to have a list of all the variables we handle (like `current_date`, `response_schema`, ...)
Yeah, we should improve the docs to include all these