
Chat conversations - How #52

Open
Francks11 opened this issue Jan 2, 2025 · 8 comments

@Francks11

Hello,

I used "Chat conversations" to send messages and keep the history of the conversation.

I am looking for information on how to upload a file in the chat. I couldn't find anything in the documentation.

Currently I used this part of code to send a message with an attachment. I would like the same with a conversation.

public static async Task<String> GenerateContentAsync(String modelName, String message, int maxOutputTokens, String files)
{
    var generationConfig = new GenerationConfig();
    generationConfig.MaxOutputTokens = maxOutputTokens;

    var genAi = new GoogleAI(GEMINI_API_KEY);
    var generativeModel = genAi.GenerativeModel(modelName, generationConfig);
    var request = new GenerateContentRequest(message);

    // 'files' is a pipe-separated list of file paths; upload each one and attach it to the request.
    if (!String.IsNullOrEmpty(files))
    {
        foreach (String file in files.Split('|'))
        {
            var fileUploadResponse = await generativeModel.UploadFile(file, Path.GetFileName(file));
            request.AddMedia(fileUploadResponse.File);
        }
    }

    var response = await generativeModel.GenerateContent(request);

    return String.Format("{0}\n{1}", response?.UsageMetadata?.TotalTokenCount, response?.Text);
}

Thank you for your help.

@Francks11 Francks11 reopened this Jan 2, 2025
@jochenkirstaetter
Contributor

jochenkirstaetter commented Jan 3, 2025

Hi @Francks11

Thanks for the feedback and for the use case of adding files to a chat conversation.
Right now this part is not polished yet and therefore a bit rough to use. I've put your issue on my to-do list to improve the handling.

The first parameter of ChatSession.SendMessage is declared as type object, and two variants are processed:

  • string
  • List<Part>

This means that, at the moment, you have to assemble the list of Parts yourself and pass it to SendMessage as the first parameter. Again, this is not very user-friendly for now. Below is some untested code, written based on the current implementation of the elements mentioned above.

// your code to this point.
var chat = generativeModel.StartChat(history: null);   // or previous history restored.
var prompt = "Your prompt using the file...";
var fileUploadResponse = await generativeModel.UploadFile(file, Path.GetFileName(file));

// Create a combined request for the chat session.
var content = new List<Part>();
// adapted from GenerateContentRequest constructor
content.Add(new Part { Text = prompt });    
// adapted from GenerateContentRequest.AddMedia(FileResource file)
content.Add(new FileData { FileUri = fileUploadResponse.File.Uri, MimeType = fileUploadResponse.File.MimeType });    

var response = await chat.SendMessage(content);  // or SendMessageStream(content);

Hope this helps already.

FYI, I'm going to improve the signature of SendMessage to accept a type of GenerateContentRequest (or similar) to make it easier.

@jochenkirstaetter jochenkirstaetter self-assigned this Jan 3, 2025
@jochenkirstaetter jochenkirstaetter added the enhancement New feature or request label Jan 3, 2025
@Francks11
Author

Francks11 commented Jan 6, 2025

Hello @jochenkirstaetter, thank you for your answer.

I tried it and it works.

I just had to change this line, because it does not compile (FileData implements IPart, but it is not a Part):

content.Add(new FileData { FileUri = fileUploadResponse.File.Uri, MimeType = fileUploadResponse.File.MimeType });

to

Part part = new Part();
part.FileData = part.FromUri(uploadResponse.File.Uri, uploadResponse.File.MimeType);
content.Add(part);

Have a nice day.

Franck

@jochenkirstaetter
Contributor

Hello @Francks11

Thank you so much for your feedback.
I also noticed that there's already a test case Create_From_Chat which covers your use case.

You could shorten/simplify the list like this:

List<Part> parts =
[
    new() { Text = "Hi, could you summarize this transcript?" },
    new( new FileData { FileUri = uploadResponse.File.Uri, MimeType = uploadResponse.File.MimeType })
];

Anyway, the next release is going to incorporate your use case and will offer a simpler way to handle this.

Cheers, JoKi

@jochenkirstaetter
Contributor

Hello @Francks11

New release v2.0.2 provides overloaded SendMessage and SendMessageStream methods that accept a GenerateContentRequest instance directly. It now works analogously to the GenerateContent sample above.

// your code to this point.
var chat = generativeModel.StartChat(history: null);   // or with previous history restored.
var prompt = "Your prompt using the file...";

var request = new GenerateContentRequest(prompt);
if (!String.IsNullOrEmpty(files))
{
    foreach (String file in files.Split('|'))
    {
        var fileUploadResponse = await generativeModel.UploadFile(file, Path.GetFileName(file));
        request.AddMedia(fileUploadResponse.File);
    }
}

var response = await chat.SendMessage(request);

The chat history is maintained, too.
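
For illustration, a small, untested follow-up on the same ChatSession; the file attached in the previous request stays part of the history (chat.History is the same property checked in the library's tests):

// Untested sketch: a plain-text follow-up on the same chat keeps the uploaded file in context.
var followUp = await chat.SendMessage("Could you also list the key points as bullets?");
Console.WriteLine(followUp?.Text);
Console.WriteLine($"Turns in history: {chat.History.Count}");   // grows by two (user + model) per exchange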

Hope this helps. Kindly let me know whether this works for you. Thanks.

Cheers, JoKi

@Francks11
Author

Thank you for the quick release, but when I tried to use it, I got an error:

System.ArgumentException
HResult=0x80070057
Message=Mscc.GenerativeAI.GenerateContentRequest Arg_ParamName_Name
Source=Mscc.GenerativeAI
Call stack:
at Mscc.GenerativeAI.ChatSession.d__14.MoveNext()
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at GeminiAIClient.GeminiChatHelper.SendMessageWithFile(String message, String filePath) in C:\Users\Franck\Documents\Visual Studio 2022\GeminiAIClient\Program2.cs:line 100
at GeminiAIClient.GeminiChatHelper.Main(String[] args) in C:\Users\Franck\Documents\Visual Studio 2022\GeminiAIClient\Program2.cs:line 202

This exception was originally thrown at this call stack:
[External Code]
GeminiAIClient.GeminiChatHelper.SendMessageWithFile(string, string) in Program2.cs
GeminiAIClient.GeminiChatHelper.Main(string[]) in Program2.cs

Example of the code:

var request = new GenerateContentRequest(message);

// upload file
var uploadResponse = googleAi.UploadFile(filePath, Path.GetFileName(filePath)).GetAwaiter().GetResult();
request.AddMedia(uploadResponse.File);

// send message
var response = chatSession.SendMessage(request).GetAwaiter().GetResult();

Franck

@jochenkirstaetter
Contributor

jochenkirstaetter commented Jan 6, 2025

Hello @Francks11

Not sure what the root cause is.
Maybe your uploaded files are still in state PROCESSING instead of ACTIVE, depending on their size. Please check the State of the uploaded file before using the generated FileResource.
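
For example, a minimal, untested polling sketch; it assumes the FileResource exposes Name and State and that the client offers a GetFile method to re-fetch it (names may differ slightly in your version):

// Hypothetical helper: wait until the uploaded file has left the PROCESSING state.
var file = uploadResponse.File;
while (string.Equals(file?.State?.ToString(), "PROCESSING", StringComparison.OrdinalIgnoreCase))
{
    await Task.Delay(TimeSpan.FromSeconds(2));   // brief back-off before re-checking
    file = await googleAi.GetFile(file.Name);    // re-fetch the FileResource from the Files API
}
// Only attach the file to the request once it is ACTIVE.
request.AddMedia(file);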

I created a new test for multimodal chat and it completes successfully.

// Arrange
var systemInstruction = new Content("You are an expert analyzing transcripts.");
var genAi = new GoogleAI(apiKey: fixture.ApiKey);
var model = genAi.GenerativeModel(_model, systemInstruction: systemInstruction);
var chat = model.StartChat();
var filePath = Path.Combine(Environment.CurrentDirectory, "payload", "a11.txt");
var document = await genAi.UploadFile(filePath);
var request = new GenerateContentRequest("Hi, could you summarize this transcript?");
request.AddMedia(document.File);

// Act
var response = await chat.SendMessage(request);
output.WriteLine($"model: {response.Text}");
output.WriteLine("----------");
response = await chat.SendMessage("Okay, could you tell me more about the trans-lunar injection");
output.WriteLine($"model: {response.Text}");

// Assert
model.Should().NotBeNull();
chat.History.Count.Should().Be(4);
response.Should().NotBeNull();
response.Candidates.Should().NotBeNull().And.HaveCount(1);
response.Text.Should().NotBeNull();
output.WriteLine($"model: {response.Text}");

See here: Start_Chat_With_Multimodal_Content for all details.

Hopefully this helps.

Cheers, JoKi

@jochenkirstaetter
Contributor

Hello @Francks11

Which of the two GetAwaiter calls is causing the problem? Could it perhaps be the file upload?
It's not obvious to me from the source code you provided, sorry.
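
For example, splitting the two calls apart with a log line in between should show which one throws (untested, based on your snippet):

// Untested sketch: isolate the upload from the send to see which call raises the ArgumentException.
var uploadResponse = googleAi.UploadFile(filePath, Path.GetFileName(filePath)).GetAwaiter().GetResult();
Console.WriteLine($"Upload OK: {uploadResponse.File?.Name}");   // reaching this line means the upload itself worked

var request = new GenerateContentRequest(message);
request.AddMedia(uploadResponse.File);

var response = chatSession.SendMessage(request).GetAwaiter().GetResult();
Console.WriteLine($"SendMessage OK, response length: {response?.Text?.Length}");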

Cheers, JoKi

@jochenkirstaetter
Contributor

Hello @Francks11

Is this issue still persisting, or did you manage to resolve it?
I'm asking to decide whether to close this issue or keep it open.

Cheers, JoKi
