Allow Part instantiation from function call #3194

Closed
ehaca opened this issue Jan 12, 2024 · 7 comments · Fixed by #3431 · May be fixed by #3196
Comments

ehaca commented Jan 12, 2024

Is your feature request related to a problem? Please describe.
As a user I would like to be able to handle Chat sessions from Generative Models (especially Gemini) on my own side without using a Chat instance. To achieve this, I need to be able to create a Part instance from a function call that was proposed by the LLM the same way I can create an instance from a function response.

Describe the solution you'd like
I can use Part.from_function_call(fn_name, arguments)

Describe alternatives you've considered
Implementing the same snippet on my side, but a) I had to do quite a bit of research to find out how to do it, and I think other people could use such functionality, and b) I think the snippet belongs more in the SDK than in my code.

Additional context
I already implemented a function (5 lines of code or so), see linked PR 😄

product-auto-label bot added the api: vertex-ai (Issues related to the googleapis/python-aiplatform API) label on Jan 12, 2024
Ark-kun (Contributor) commented Jan 18, 2024

Thank you for the feature request and the PR.
Can you please elaborate on why the current API is a blocker for you? Function call parts are always produced by the model: you get them from the model response and pass them back in subsequent calls. Users are not supposed to construct function call parts themselves; they should only construct function response parts.
Note: you can use function calling without using the chat interface.

user_content1 = Content(...)  # the user's prompt
model_content1 = model.generate_content([user_content1]).candidates[0].content  # contains the function_call part
user_content2 = Part.from_function_response(...)  # the result of running the function
model_content2 = model.generate_content([user_content1, model_content1, user_content2]).candidates[0].content

P.S. One reason I'm hesitant to add this feature is that I'm afraid users may get confused and mix up function_call vs function_response. The whole function calling feature is already pretty complex and confusing for people who are just starting to use it.
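
A fuller, self-contained sketch of this flow (assumed usage of the vertexai.generative_models API with a hypothetical get_current_weather tool; not code from this thread):

# Sketch only: the tool, model name, and weather values are made up for illustration.
from vertexai.generative_models import (
    Content,
    FunctionDeclaration,
    GenerativeModel,
    Part,
    Tool,
)

weather_tool = Tool(
    function_declarations=[
        FunctionDeclaration(
            name="get_current_weather",
            description="Get the current weather for a location.",
            parameters={
                "type": "object",
                "properties": {"location": {"type": "string"}},
            },
        )
    ]
)

model = GenerativeModel("gemini-pro", tools=[weather_tool])

# Turn 1: the user asks a question; the model replies with a function_call part.
user_content1 = Content(
    role="user", parts=[Part.from_text("What is the weather in Boston, MA?")]
)
model_content1 = model.generate_content([user_content1]).candidates[0].content
function_call = model_content1.parts[0].function_call  # produced by the model

# Turn 2: the application runs the function and sends back a function_response
# part; no function_call part is constructed by hand.
user_content2 = Content(
    role="user",
    parts=[
        Part.from_function_response(
            name=function_call.name,
            response={"content": {"temperature_c": 7, "conditions": "cloudy"}},
        )
    ],
)
model_content2 = model.generate_content(
    [user_content1, model_content1, user_content2]
).candidates[0].content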

Ark-kun (Contributor) commented Jan 18, 2024

By the way, could you please describe your use cases for function calling?
Are there any other aspects that you'd like to see improved or extended?

ehaca (Author) commented Jan 19, 2024

Hey,
I feel this function is missing for users who want to build abstractions on top of the SDK. I am using my own Message objects to handle my message history, and converting them into valid Content/Part objects is fairly easy and already implemented for almost all kinds of messages, except for function calls. The idea is to be able to handle Messages independently of the LLM I am working with.

I am already using generate_content successfully with the code I put in the PR, but I figured I might not be the only person building systems on top of multiple LLMs, so maybe someone else can use this feature too :)

As for your question: I use function calling a lot. I don't think I can discuss particular use cases in public, but think of something like get_weather. I am also currently missing the ability to combine vision and function calling a bit 🤓

Ark-kun self-assigned this Jan 23, 2024
Ark-kun (Contributor) commented Jan 23, 2024

Would Part.from_dict work for your case?
(Note: I've discovered an issue with the proto library we've been using (googleapis/proto-plus-python#424), so this won't work right now. I'm working on fixing it.)

This is how it's supposed to work:

function_call_part = Part.from_dict({
    "function_call": {
        "name": "get_current_weather",
        "args": {
            "location": "Boston, MA",
        },
    }
})
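
A hypothetical wrapper (not an SDK API) showing how Part.from_dict could back the helper requested in this issue once the proto-plus fix lands:

# The name part_from_function_call is made up for illustration only.
from vertexai.generative_models import Part

def part_from_function_call(name: str, args: dict) -> Part:
    """Rebuild a model-proposed function call, e.g. when replaying stored history."""
    return Part.from_dict({"function_call": {"name": name, "args": args}})

replayed_part = part_from_function_call(
    "get_current_weather", {"location": "Boston, MA"}
)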

ehaca (Author) commented Jan 24, 2024

That would work for me 😄

ehaca (Author) commented Feb 27, 2024

Hey @Ark-kun, do you have any info about a timeline?

copybara-service bot pushed a commit that referenced this issue Mar 13, 2024
…ionResponse`, `Candidate`, `Content`, `Part`)

Workaround for issue in the proto-plus library: googleapis/proto-plus-python#424

Fixes #3194

PiperOrigin-RevId: 604893347
Ark-kun (Contributor) commented Mar 13, 2024

@ehaca Thank you for your patience. I made the fix/workaround some time ago, but it only got submitted today. The next release (expected very soon) will include the fix for Part.from_dict and Content.from_dict.

P.S. To save/load history it's probably simpler to use Content.to_dict/Content.from_dict rather than Part.*.
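
A minimal sketch of that suggestion, assuming the upcoming release with the from_dict fix; the model name and file path are placeholders:

# Save/load history with Content.to_dict / Content.from_dict.
import json

from vertexai.generative_models import Content, GenerativeModel

model = GenerativeModel("gemini-pro")
first_turn = model.generate_content("Hello").candidates[0].content
history = [first_turn]  # plus any user Content objects you track yourself

# Serialize the whole history, including function_call / function_response parts.
with open("history.json", "w") as f:
    json.dump([content.to_dict() for content in history], f)

# Restore it later and continue the conversation with generate_content.
with open("history.json") as f:
    restored_history = [Content.from_dict(d) for d in json.load(f)]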
