feat: add an OpenAI function-calling experiment #2
Conversation
name: "get_artists", | ||
description: `Get a list of artists on Artsy. Artists may be sorted chronologically by creation date, alphabetically by name, or in descending order of a popularity/trending score.`, |
As seen below, this function (which is defined locally) results in a request to the MP root field artists(size: $size, sort: $sort).
Having concise but informative descriptions seems to be a factor in nudging it towards the appropriate function calls.
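For reference, a complete function definition along these lines might look roughly like the sketch below. The size constraints mirror what appears later in this diff, but the sort enum values and required fields are my assumptions rather than the PR's actual code.

```ts
// Hypothetical sketch of a function definition handed to the OpenAI API.
// The `sort` enum values below are assumptions, not Metaphysics' real sort options.
const getArtistsFunction = {
  name: "get_artists",
  description:
    "Get a list of artists on Artsy. Artists may be sorted chronologically by creation date, alphabetically by name, or in descending order of a popularity/trending score.",
  parameters: {
    type: "object",
    properties: {
      size: {
        type: "integer",
        description: "The number of artists to return",
        default: 5,
        minimum: 1,
        maximum: 20,
      },
      sort: {
        type: "string",
        description: "How the returned artists should be ordered",
        enum: ["CREATED_AT_DESC", "SORTABLE_ID_ASC", "TRENDING_DESC"],
      },
    },
    required: ["size", "sort"],
  },
}
```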
name: "get_curated_artists", | ||
description: `Get a list of curated artists on Artsy. These are artists whose works have been highlighted by Artsy curators, and may change from week to week.`, |
Similarly, this results in a call to the MP root field curatedTrendingArtists(first: $size).
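To illustrate how that call might be dispatched, here is a rough sketch; the fetch helper, endpoint, and selected fields are assumptions for illustration, not the experiment's actual code.

```ts
// Hypothetical dispatcher that turns the model's function call into a
// Metaphysics query. The endpoint and field selection are placeholders.
const METAPHYSICS_URL = process.env.METAPHYSICS_URL ?? "" // supplied via env in this sketch

async function metaphysics(query: string, variables: Record<string, unknown>) {
  const response = await fetch(METAPHYSICS_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query, variables }),
  })
  return (await response.json()).data
}

async function executeFunctionCall(name: string, args: { size?: number }) {
  if (name === "get_curated_artists") {
    // Resolves to the curatedTrendingArtists(first: $size) root field mentioned above.
    const query = `
      query CuratedArtists($size: Int) {
        curatedTrendingArtists(first: $size) {
          edges { node { name slug } }
        }
      }`
    return metaphysics(query, { size: args.size ?? 5 })
  }
  // get_artists would map to the artists(size: $size, sort: $sort) root field, etc.
  throw new Error(`Unknown function: ${name}`)
}
```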
model: "gpt-3.5-turbo", | ||
temperature: 0, |
`model`: I should try GPT-4 now that we have access.
`temperature`: Function calling is a case where we want determinism rather than "creativity", so a low temperature is specified here.
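Something like the following, as a minimal sketch assuming the openai v4 Node SDK; the prompt and the inline function definition are placeholders:

```ts
import OpenAI from "openai"

const openai = new OpenAI() // reads OPENAI_API_KEY from the environment

const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo", // worth re-running with "gpt-4" now that we have access
  temperature: 0, // favor deterministic function selection over "creativity"
  messages: [{ role: "user", content: "Who are some trending artists on Artsy?" }],
  functions: [
    {
      name: "get_artists",
      description: "Get a list of artists on Artsy…", // abridged placeholder
      parameters: { type: "object", properties: {} },
    },
  ],
})

// With temperature 0, the same prompt should reliably pick the same function.
console.log(completion.choices[0].message.function_call)
```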
Context
'''
${JSON.stringify(artists, null, 2)}
I added the `null, 2` to pretty-print the supplied JSON context over multiple lines, rather than compacting it into one line, after seeing the LLM misinterpret the compact version of the JSON.
A simple reliability hack noticed by others as well.
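For the record, the difference looks like this:

```ts
const artists = [{ name: "Artist A" }, { name: "Artist B" }]

// Compact form: one long line, which the LLM sometimes misread.
JSON.stringify(artists)
// => '[{"name":"Artist A"},{"name":"Artist B"}]'

// With (value, null, 2): pretty-printed across multiple lines with 2-space indentation.
JSON.stringify(artists, null, 2)
/* =>
[
  {
    "name": "Artist A"
  },
  {
    "name": "Artist B"
  }
]
*/
```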
description: "The number of artists to return", | ||
default: 5, | ||
minimum: 1, | ||
maximum: 20, |
Does the model enforce these constraints, or could I override them via a user prompt? Something like "give me 100 curated artists".
Oh, good thought. I hadn't tried to break out of the limits here, but I just tried it now with "give me 23 trending artists".
And… it quite obediently gave me 23 results 🤦🏽‍♂️
Speaks to the importance of validating the function call signature before actually doing anything with it.
💯
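As a concrete follow-up, here is a minimal sketch of that validation step, clamping the model-supplied arguments to the schema bounds before executing anything. The helper name and fallback are my own, not the PR's code.

```ts
// Hypothetical guard: the schema's default/minimum/maximum are only hints to the
// model, so enforce them ourselves before issuing any Metaphysics query.
function clampSize(rawSize: unknown): number {
  const size =
    typeof rawSize === "number" && Number.isFinite(rawSize)
      ? Math.trunc(rawSize)
      : 5 // fall back to the schema's default
  return Math.min(Math.max(size, 1), 20) // enforce minimum: 1, maximum: 20
}

clampSize(23) // => 20, even when the prompt asks for "23 trending artists"
clampSize("lots") // => 5
```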
This adds a first experiment, using the OpenAI Function Calling capability:
Example: