Fix paligemma detection inference #31587
Conversation
cc @zucchini-nlp, a generation bugfix you might want to look at
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
Verified it works, I can now replicate the same results with and without use_cache. Thank you! 🔥
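The verification described above (identical outputs with and without use_cache) can be sketched as a token-level comparison of the two generated sequences. The helper below is illustrative, not code from the PR; the commented model calls assume a hypothetical PaliGemma setup.

```python
# Sketch of the consistency check described above: generate once with
# use_cache=True and once with use_cache=False, then compare token IDs.
# compare_generations itself is generic and works on any ID sequences.

def compare_generations(ids_with_cache, ids_without_cache):
    """Return the index of the first mismatching token, or -1 if the
    two generated sequences are identical."""
    for i, (a, b) in enumerate(zip(ids_with_cache, ids_without_cache)):
        if a != b:
            return i
    if len(ids_with_cache) != len(ids_without_cache):
        # One sequence is a strict prefix of the other; the first
        # divergence is right after the shared prefix.
        return min(len(ids_with_cache), len(ids_without_cache))
    return -1

# Hypothetical usage with a PaliGemma checkpoint (not run here):
# out_cached = model.generate(**inputs, use_cache=True, do_sample=False)
# out_uncached = model.generate(**inputs, use_cache=False, do_sample=False)
# assert compare_generations(out_cached[0].tolist(),
#                            out_uncached[0].tolist()) == -1
```

Greedy decoding (do_sample=False) is assumed, since sampled outputs would differ between runs regardless of caching.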
thanks 🤗
Is this fix already integrated into the latest release (https://github.com/huggingface/transformers/releases)? I'm not sure the release notes are meant to be exhaustive @ArthurZucker (PS: thanks for representing the MVA)
😉 nice to run into an alumnus!
I really don't see it: no mention of "paligemma", "slow", or "extended" on the page I linked for "v4.42.0: Gemma 2, RTDETR, InstructBLIP, LLAVa Next, New Model Adder", but maybe I have "dirt in my eyes". (Yes, it's nice, especially seeing the success of some like you.) So to conclude, maybe I'm just not seeing it, and if that's the case please accept my apologies for wasting your time. Either way, thank you for confirming that the fix was added; all the best.
What does this PR do?
Fixes #31425
hf hub discussion: https://huggingface.co/google/paligemma-3b-mix-448/discussions/6
Who can review?
@pcuenca @ArthurZucker