Feature request
Add logits processors for token suppression and forced tokens at specific indices.
Enable prompting the decoder of encoder-decoder models with decoder_input_ids.
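Concretely, the first request boils down to two logits processors that plug into the Flax greedy/sampling loops. Below is a minimal sketch of how they might look, assuming the three-argument (input_ids, scores, cur_len) logits-processor interface used by the Flax generation utilities; the class names and the force_token_map format are illustrative, not a final API:

```python
import jax.numpy as jnp


class SuppressTokensLogitsProcessor:
    """Masks out the given token ids at every decoding step by setting their logits to -inf."""

    def __init__(self, suppress_tokens):
        self.suppress_tokens = jnp.array(suppress_tokens)

    def __call__(self, input_ids, scores, cur_len):
        # scores: (batch, vocab) logits for the current step.
        return scores.at[..., self.suppress_tokens].set(-float("inf"))


class ForceTokensLogitsProcessor:
    """Forces a fixed token at given generation indices, e.g. [[1, lang_id], [2, task_id]]."""

    def __init__(self, force_token_map):
        # Dense lookup table: generation index -> forced token id, -1 meaning "nothing forced".
        max_index = max(index for index, _ in force_token_map)
        table = jnp.full(max_index + 1, -1, dtype=jnp.int32)
        for index, token in force_token_map:
            table = table.at[index].set(token)
        self.force_token_array = table

    def __call__(self, input_ids, scores, cur_len):
        # Use jnp.where instead of Python `if` so the processor stays traceable
        # under jit / lax.while_loop, where cur_len is a traced scalar.
        index = jnp.clip(cur_len, 0, self.force_token_array.shape[0] - 1)
        token = jnp.where(cur_len < self.force_token_array.shape[0], self.force_token_array[index], -1)
        forced = jnp.full_like(scores, -float("inf")).at[..., token].set(0.0)
        return jnp.where(token >= 0, forced, scores)
```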
Motivation
Currently, the Flax generation utilities do not support token suppression, forcing specific tokens at specific generation indices, or prompting the decoder (helpful for models like Whisper that accept decoder prompts; Flax Whisper is implemented in #20479). Adding these features would move the Flax utilities closer to feature parity with the PyTorch generation utilities and would fully unlock a Flax implementation of Whisper inference.
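For the decoder-prompting part, the intended usage would look roughly like the following. This is a hypothetical sketch mirroring the PyTorch generate API, not the current Flax signature; model, input_features, and decoder_prompt_ids are placeholders:

```python
# Seed the decoder with prompt tokens instead of only decoder_start_token_id.
generated = model.generate(
    input_features,
    decoder_input_ids=decoder_prompt_ids,  # e.g. Whisper prompt tokens following <|startofprev|>
    max_length=448,
)
```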
Your contribution
I already have these features implemented in a branch of my fork - happy to open a PR!