Fix pad_token_tensor is None in warning (huggingface#34005)
tshu-w authored and BernardZach committed Dec 6, 2024
1 parent fb4f127 · commit 4e39500
Showing 1 changed file with 1 addition and 1 deletion.
src/transformers/generation/utils.py: 1 addition & 1 deletion

@@ -1866,8 +1866,8 @@ def _tensor_or_none(token, device=None):
                    "The attention mask and the pad token id were not set. As a consequence, you may observe "
                    "unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results."
                )
-           logger.warning(f"Setting `pad_token_id` to `eos_token_id`:{pad_token_tensor} for open-end generation.")
            pad_token_tensor = eos_token_tensor[0]
+           logger.warning(f"Setting `pad_token_id` to `eos_token_id`:{pad_token_tensor} for open-end generation.")

        # Sanity checks/warnings
        if self.config.is_encoder_decoder and decoder_start_token_tensor is None:
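Why the one-line move matters: before the change, `pad_token_tensor` was interpolated into the warning message before it was assigned, so the log always printed `None`. Below is a minimal, self-contained sketch of the two orderings; the list `[2]` stands in for the real eos token tensor and is assumed purely for illustration.

import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)

# Stand-in values, assumed for illustration: no pad token configured,
# and an eos token id of 2 (a one-element list stands in for the tensor).
pad_token_tensor = None
eos_token_tensor = [2]

# Before the fix: the warning reads the variable before it is assigned,
# so the message ends with "eos_token_id`:None for open-end generation."
logger.warning(f"Setting `pad_token_id` to `eos_token_id`:{pad_token_tensor} for open-end generation.")
pad_token_tensor = eos_token_tensor[0]

# After the fix: assign first, then log, so the message shows the actual id (2).
pad_token_tensor = None  # reset only for this demonstration
pad_token_tensor = eos_token_tensor[0]
logger.warning(f"Setting `pad_token_id` to `eos_token_id`:{pad_token_tensor} for open-end generation.")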