Docs fix: Multinomial sampling decoding needs "num_beams=1", since by default it is usually not 1. (huggingface#22473)

Fix: Multinomial sampling needs "num_beams=1", since by default it is 5.
manueldeprada authored and raghavanone committed Apr 5, 2023
1 parent 1d2f628 commit 49ae6c8
Showing 1 changed file with 3 additions and 3 deletions.
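
Why the explicit `num_beams=1` matters, as a minimal sketch (hypothetical setup: `gpt2` stands in for a checkpoint whose generation config ships with `num_beams=5`; `gpt2` itself defaults to 1):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "gpt2"  # stand-in; gpt2's own generation config already has num_beams=1
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# Hypothetical: emulate a checkpoint that ships num_beams=5 in its generation config.
model.generation_config.num_beams = 5

inputs = tokenizer("Today was an amazing day because", return_tensors="pt")

# With do_sample=True alone, num_beams is read from the generation config (here 5),
# so generate() runs beam-search multinomial sampling instead of plain sampling.
# Passing num_beams=1 explicitly restores pure multinomial sampling:
outputs = model.generate(**inputs, do_sample=True, num_beams=1, max_new_tokens=20)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```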
6 changes: 3 additions & 3 deletions docs/source/en/generation_strategies.mdx
@@ -216,11 +216,11 @@ We pride ourselves on being the best in the business and our customer service is
### Multinomial sampling

As opposed to greedy search that always chooses a token with the highest probability as the
-next token, multinomial sampling randomly selects the next token based on the probability distribution over the entire
+next token, multinomial sampling (also called ancestral sampling) randomly selects the next token based on the probability distribution over the entire
vocabulary given by the model. Every token with a non-zero probability has a chance of being selected, thus reducing the
risk of repetition.

-To enable multinomial sampling set `do_sample=True`.
+To enable multinomial sampling set `do_sample=True` and `num_beams=1`.

```python
>>> from transformers import AutoTokenizer, AutoModelForCausalLM
@@ -232,7 +232,7 @@ To enable multinomial sampling set `do_sample=True`.
>>> prompt = "Today was an amazing day because"
>>> inputs = tokenizer(prompt, return_tensors="pt")

->>> outputs = model.generate(**inputs, do_sample=True, max_new_tokens=100)
+>>> outputs = model.generate(**inputs, do_sample=True, num_beams=1, max_new_tokens=100)
>>> tokenizer.batch_decode(outputs, skip_special_tokens=True)
['Today was an amazing day because we are now in the final stages of our trip to New York City which was very tough. \
It is a difficult schedule and a challenging part of the year but still worth it. I have been taking things easier and \
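
For reference, the sampling step the docs describe is just a multinomial draw from the softmax distribution over the full vocabulary. A minimal single-step sketch with raw torch calls (illustrative only, not the actual `generate()` internals):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Today was an amazing day because", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits[:, -1, :]  # next-token logits

# Softmax yields a probability distribution over the entire vocabulary;
# torch.multinomial samples one id from it, so any token with non-zero
# probability can be chosen (unlike greedy argmax).
probs = torch.softmax(logits, dim=-1)
next_token = torch.multinomial(probs, num_samples=1)
print(tokenizer.decode(next_token[0]))
```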
