
Conversation pipeline example doesn't work #31158

Closed
1 of 4 tasks

qgallouedec opened this issue May 31, 2024 · 6 comments · Fixed by #31165
Comments

@qgallouedec (Member) commented May 31, 2024

System Info

  • transformers version: 4.40.2
  • Platform: Linux-5.15.0-1048-aws-x86_64-with-glibc2.31
  • Python version: 3.10.14
  • Huggingface_hub version: 0.23.0
  • Safetensors version: 0.4.3
  • Accelerate version: 0.30.1
  • Accelerate config: not found
  • PyTorch version (GPU?): 2.3.0+cu121 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: Yes
  • Using distributed or parallel set-up in script?: No

Who can help?

@stevhliu

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

```python
>>> from transformers import pipeline, Conversation

>>> # Any model with a chat template can be used in a ConversationalPipeline.
>>> chatbot = pipeline(model="facebook/blenderbot-400M-distill")
>>> # Conversation objects initialized with a string will treat it as a user message
>>> conversation = Conversation("I'm looking for a movie - what's your favourite one?")
>>> conversation = chatbot(conversation)
```

```
Traceback (most recent call last):
  File "/fsx/qgallouedec/trl/conv.py", line 6, in <module>
    conversation = chatbot(conversation)
  File "/fsx/qgallouedec/miniconda3/envs/trl/lib/python3.10/site-packages/transformers/pipelines/text2text_generation.py", line 167, in __call__
    result = super().__call__(*args, **kwargs)
  File "/fsx/qgallouedec/miniconda3/envs/trl/lib/python3.10/site-packages/transformers/pipelines/base.py", line 1242, in __call__
    return self.run_single(inputs, preprocess_params, forward_params, postprocess_params)
  File "/fsx/qgallouedec/miniconda3/envs/trl/lib/python3.10/site-packages/transformers/pipelines/base.py", line 1248, in run_single
    model_inputs = self.preprocess(inputs, **preprocess_params)
  File "/fsx/qgallouedec/miniconda3/envs/trl/lib/python3.10/site-packages/transformers/pipelines/text2text_generation.py", line 177, in preprocess
    inputs = self._parse_and_tokenize(inputs, truncation=truncation, **kwargs)
  File "/fsx/qgallouedec/miniconda3/envs/trl/lib/python3.10/site-packages/transformers/pipelines/text2text_generation.py", line 129, in _parse_and_tokenize
    raise ValueError(
ValueError:  `args[0]`: Conversation id: 44b49edc-7f94-45fc-ac5c-9166b7fb7b1e
user: I'm looking for a movie - what's your favourite one?
 have the wrong format. The should be either of type `str` or type `list`
```

Expected behavior

The example should work as documented.

The failure is probably linked to the pipeline being resolved to the wrong task. When you pass the task explicitly:

```python
chatbot = pipeline(model="facebook/blenderbot-400M-distill", task="conversational")
```

it works.

@LysandreJik (Member) commented:

cc @Rocketknight1

@RUFFY-369 (Contributor) commented May 31, 2024

Hi @qgallouedec, I believe models on the Hub need to define their task in their metadata; when they do, you don't need to pass the `task` kwarg while creating the pipeline instance. For example, the random model Invincible/Chat_bot-Harrypotter-small, which I just checked, works without the `task` kwarg. The same goes for the example in ConversationalPipeline with the model facebook/blenderbot-400M-distill. That's why it works when you pass `task="conversational"` explicitly, as you mentioned above.

@Rocketknight1 (Member) commented:
Hi @qgallouedec, the ConversationalPipeline is actually deprecated and will be removed soon. This functionality has been moved to TextGenerationPipeline.

For a more up-to-date guide on chatting with Transformer models, try this guide: https://huggingface.co/docs/transformers/main/en/conversations. You can swap the "meta-llama/Meta-Llama-3-8B-Instruct" model in those examples for any model you like, as long as it's been trained for chat!
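For context, the chat format used by TextGenerationPipeline replaces the Conversation object with a plain list of role/content dicts. The snippet below is a minimal sketch of that shape; the commented-out pipeline call and the placeholder assistant reply are illustrative, not real model output:

```python
# Chat history for TextGenerationPipeline is a plain list of message dicts,
# each with a "role" ("system", "user", or "assistant") and a "content" string.
messages = [
    {"role": "user", "content": "I'm looking for a movie - what's your favourite one?"},
]

# With a chat-trained model, the guide linked above runs something like:
#   chatbot = pipeline("text-generation", model="meta-llama/Meta-Llama-3-8B-Instruct")
#   messages = chatbot(messages)[0]["generated_text"]
# Here we append a placeholder assistant reply to show the round-trip shape.
messages.append({"role": "assistant", "content": "(model reply goes here)"})

# Continuing the conversation is just appending another user turn.
messages.append({"role": "user", "content": "Why do you like action movies?"})

print(len(messages))  # 3
print(messages[-1]["role"])  # user
```

The list itself carries the whole conversation state, so there is no dedicated conversation object to keep in sync.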

@qgallouedec (Member, Author) commented:

> Hi @qgallouedec, the ConversationalPipeline is actually deprecated and will be removed soon. This functionality has been moved to TextGenerationPipeline.

Thanks @Rocketknight1 for your quick answer! However, my point was more about the documentation than about how to use a model for conversation. The example provided in the documentation is not functional, which is problematic for users who rely on these examples to learn how to use the library.

Regarding the deprecation of ConversationalPipeline, I think the best approach would be to:

  • keep it functional as long as it is not removed,
  • warn at runtime that the user is using a deprecated feature,
  • document the deprecation (see Pipelines).
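The runtime-warning point follows the usual Python deprecation pattern; the sketch below is a generic illustration with a hypothetical class, not the actual transformers implementation:

```python
import warnings


class ConversationalPipelineSketch:
    """Hypothetical stand-in showing the deprecation-warning pattern."""

    def __init__(self):
        # Emit a FutureWarning at construction so users see the notice
        # while the deprecated pipeline keeps working until its removal.
        warnings.warn(
            "ConversationalPipeline is deprecated and will be removed in a "
            "future release; use TextGenerationPipeline with chat messages.",
            FutureWarning,
            stacklevel=2,
        )


# Demonstrate that constructing the sketch emits exactly one FutureWarning.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    ConversationalPipelineSketch()

print(caught[0].category.__name__)  # FutureWarning
```

Using FutureWarning rather than DeprecationWarning makes the notice visible to end users by default, since DeprecationWarning is hidden outside of `__main__`.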

> For a more up-to-date guide on chatting with Transformer models, try this guide: https://huggingface.co/docs/transformers/main/en/conversations. You can swap the "meta-llama/Meta-Llama-3-8B-Instruct" model in those examples for any model you like, as long as it's been trained for chat!

Thanks for the reference. I think it would make sense to put that link in place of all of the following, wdyt?

Example:
```python
>>> from transformers import pipeline, Conversation
# Any model with a chat template can be used in a ConversationalPipeline.
>>> chatbot = pipeline(model="facebook/blenderbot-400M-distill")
>>> # Conversation objects initialized with a string will treat it as a user message
>>> conversation = Conversation("I'm looking for a movie - what's your favourite one?")
>>> conversation = chatbot(conversation)
>>> conversation.messages[-1]["content"]
"I don't really have a favorite movie, but I do like action movies. What about you?"
>>> conversation.add_message({"role": "user", "content": "That's interesting, why do you like action movies?"})
>>> conversation = chatbot(conversation)
>>> conversation.messages[-1]["content"]
" I think it's just because they're so fast-paced and action-fantastic."
```
Learn more about the basics of using a pipeline in the [pipeline tutorial](../pipeline_tutorial)
This conversational pipeline can currently be loaded from [`pipeline`] using the following task identifier:
`"conversational"`.

@Rocketknight1 (Member) commented:

@qgallouedec there is already a deprecation warning at runtime in ConversationalPipeline, although you're right that we should have added this to the documentation as well.

Regardless, it's actually due for removal in the next version, so rather than updating the documentation, I'll take this as a sign that it's time to finally remove it entirely!

@Rocketknight1 (Member) commented:

@qgallouedec PR is open at #31165!
