Xformers is not installed correctly. #24903

Closed
4 tasks
david-waterworth opened this issue Jul 19, 2023 · 4 comments · Fixed by #24960
Comments

@david-waterworth

david-waterworth commented Jul 19, 2023

System Info

  • transformers version: 4.30.2
  • Platform: Linux-5.15.0-76-generic-x86_64-with-glibc2.35
  • Python version: 3.10.6
  • Huggingface_hub version: 0.16.4
  • Safetensors version: 0.3.1
  • PyTorch version (GPU?): 2.0.1+cu117 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: yes
  • Using distributed or parallel set-up in script?: no

Who can help?

No response

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

from transformers import pipeline
pipe = pipeline("text-classification", model="roberta-base", device=0)

Edit: I know this model isn't trained for the "text-classification" task; I get the same problem with a private model I fine-tuned.

Results in the message

...
Xformers is not installed correctly. If you want to use memory_efficient_attention to accelerate training use the following command to install Xformers
pip install xformers.

But I'm using torch==2.0.1, and the memory-efficient attention documentation states: "If you have PyTorch 2.0 installed, you shouldn't use xFormers!"

The message is confusing: I have torch 2.0 installed, and pipeline is for inference, not training. The message doesn't occur if I use AutoModelForSequenceClassification.from_pretrained instead.

Expected behavior

The documentation and the warning message are inconsistent.

@sgugger
Collaborator

sgugger commented Jul 19, 2023

It looks like the pipeline is back to importing every model (this message comes from trying to access an unrelated model). I'll have a look later this week. You can ignore that warning in the meantime, it's irrelevant.
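Until the fix lands, one way to hide the warning is to raise the transformers logger's threshold. This is a minimal sketch, assuming the message is emitted through Python's standard logging module under the usual "transformers" logger name (an assumption, not confirmed in this thread):

```python
import logging

# Assumption: the xformers message goes through the standard logging
# module under the "transformers" logger. Raising the threshold to
# ERROR suppresses WARNING-level records on that logger only, without
# touching other libraries' loggers.
logging.getLogger("transformers").setLevel(logging.ERROR)
```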

@sgugger
Collaborator

sgugger commented Jul 20, 2023

Should be fixed by the PR linked above.
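For context, the idea behind such a fix — importing a model's module only when it is actually needed, so that building one pipeline does not trigger side effects (like the xformers warning) from unrelated model files — can be sketched with the standard library alone. This is a hypothetical illustration of the lazy-import pattern, not the actual code from the linked PR:

```python
import importlib

# Hypothetical sketch: resolve a class only when it is first requested,
# so nothing outside the requested module gets imported up front.
def lazy_get_class(module_path, class_name):
    module = importlib.import_module(module_path)  # imported on demand
    return getattr(module, class_name)

# Example with a stdlib class standing in for a model class:
OrderedDict = lazy_get_class("collections", "OrderedDict")
d = OrderedDict(a=1)
```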

@engageintellect

Same issue for me on this basic example:

import argparse
from transformers import pipeline

# Create the parser
parser = argparse.ArgumentParser(description="Perform sentiment analysis")

# Add an argument
parser.add_argument('Text', type=str, help="the text to analyze")

# Parse the argument
args = parser.parse_args()

# Load the classifier
classifier = pipeline("sentiment-analysis", model="distilbert-base-uncased-finetuned-sst-2-english")

# Perform sentiment analysis
res = classifier(args.Text)

# Print the result
print(res)

Reinstalled transformers; using v4.31.0.

@sgugger
Collaborator

sgugger commented Aug 18, 2023

The fix is not in v4.31.0, you will need to use a source install.
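A source install typically means installing straight from the repository's main branch, for example with pip's VCS syntax (check the transformers README for the recommended command):

```shell
# Install transformers from the main branch (a source install), which
# includes fixes not yet in a tagged release such as v4.31.0.
pip install git+https://github.com/huggingface/transformers
```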
