Running very slowly and printing warnings about "not used weights" #13

chris-opendata opened this issue Jul 9, 2024

I followed the instructions to evaluate summarization with the default text2text task, but with no_cuda=False, passing in source documents, references, and generated summaries. After running through several batches of samples (e.g. 2560 samples), the evaluation suddenly slows almost to a halt.
It also raises the following warning from time to time:

Some weights of the model checkpoint at bert-base-multilingual-cased were not used when initializing BertModel: ['cls.seq_relationship.weight', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.transform.dense.weight', 'cls.seq_relationship.bias', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.LayerNorm.bias']

  • This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
  • This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).

I also noticed logged messages like the following:

python3.8/site-packages/datasets/metric.py - Removing .cache/huggingface/metrics/bert_score/default/default_experiment-1-0.arrow

I have also tried the summarization option with (task="summarization", do_weighter=True, use_cache=True, no_cuda=False) and hit the same problems.
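For reference, the setup described above corresponds roughly to the invocation below. This is a sketch based on the questeval 0.2.x README, not the reporter's actual script: the data lists are placeholders, and running it requires the questeval package and its model downloads.

```python
# Hypothetical reconstruction of the evaluation call described in this issue.
# Argument names follow the questeval 0.2.x README; the data variables
# (sources, hypotheses, references) are placeholders.
from questeval.questeval_metric import QuestEval

questeval = QuestEval(
    task="summarization",  # the default is "text2text"
    do_weighter=True,
    use_cache=True,
    no_cuda=False,         # evaluate on GPU
)

sources = ["Full source document text ..."]
hypotheses = ["Generated summary ..."]
references = [["Reference summary ..."]]  # one list of references per sample

# Corpus-level scoring; this is the call that reportedly slows to a halt
# after a few thousand samples.
score = questeval.corpus_questeval(
    hypothesis=hypotheses,
    sources=sources,
    list_references=references,
)
print(score)
```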

I installed the following version:

questeval==0.2.4

Any help would be appreciated.

Thank you.
