I ran into an OOM problem when using BertTokenizer, as #1539 reports.
I then tried tokenizer._tokenizer.model.clear_cache() (and tokenizer._tokenizer.model._clear_cache()) to clear the cache.
However, I got an error: AttributeError: 'tokenizers.models.WordPiece' object has no attribute 'clear_cache'. Could anyone tell me how to fix it?
In the source code, it seems that clear_cache is only supported for the BPE and Unigram models, not for WordPiece. Is that the reason? If so, could anyone give me some advice on how to work around this?
Environment:
Linux, CPU only
tokenizers==0.21.0
transformers==4.49.0
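As a stopgap, one possible pattern is to call clear_cache() only when the underlying model actually exposes it, and otherwise rebuild the tokenizer so the old object (and its cache) can be garbage-collected. This is a minimal sketch, not a fix inside tokenizers; the function name clear_model_cache and the rebuild callback are hypothetical, and it assumes clear_cache() exists on BPE/Unigram models as described above.

```python
# Sketch (hypothetical helper): dispatch on whether the model object exposes
# clear_cache(); if not (e.g. WordPiece), fall back to a caller-supplied
# rebuild function that re-creates the tokenizer/model from scratch.

def clear_model_cache(model, rebuild):
    """Return (model, cleared_in_place).

    If the model has a callable clear_cache(), call it and keep the same
    object; otherwise call rebuild() and return the fresh object so the
    old one can be garbage-collected along with its cache.
    """
    clear = getattr(model, "clear_cache", None)
    if callable(clear):
        clear()                   # BPE / Unigram path
        return model, True
    return rebuild(), False       # WordPiece path: rebuild instead
```

With a real WordPiece tokenizer, rebuild could be something like `lambda: BertTokenizer.from_pretrained(name)`; re-creating the object is heavier than clearing a cache in place, but it avoids the AttributeError.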