Added support for Tapas Model (#520)
* Added support for Tapas Model

* Added support for Tapas Model

* reformatted files with black

* Update tests/bettertransformer/test_bettertransformer_encoder.py

test_better_encoder

Co-authored-by: Michael Benayoun <[email protected]>

* Update optimum/bettertransformer/models/encoder_models.py

Co-authored-by: Michael Benayoun <[email protected]>

* Update tests/bettertransformer/test_bettertransformer_encoder.py

Co-authored-by: Sylvain Gugger <[email protected]>

* Styled optimum files

* Update optimum/bettertransformer/models/encoder_models.py

Co-authored-by: fxmarty <[email protected]>

* Update optimum/bettertransformer/models/encoder_models.py

Call super() in the __init__() to inherit from BetterTransformerBaseLayer

Co-authored-by: fxmarty <[email protected]>

* Update tests/bettertransformer/test_bettertransformer_encoder.py

Co-authored-by: Younes Belkada <[email protected]>

* Moved Tapas Encoder model to Encoder

* change mapping in __init__.py

* deleted

* Update __init__.py

* Update __init__.py

* refactor doc + remove class + styling

Co-authored-by: Michael Benayoun <[email protected]>
Co-authored-by: Sylvain Gugger <[email protected]>
Co-authored-by: fxmarty <[email protected]>
Co-authored-by: Younes Belkada <[email protected]>
Co-authored-by: younesbelkada <[email protected]>
6 people authored Nov 30, 2022
1 parent 6ee424b commit d33801d
Showing 3 changed files with 4 additions and 1 deletion.
3 changes: 2 additions & 1 deletion docs/source/bettertransformer/overview.mdx
@@ -14,7 +14,6 @@ specific language governing permissions and limitations under the License.
 
 🤗 Optimum provides an integration with `BetterTransformer`, a stable API from PyTorch to benefit from interesting speedups on CPU & GPU through sparsity and fused kernels.
 
-
 ## Quickstart
 
 Since its 1.13 version, PyTorch has included the stable version of `BetterTransformer` in its library. You can benefit from interesting speedups on most consumer-type devices, including CPUs and older and newer versions of NVIDIA GPUs.
@@ -23,6 +22,7 @@ You can now use this feature in 🤗 Optimum together with Transformers and use
 ### Supported models
 
 The list of supported models below:
+
 - [AlBERT](https://arxiv.org/abs/1909.11942)
 - [BART](https://arxiv.org/abs/1910.13461)
 - [BERT](https://arxiv.org/abs/1810.04805)
@@ -41,6 +41,7 @@ The list of supported models below:
 - [M2M100](https://arxiv.org/abs/2010.11125)
 - [RoBERTa](https://arxiv.org/abs/1907.11692)
 - [Splinter](https://arxiv.org/abs/2101.00438)
+- [Tapas](https://arxiv.org/abs/2211.06550)
 - [ViLT](https://arxiv.org/abs/2102.03334)
 - [ViT](https://arxiv.org/abs/2010.11929)
 - [ViT-MAE](https://arxiv.org/abs/2111.06377)
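For context on what this documentation change enables, here is a minimal usage sketch (not part of this commit) of converting a Tapas model with the `BetterTransformer` API from 🤗 Optimum; the checkpoint name is the tiny test model referenced in the test diff further below:

```python
# Minimal sketch (not part of this commit): convert a Tapas model to its
# BetterTransformer version via the 🤗 Optimum API.
from transformers import TapasModel
from optimum.bettertransformer import BetterTransformer

model = TapasModel.from_pretrained("hf-internal-testing/tiny-random-TapasModel")
# Returns a copy of the model whose encoder layers use PyTorch's fused kernels.
bt_model = BetterTransformer.transform(model, keep_original_model=True)
```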
1 change: 1 addition & 0 deletions optimum/bettertransformer/models/__init__.py
@@ -29,6 +29,7 @@
 
 BETTER_TRANFORMER_LAYERS_MAPPING_DICT = {
     # Bert Family
+    "TapasLayer": BertLayerBetterTransformer,
     "BertLayer": BertLayerBetterTransformer,
     "ElectraLayer": BertLayerBetterTransformer,
     "Data2VecTextLayer": BertLayerBetterTransformer,
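This mapping is what drives the conversion: a Transformers layer class name is looked up and the module is replaced with the corresponding BetterTransformer layer, so `TapasLayer` reuses `BertLayerBetterTransformer` because Tapas shares BERT's encoder layout. An illustrative sketch of that lookup follows; `convert_layers` and its signature are hypothetical, not the library's actual code:

```python
import torch.nn as nn

# Hypothetical helper, for illustration only: walk the model and swap every
# module whose class name appears in the mapping (e.g. "TapasLayer") for the
# mapped BetterTransformer layer class (e.g. BertLayerBetterTransformer).
def convert_layers(model: nn.Module, mapping: dict, config) -> nn.Module:
    for name, child in model.named_children():
        bt_cls = mapping.get(child.__class__.__name__)
        if bt_cls is not None:
            setattr(model, name, bt_cls(child, config))
        else:
            convert_layers(child, mapping, config)
    return model
```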
1 change: 1 addition & 0 deletions tests/bettertransformer/test_bettertransformer_encoder.py
@@ -44,6 +44,7 @@
     "hf-internal-testing/tiny-random-MarkupLMModel",
     "hf-internal-testing/tiny-random-BertModel",
     "ybelkada/random-tiny-BertGenerationModel",
+    "hf-internal-testing/tiny-random-TapasModel",
 ]
 
 ALL_ENCODER_DECODER_MODELS_TO_TEST = [
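Adding the tiny Tapas checkpoint to the encoder test list runs it through the shared parity checks. A hedged sketch of what such a check can look like (names, inputs, and tolerance here are illustrative, not the repository's exact test code):

```python
import torch
from transformers import AutoModel
from optimum.bettertransformer import BetterTransformer

# Illustrative parity check, not the repository's exact test: the converted
# model should produce (near-)identical hidden states to the original one.
model_id = "hf-internal-testing/tiny-random-TapasModel"
model = AutoModel.from_pretrained(model_id).eval()
bt_model = BetterTransformer.transform(model, keep_original_model=True)

inputs = {
    "input_ids": torch.tensor([[101, 7592, 102]]),
    "attention_mask": torch.tensor([[1, 1, 1]]),
}
with torch.no_grad():
    ref = model(**inputs).last_hidden_state
    out = bt_model(**inputs).last_hidden_state

assert torch.allclose(ref, out, atol=1e-4), "outputs diverge after conversion"
```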
