Commit

Add flex_attn to diffllama (#35601)
Add flex attention (flex_attn) support to diffllama
muellerzr authored Jan 9, 2025
1 parent 1e3ddcb commit 8de7b1b
Showing 1 changed file with 1 addition and 0 deletions.
1 change: 1 addition & 0 deletions src/transformers/models/diffllama/modeling_diffllama.py
@@ -595,6 +595,7 @@ class DiffLlamaPreTrainedModel(PreTrainedModel):
     _skip_keys_device_placement = ["past_key_values"]
     _supports_flash_attn_2 = True
     _supports_sdpa = True
+    _supports_flex_attn = True
     _supports_cache_class = True
     _supports_quantized_cache = True
     _supports_static_cache = True
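The `_supports_flex_attn` flag follows the same capability-flag pattern as the existing `_supports_flash_attn_2` and `_supports_sdpa` attributes: a class-level boolean that the library consults before dispatching to an attention backend. A minimal sketch of that pattern is below; the class names and the `supported_attention_backends` helper are illustrative stand-ins, not the actual transformers internals.

```python
# Sketch of the capability-flag pattern used in the diff above.
# The classes and helper here are hypothetical; only the flag names
# (_supports_flash_attn_2, _supports_sdpa, _supports_flex_attn)
# mirror the real modeling code.
class FakePreTrainedModel:
    # Backend support flags; subclasses opt in by overriding with True.
    _supports_flash_attn_2 = False
    _supports_sdpa = False
    _supports_flex_attn = False

    @classmethod
    def supported_attention_backends(cls):
        """Return the attention backends this model class opts into."""
        backends = ["eager"]  # eager attention is the universal fallback
        if cls._supports_flash_attn_2:
            backends.append("flash_attention_2")
        if cls._supports_sdpa:
            backends.append("sdpa")
        if cls._supports_flex_attn:
            backends.append("flex_attention")
        return backends


class FakeDiffLlamaPreTrainedModel(FakePreTrainedModel):
    # Mirrors the flags on DiffLlamaPreTrainedModel after this commit.
    _supports_flash_attn_2 = True
    _supports_sdpa = True
    _supports_flex_attn = True


print(FakeDiffLlamaPreTrainedModel.supported_attention_backends())
```

In practice, a user selects a backend at load time (e.g. passing `attn_implementation="flex_attention"` to `from_pretrained`), and the library rejects the request if the model class has not set the corresponding flag.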
