Metadata-Version: 2.1
Name: flash-attn
Version: 2.3.6
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Home-page: https://github.com/Dao-AILab/flash-attention
Author: Tri Dao
Author-email: [email protected]
Classifier: Programming Language :: Python :: 3
Classifier: License :: OSI Approved :: BSD License
Classifier: Operating System :: Unix
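For reference, PEP 440 local version identifiers are the part of a version string after a `+`. A minimal sketch of the mismatch being reported (the `+cu118torch2.1` suffix is a hypothetical example, not an actual release tag):

```python
# Hypothetical example: the wheel filename carries a local version
# segment, while the "Version" field in the METADATA above does not.
filename_version = "2.3.6+cu118torch2.1"  # hypothetical local segment
metadata_version = "2.3.6"                # as in the METADATA shown above

# Per PEP 440, the local version segment follows a "+".
public, _, local = filename_version.partition("+")
print(public)                      # -> 2.3.6
print(local)                       # -> cu118torch2.1
print(metadata_version == public)  # -> True: METADATA keeps only the public part
```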
Hi team,

This is an issue following up this PR: #229

Right now the released wheels, though they have local versions in their filenames, don't encode the proper local version in the `METADATA` file (shown above): the `Version` field reads only `2.3.6`, without the local version segment.

It would be useful to encode the local version in the wheel metadata as well, and I believe the only thing needed is to add a `FLASH_ATTN_LOCAL_VERSION` env var to the GH Actions flow.

Would be happy to raise a tentative PR if this is something you agree on. Thanks!
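As a rough sketch of what consuming that env var could look like (a hedged illustration, not the project's actual setup.py; the `get_package_version` helper and the `cu118torch2.1` value are assumptions for the example):

```python
import os

def get_package_version(public_version: str = "2.3.6") -> str:
    # Hypothetical helper: append a PEP 440 local version segment
    # when FLASH_ATTN_LOCAL_VERSION is set (e.g. by the GH Actions flow).
    local_version = os.environ.get("FLASH_ATTN_LOCAL_VERSION")
    if local_version:
        return f"{public_version}+{local_version}"
    return public_version

# With the env var set in CI, the version written into the wheel's
# METADATA would carry the local segment:
os.environ["FLASH_ATTN_LOCAL_VERSION"] = "cu118torch2.1"  # hypothetical value
print(get_package_version())  # -> 2.3.6+cu118torch2.1
```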