
Handle flash_attn version update in transformers main #643

Closed
wants to merge 5 commits

Commits on Oct 3, 2023

  1. Fix llama attn (ad6fd55), committed by irenedea on Oct 3, 2023

Commits on Oct 9, 2023

  1. c3ad4b4

Commits on Oct 10, 2023

  1. 5d667f5
  2. 4becd16
  3. b71ea7d