
Replace scaled_dot_product_attention lowering pass with decomposition #3296

Open. Wants to merge 8 commits into base: main.
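The PR title describes replacing a dedicated lowering pass with a decomposition, i.e. expressing `scaled_dot_product_attention` in terms of simpler ops that a backend can convert individually. As an illustrative sketch only (this is not the PR's actual code; the function name and signature are hypothetical), the math behind such a decomposition looks like:

```python
import math

import torch
import torch.nn.functional as F


def sdpa_decomposition(query, key, value, attn_mask=None, scale=None):
    # Hypothetical sketch: rewrite scaled_dot_product_attention as
    # matmul + softmax + matmul, mirroring its mathematical definition.
    scale = scale if scale is not None else 1.0 / math.sqrt(query.size(-1))
    attn = query @ key.transpose(-2, -1) * scale
    if attn_mask is not None:
        attn = attn + attn_mask
    attn = torch.softmax(attn, dim=-1)
    return attn @ value


# Sanity check against PyTorch's fused implementation.
q = torch.randn(2, 4, 8, 16)
k = torch.randn(2, 4, 8, 16)
v = torch.randn(2, 4, 8, 16)
expected = F.scaled_dot_product_attention(q, k, v)
assert torch.allclose(sdpa_decomposition(q, k, v), expected, atol=1e-5)
```

Decomposing into primitive ops like this lets each op be converted on its own instead of maintaining a special-cased lowering for the fused operator.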

Commits on Nov 17, 2024

  1. 354c2bc
Commits on Nov 19, 2024

  1. Remove unnecessary dropout
     HolyWu committed Nov 19, 2024 (ee457be)
  2. Use export to trace the graph
     HolyWu committed Nov 19, 2024 (afc16d6)

Commits on Nov 20, 2024

  1. Add test for dynamic shape
     HolyWu committed Nov 20, 2024 (6e2663e)

Commits on Nov 24, 2024

  1. a94f1e1
  2. Cosmetics
     HolyWu committed Nov 24, 2024 (c3d770c)
  3. Remove unused math import
     HolyWu committed Nov 24, 2024 (439039f)
  4. torch._dynamo.reset
     HolyWu committed Nov 24, 2024 (34cba83)