[Bug][Relay] fix relay frontend pytorch op addmm bug #15294
Conversation
Fix a calculation-formula error in the Relay PyTorch frontend's `addmm` converter.

Buggy behavior: `out = input + alpha * beta * mat1 @ mat2`

Fixed behavior: `out = beta * input + alpha * mat1 @ mat2`
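For reference, `torch.addmm(input, mat1, mat2, beta=..., alpha=...)` computes `beta * input + alpha * (mat1 @ mat2)`. The following is a minimal NumPy sketch (illustrative only, not the converter code) contrasting the buggy formula with the corrected one:

```python
import numpy as np

rng = np.random.default_rng(0)
inp = rng.standard_normal((2, 3))
mat1 = rng.standard_normal((2, 4))
mat2 = rng.standard_normal((4, 3))
beta, alpha = 2.0, 3.0

# Buggy formula from the old converter: beta mistakenly scales the
# matrix product instead of `input`.
buggy = inp + alpha * beta * (mat1 @ mat2)

# Correct torch.addmm semantics: beta scales `input`, alpha scales
# the product mat1 @ mat2.
fixed = beta * inp + alpha * (mat1 @ mat2)

# The two formulas only coincide when beta == 1.
assert not np.allclose(buggy, fixed)
```

Note that with the default `beta=1, alpha=1` the two formulas agree, which is why the bug only surfaces when non-default scaling factors are used.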
Thanks for contributing to TVM! Please refer to the contributing guidelines (https://tvm.apache.org/docs/contribute/) for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment. Generated by tvm-bot
Please add a regression test for it.

OK.
Thanks for your PR, LGTM!
Please correct the code style according to the CI error: https://ci.tlcpack.ai/blue/organizations/jenkins/tvm-lint/detail/PR-15294/4/pipeline
https://ci.tlcpack.ai/blue/organizations/jenkins/tvm-lint/detail/PR-15294/5/artifacts I don't know what caused this error, and I don't know how to fix it.
Change the code format as follows:
@jikechao Thanks!
fix relay frontend pytorch op: addmm
calculation formula error.
bug:
out = input + alpha * beta * mat1 @ mat2
fixed:
out = beta * input + alpha * mat1 @ mat2