
[MLIR][TORCH] Add E2E support for prims.convert_element_type op #1619

Merged

Conversation

@vivekkhandelwal1 (Collaborator)

Signed-off-by: Vivek Khandelwal [email protected]

Review comment on lib/Dialect/Torch/IR/TorchOps.cpp (outdated, resolved)
@pashu123 (Member) left a comment:

LGTM!

@silvasean (Contributor)

Hey Vivek, can you give more context on where you are seeing all these prims ops from? I wasn't aware that upstream was using those ops yet.

@vivekkhandelwal1 (Collaborator, Author)

> Hey Vivek, can you give more context on where you are seeing all these prims ops from? I wasn't aware that upstream was using those ops yet.

I'm seeing these ops during the stable diffusion backward graph generation. Even for the DistilGPT2 model, this op has appeared in the IR.
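For context, `prims.convert_element_type` is a plain dtype cast in PyTorch's primitive-op set. A minimal sketch of the op being lowered here, assuming a PyTorch build recent enough to expose `torch.ops.prims` (available around the time of this PR):

```python
import torch

# prims.convert_element_type is PyTorch's primitive dtype cast. Tracing
# AMP or backward graphs through torch's decompositions can emit this op
# directly, which is why an end-to-end lowering for it is needed.
x = torch.randn(2, 3)  # float32 input
y = torch.ops.prims.convert_element_type(x, torch.float16)

assert y.dtype == torch.float16
assert y.shape == x.shape
```

The op only changes the element type; shape and values (up to rounding) are preserved.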

@tanyokwok (Collaborator)

tanyokwok commented Nov 21, 2022

> Hey Vivek, can you give more context on where you are seeing all these prims ops from? I wasn't aware that upstream was using those ops yet.

I found those prims ops in BERT training with AMP enabled as well.

@vivekkhandelwal1 vivekkhandelwal1 merged commit 68f568b into llvm:main Nov 22, 2022
@vivekkhandelwal1 vivekkhandelwal1 deleted the prims-convert_element_type branch November 22, 2022 04:06

4 participants