[Bug] Trying to import BERT Transformer onto TVM #17480
Labels
- needs-triage: PRs or issues that need to be investigated by maintainers to find the right assignees to address it
- type: bug
Expected behavior
I'm currently trying to import the BERT Transformer into TVM; I expect tvm.relay.from_pytorch to convert the traced model into a Relay module without errors.
Actual behavior
Line of code causing the issue:
Environment
- OS: Linux
- TVM: 0.19.dev (built from source)
- PyTorch: 2.4.0+cu121
- Transformers: 4.39.3
- CUDA: 12.4
Steps to reproduce
Here's the initial code I was trying to use, which I found online:
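This is a minimal sketch of that flow, assuming the Hugging Face bert-base-uncased checkpoint, torch.jit.trace for capturing the graph, and a CUDA build target; the exact model name, example text, and shapes in my script may differ.

```python
import torch
import tvm
from tvm import relay
from transformers import BertModel, BertTokenizer

# Load a pretrained BERT in TorchScript-friendly mode and switch to eval for tracing
model_name = "bert-base-uncased"
tokenizer = BertTokenizer.from_pretrained(model_name)
model = BertModel.from_pretrained(model_name, torchscript=True)
model.eval()

# Build example inputs for tracing
encoded = tokenizer("Hello, world!", return_tensors="pt")
input_ids = encoded["input_ids"]
attention_mask = encoded["attention_mask"]

# Trace the model so TVM's PyTorch frontend can consume it
scripted_model = torch.jit.trace(model, (input_ids, attention_mask)).eval()

# Convert the traced model to a Relay module
shape_list = [
    ("input_ids", list(input_ids.shape)),
    ("attention_mask", list(attention_mask.shape)),
]
mod, params = relay.frontend.from_pytorch(scripted_model, shape_list)

# Compile for CUDA (matching the CUDA 12.4 environment above)
target = tvm.target.cuda()
with tvm.transform.PassContext(opt_level=3):
    lib = relay.build(mod, target=target, params=params)
```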
And here is the error message I am receiving:
Any help would be greatly appreciated. Thanks!
Triage
Please refer to the list of label tags here to find the relevant tags and add them below in a bullet format (example below).
I think that's the right tag, but I'm not sure.