Linked commit: Add some necessary module components used frequently in Stable Diffusion's UNet. Includes fixes to module attribute access from the LLM branch and workarounds for torch weight copying. Towards #57.
Currently, operators have different constructor signatures depending on the number of input tensors they take. For example:
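As a hedged sketch of the situation (these classes are illustrative stand-ins, not Hidet's actual operator definitions), constructors can take one tensor, two tensors, or a list:

```python
from typing import List

class Tensor:
    """Minimal stand-in for hidet's Tensor, for illustration only."""
    pass

# Hypothetical operator classes showing how constructor signatures
# vary with the number of input tensors.
class UnaryOp:
    def __init__(self, x: Tensor):              # one input tensor
        self.inputs: List[Tensor] = [x]

class BinaryOp:
    def __init__(self, x: Tensor, y: Tensor):   # two input tensors
        self.inputs: List[Tensor] = [x, y]

class ConcatOp:
    def __init__(self, tensors: List[Tensor]):  # arbitrarily many inputs
        self.inputs: List[Tensor] = list(tensors)
```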
This causes a problem in `hidet/python/hidet/graph/operator.py`, line 130 (at commit 80a35d6).
No matter which signature the constructor uses, the input tensors are flattened and stored in the `inputs` attribute of an `Operator` (with type `List[Tensor]`). When we need to reuse the saved inputs, it is hard to handle each signature separately. Maybe it would be better to use a unified signature, like `Op(List[Tensor])`?
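One way the proposed unified signature might look, as a rough sketch (again, `Tensor` and `AddOp` are hypothetical stand-ins, not Hidet's real classes): every operator's constructor takes a flat `List[Tensor]`, so saved inputs can be re-fed to any operator uniformly.

```python
from typing import List

class Tensor:
    """Minimal stand-in for hidet's Tensor, for illustration only."""
    pass

class Operator:
    # Unified signature: every operator receives a flat list of input
    # tensors, matching what is stored in the `inputs` attribute.
    def __init__(self, inputs: List[Tensor]):
        self.inputs: List[Tensor] = list(inputs)

class AddOp(Operator):
    def __init__(self, inputs: List[Tensor]):
        # Arity is checked inside the operator instead of being
        # encoded in the constructor signature.
        assert len(inputs) == 2, "AddOp expects exactly two inputs"
        super().__init__(inputs)

# Reusing saved inputs is now uniform: AddOp(op.inputs) works for any
# operator, with no per-signature unpacking.
op = AddOp([Tensor(), Tensor()])
rebuilt = AddOp(op.inputs)
```

The trade-off is that the signature no longer documents an operator's arity; that check moves into the constructor body.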