Closing, since there is already a solution for inferring shapes with ONNX models larger than 2 GB. For the remaining plugin node issue, could you ask in the onnxruntime repository? There are more ONNX Runtime experts there. Thanks!
Description
I have a UNet larger than 2 GB. When I use the TensorRT Diffusion demo to optimize the ONNX model, it produces an error when running opt.infer_shapes. Is there any solution for this case?
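The error is consistent with onnx.shape_inference.infer_shapes hitting protobuf's 2 GB serialization limit. A commonly used workaround is onnx.shape_inference.infer_shapes_path, which runs shape inference on files on disk instead of an in-memory proto. A minimal sketch, assuming the UNet was exported with its weights as external data and using placeholder file names:

```python
from onnx import shape_inference

# Placeholder paths: the exported UNet (with its external weight files next to it)
# and the destination for the shape-annotated model.
model_path = "unet.onnx"
inferred_path = "unet_inferred.onnx"

# Runs shape inference file-to-file, so the 2 GB protobuf limit on in-memory
# models does not apply.
shape_inference.infer_shapes_path(model_path, inferred_path)
```

The resulting model can then be passed on to the rest of the optimization pipeline in place of the output of infer_shapes.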
Relevant Files
https://github.com/NVIDIA/TensorRT/tree/release/8.5/demo/Diffusion