
How to edit onnx larger than 2GB with onnx_graphsurgeon? #2802

Closed

1049451037 opened this issue Mar 23, 2023 · 3 comments
Labels
Demo: Diffusion (Issues regarding demoDiffusion) · triaged (Issue has been triaged by maintainers)

Comments

1049451037 commented Mar 23, 2023

Description

I have a UNet larger than 2GB. When I use the TensorRT Diffusion demo to optimize the ONNX model, it produces an error when running opt.infer_shapes:

ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB: 2602836132

Is there any solution for this case?
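
For reference, the 2GB figure is the hard protobuf serialization limit of 2**31 - 1 bytes. A minimal sketch of the failing pattern, assuming a hypothetical >2GB model file at unet.onnx:

import onnx

# Hypothetical path to a UNet whose ModelProto exceeds 2GB in memory.
model = onnx.load("unet.onnx")
print(model.ByteSize())  # e.g. 2602836132, past the 2**31 - 1 byte limit

# infer_shapes serializes the full ModelProto, so it raises
# "ValueError: Message onnx.ModelProto exceeds maximum protobuf size of 2GB".
inferred = onnx.shape_inference.infer_shapes(model)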

Relevant Files

https://github.com/NVIDIA/TensorRT/tree/release/8.5/demo/Diffusion

1049451037 reopened this Mar 23, 2023
rajeevsrao added the Demo: Diffusion label Mar 23, 2023

rajeevsrao (Collaborator) commented

For ONNX models larger than 2GB, you can use the infer_shapes_path API to write the inferred model to a file instead. For example:

# 2147483648 bytes = 2GB, the protobuf serialization limit.
if onnx_graph.ByteSize() > 2147483648:
    # Too large to serialize in memory: run shape inference directly
    # on the file and write the inferred model back to disk.
    onnx.shape_inference.infer_shapes_path(self.onnx_path, self.onnx_opt_path)
    onnx_graph = onnx.load(self.onnx_opt_path)
else:
    onnx_graph = onnx.shape_inference.infer_shapes(onnx_graph)
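
To then edit the large model with onnx_graphsurgeon and save the result, the usual approach (a sketch, not from this thread; paths are hypothetical) is to export the edited graph and save the weights as external data, which keeps the ModelProto itself under the 2GB limit:

import onnx
import onnx_graphsurgeon as gs

# Hypothetical input path; adjust to your setup.
graph = gs.import_onnx(onnx.load("unet_inferred.onnx"))

# ... edit the graph here (insert/remove nodes, rename tensors, etc.) ...
graph.cleanup().toposort()

model = gs.export_onnx(graph)
# save_as_external_data moves the weights out of the ModelProto, so the
# proto itself stays under the 2GB protobuf limit.
onnx.save_model(
    model,
    "unet_edited.onnx",
    save_as_external_data=True,
    all_tensors_to_one_file=True,
    location="unet_edited_weights.bin",
)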

rajeevsrao added the triaged label Mar 23, 2023

1049451037 (Author) commented

Moreover, I have an ONNX model whose shapes cannot be inferred by onnxruntime (there is a plugin node in the ONNX file). Is there any workaround?
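
One workaround sometimes used for this situation (a sketch under assumptions, not confirmed in this thread): since shape inference cannot see through a plugin node, you can annotate that node's output shapes by hand with onnx_graphsurgeon so downstream tools have concrete shapes. The path, the op type "MyPlugin", and the shape below are all hypothetical:

import numpy as np
import onnx
import onnx_graphsurgeon as gs

# Hypothetical path and plugin op type; substitute your own.
graph = gs.import_onnx(onnx.load("model_with_plugin.onnx"))

for node in graph.nodes:
    if node.op == "MyPlugin":
        # Annotate the output tensor with the shape/dtype the plugin
        # is known to produce, since shape inference cannot derive it.
        node.outputs[0].shape = [1, 320, 64, 64]
        node.outputs[0].dtype = np.float32

onnx.save(gs.export_onnx(graph), "model_with_plugin_shapes.onnx")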

ttyio (Collaborator) commented Jun 6, 2023

Closing, since there is already a solution for inferring shapes on ONNX models larger than 2GB. For the remaining plugin-node issue, could you ask in the onnxruntime repository? There are more onnxruntime experts there. Thanks!
