trtexec failure of TensorRT 8.6.1.6 when running trtexec on GPU RTX4090 #3590
Comments
Your ONNX model is invalid; it failed with onnxruntime.
I used the torch.export function to export the ONNX model and trtexec to generate the engine. The code for Python inference is as follows.
Your new model passes with TRT 9.2.
I was able to run inference normally with TensorRT 8.6, but an error occurred while calling the Python API for inference. The details are in the third picture.
Hey, which object detection model are you using?
You can try TRT v10.
Environment
TensorRT Version: 8.6.1.6
NVIDIA GPU: RTX 4090
NVIDIA Driver Version: 535.129.03
CUDA Version: 11.8
CUDNN Version: 8.9.6.50
Operating System: ubuntu 22.04
Python Version (if applicable): 3.9
Tensorflow Version (if applicable):
PyTorch Version (if applicable): 1.13
Baremetal or Container (if so, version):
Relevant Files
Model link: https://www.dropbox.com/scl/fi/vzgb4iew1lvj64h6adnjt/model.onnx?rlkey=vqq56hc2t91r7b1m078ks7ycl&dl=0
Steps To Reproduce
I want to convert the ONNX model to a TensorRT engine using:

trtexec --onnx=model.onnx --saveEngine=model.trt --verbose

Then an error was reported: "/MaxPool: at least 5 dimensions are required for input."
But I don't know why! The MaxPool that reported the error is used here.
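One reading of this message (an assumption, not confirmed by the thread): ONNX `MaxPool` takes an input of shape (N, C, *spatial), so the input rank the node demands is fixed by the length of its `kernel_shape` attribute. "At least 5 dimensions" would mean TensorRT parsed the node as a 3-D pool (a 3-element `kernel_shape`, expecting NCDHW), while the exported tensor is presumably 4-D NCHW. A small sketch of that rank rule:

```python
# ONNX MaxPool consumes a tensor of shape (N, C, *spatial); the number
# of spatial dims equals len(kernel_shape) on the node.
def required_input_rank(kernel_shape):
    """Input rank a MaxPool node expects: batch + channel + spatial dims."""
    return 2 + len(kernel_shape)


# A 2-D pool (kernel_shape=[2, 2]) wants NCHW (rank 4); a 3-D pool
# (kernel_shape=[2, 2, 2]) wants NCDHW (rank 5) -- matching the
# "at least 5 dimensions are required" error.
print(required_input_rank([2, 2]))     # -> 4
print(required_input_rank([2, 2, 2]))  # -> 5
```

Inspecting the failing node's `kernel_shape` attribute in the exported model (e.g. with the `onnx` package) would confirm whether the export really produced a 3-D pooling op.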