Error using C# TensorRT EP built from source #8367
It seems that one of the binaries that should be loaded (and there are many of them) got corrupted. I see that you
+@chilo-ms for assistance
Running the tests produces errors. Part of the error output is as follows:

```
...
2: Test command: E:\onnxruntime1.81_TensorRT\build\Windows\Release\Release\onnx_test_runner.exe "E:/onnxruntime1.81_TensorRT/cmake/external/onnx/onnx/backend/test/data/pytorch-converted"
3: Test command: E:\onnxruntime1.81_TensorRT\build\Windows\Release\Release\onnx_test_runner.exe "E:/onnxruntime1.81_TensorRT/cmake/external/onnx/onnx/backend/test/data/pytorch-operator"
4: Test command: E:\onnxruntime1.81_TensorRT\build\Windows\Release\Release\onnxruntime_shared_lib_test.exe "--gtest_output=xml:E:\onnxruntime1.81_TensorRT\build\Windows\Release\Release\onnxruntime_shared_lib_test.exe.Release.results.xml"
5: Test command: E:\onnxruntime1.81_TensorRT\build\Windows\Release\Release\onnxruntime_global_thread_pools_test.exe "--gtest_output=xml:E:\onnxruntime1.81_TensorRT\build\Windows\Release\Release\onnxruntime_global_thread_pools_test.exe.Release.results.xml"
6: Test command: E:\onnxruntime1.81_TensorRT\build\Windows\Release\Release\onnxruntime_api_tests_without_env.exe "--gtest_output=xml:E:\onnxruntime1.81_TensorRT\build\Windows\Release\Release\onnxruntime_api_tests_without_env.exe.Release.results.xml"

83% tests passed, 1 tests failed out of 6

Total Test time (real) = 254.36 sec

The following tests FAILED:
```
Could you make sure that both the nuget package (DLLs) and your application were built for the same architecture? For example, both should be 64-bit or both 32-bit. I would suggest using 64-bit, as well as switching to VS2019, since we have tested under that environment.
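As a side note, one way to check which architecture a DLL was built for, without loading it, is to read the `Machine` field from its PE header. This is a hypothetical diagnostic sketch (not part of the thread's own tooling), using only the documented PE/COFF layout:

```python
import struct

# IMAGE_FILE_MACHINE constants from the PE/COFF specification.
MACHINE_NAMES = {0x014C: "x86 (32-bit)", 0x8664: "x64 (64-bit)", 0xAA64: "ARM64"}

def pe_architecture(path):
    """Return a human-readable architecture name for a PE file (DLL/EXE)."""
    with open(path, "rb") as f:
        header = f.read(0x40)
        # Offset 0x3C of the DOS header holds the file offset of the "PE\0\0" signature.
        (pe_offset,) = struct.unpack_from("<I", header, 0x3C)
        f.seek(pe_offset + 4)  # skip the 4-byte "PE\0\0" signature
        (machine,) = struct.unpack("<H", f.read(2))  # IMAGE_FILE_HEADER.Machine
    return MACHINE_NAMES.get(machine, hex(machine))
```

Running this on `onnxruntime.dll` from the nuget package and on the application's own EXE should report the same architecture; a `BadImageFormatException` like the one above typically means they differ.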
Thank you for your suggestion. I set the architecture of my application to 64-bit and the above error disappeared, but another error occurred:

```
2021-07-14 16:57:28.0429351 [E:onnxruntime:CSharpOnnxRuntime, tensorrt_execution_provider.h:51 onnxruntime::TensorrtLogger::log] [2021-07-14 08:57:28 ERROR] INVALID_ARGUMENT: getPluginCreator could not find plugin ScatterND version 1
Unhandled Exception: System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
```
From this discussion, it seems TensorRT doesn't support ScatterND yet. |
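Before handing a model to the TensorRT EP, it can help to scan its node list for ops the importer is known to reject. The sketch below assumes the op types have already been extracted (in practice you would read them from `model.graph.node` via the `onnx` package); the unsupported set is illustrative for the TensorRT version in this thread, not an authoritative support matrix:

```python
# Illustrative set of ops the TensorRT 7.2-era ONNX importer could not handle;
# ScatterND is the one reported in the log above.
TRT_UNSUPPORTED = {"ScatterND"}

def find_unsupported(op_types, unsupported=TRT_UNSUPPORTED):
    """Return the sorted list of ops in a model's node list that TensorRT cannot import."""
    return sorted(set(op_types) & unsupported)

# Hypothetical usage with op types pulled from an exported YOLOv5 graph:
# find_unsupported(["Conv", "Sigmoid", "ScatterND", "Concat"]) -> ["ScatterND"]
```

Any op reported here would need to be replaced during export, handled by a TensorRT plugin, or left to fall back to another execution provider.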
Thanks. I use this model exported from yolov5s:

UPDATE:

```
2021-07-15 10:30:19.4742416 [E:onnxruntime:CSharpOnnxRuntime, tensorrt_execution_provider.h:51 onnxruntime::TensorrtLogger::log] [2021-07-15 02:30:19 ERROR] E:\onnxruntime1.81_TensorRT\cmake\external\onnx-tensorrt\onnx2trt_utils.cpp:475: Found unsupported datatype (11) when importing initializer: 1029
Unhandled Exception: System.AccessViolationException: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
```
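For reference, the datatype code in that message comes from the ONNX `TensorProto.DataType` enum, where code 11 is DOUBLE; the log shows the onnx-tensorrt importer rejecting a double-precision initializer. A small lookup sketch for decoding such codes:

```python
# ONNX TensorProto.DataType codes, as defined in onnx.proto.
ONNX_DTYPES = {
    1: "FLOAT", 2: "UINT8", 3: "INT8", 4: "UINT16", 5: "INT16",
    6: "INT32", 7: "INT64", 8: "STRING", 9: "BOOL", 10: "FLOAT16",
    11: "DOUBLE", 12: "UINT32", 13: "UINT64", 14: "COMPLEX64",
    15: "COMPLEX128", 16: "BFLOAT16",
}

def dtype_name(code):
    """Map a TensorProto.DataType code to its name."""
    return ONNX_DTYPES.get(code, f"UNKNOWN({code})")

# dtype_name(11) -> "DOUBLE"
```

A common remedy for this class of error is re-exporting the model with float32 weights so no DOUBLE initializers reach the importer.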
" CUDA error cudaErrorInvalidDeviceFunction:invalid device function" These failures are not a problem of TensorRT. They were from our onnx runtime cuda execution provider. |
RTX 3090 has a CUDA compute capability of 8.6. That number is not in our CMakeLists.txt, which I suspect is why you saw these "invalid device function" errors, though I don't understand why CUDA didn't fall back to JIT compilation. You may add `--cmake_extra_defines CMAKE_CUDA_ARCHITECTURES=86` to the arguments of build.bat.
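The conversion from a compute capability to the CMake value is just dropping the dot, as the comment above does for 8.6 → 86. A trivial helper sketch (the function name is hypothetical):

```python
def cmake_cuda_arch(compute_capability):
    """Convert a CUDA compute capability string like '8.6' to the
    CMAKE_CUDA_ARCHITECTURES form '86'."""
    major, minor = compute_capability.split(".")
    return major + minor

# RTX 3090 has compute capability 8.6, so the build flag becomes:
flag = f"--cmake_extra_defines CMAKE_CUDA_ARCHITECTURES={cmake_cuda_arch('8.6')}"
```

Passing this through build.bat ensures real SASS code for the GPU is compiled ahead of time instead of relying on JIT compilation of PTX.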
I built onnxruntime with TensorRT from the master branch using the following command:

```
.\build.bat --config Release --build_nuget --parallel --build_shared_lib --cudnn_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.0" --cuda_home "C:\Program Files\NVIDIA GPU Computing Toolkit\CUDA\v11.0" --use_tensorrt --tensorrt_home "I:\python-tensorflow-pytorch安装包\TensorRT-7.2.2.3" --cuda_version 11.0 --cmake_generator "Visual Studio 16 2019" --skip_tests
```
This generated the following two nupkg packages:
Microsoft.ML.OnnxRuntime.Managed.1.8.0-dev-20210711-0335-b7c9696ac.nupkg
Microsoft.ML.OnnxRuntime.TensorRT.1.8.0-dev-20210711-0335-b7c9696ac.nupkg
I installed these two nupkg packages in Visual Studio and ran the following code:

The following error occurred:
```
Unhandled Exception: System.TypeInitializationException: The type initializer for 'Microsoft.ML.OnnxRuntime.NativeMethods' threw an exception. ---> System.BadImageFormatException: An attempt was made to load a program with an incorrect format. (Exception from HRESULT: 0x8007000B)
   at Microsoft.ML.OnnxRuntime.NativeMethods.OrtGetApiBase()
   at Microsoft.ML.OnnxRuntime.NativeMethods..cctor()
   --- End of inner exception stack trace ---
   at Microsoft.ML.OnnxRuntime.SessionOptions..ctor()
   at Microsoft.ML.OnnxRuntime.SessionOptions.MakeSessionOptionWithTensorrtProvider(Int32 deviceId)
   at ConsoleApp1.Program.Main(String[] args) in D:\DNN_GPU_cuda\ConsoleApp1\ConsoleApp1\Program.cs:line 28
```
System information