Tests fail with onnxruntime 1.12.1 built from source. It looks like `InferenceSession` needs to be instantiated with an explicit `providers` argument, which ORT has required since 1.9 for builds that enable more than one execution provider.
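A minimal sketch of the fix for the `_ort_inference` helpers used by the tests (the helper name and the `SerializeToString()` call are taken from the tracebacks below; the `sess.run(None, inputs)` body is an assumption about what the helper returns, and `get_available_providers()` is one way to satisfy the requirement without hard-coding a provider list):

```python
import onnxruntime as _ort

def _ort_inference(mdl, inputs):
    # Since ORT 1.9, builds with more than one execution provider require
    # an explicit providers argument; use whatever this build exposes.
    sess = _ort.InferenceSession(mdl.SerializeToString(),
                                 providers=_ort.get_available_providers())
    return sess.run(None, inputs)
```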
======================================================================
ERROR: test_auto_mixed_precision (test_auto_mixed_precision.AutoFloat16Test)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/build/source/tests/test_auto_mixed_precision.py", line 43, in test_auto_mixed_precision
expected = transpose_n_matmul(m1)
File "/build/source/onnxconverter_common/onnx_fx.py", line 274, in __call__
return Graph.inference_runtime(self.oxml, kwargs)
File "/build/source/tests/test_onnxfx.py", line 12, in _ort_inference
sess = _ort.InferenceSession(mdl.SerializeToString())
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)
======================================================================
ERROR: test_auto_mixed_precision_model_path (test_auto_mixed_precision_model_path.AutoFloat16Test)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/build/source/tests/test_auto_mixed_precision_model_path.py", line 32, in test_auto_mixed_precision_model_path
expected = _ort_inference(model32_path, {'modelInput': input_x})
File "/build/source/tests/test_auto_mixed_precision_model_path.py", line 13, in _ort_inference
sess = _ort.InferenceSession(model_path)
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)
======================================================================
ERROR: test_float16 (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: test_float16
Traceback (most recent call last):
File "/nix/store/sz0j8k8ljh7y8qgyfxgqb3ws11bcy4gs-python3-3.10.6/lib/python3.10/unittest/loader.py", line 436, in _find_test_path
module = self._get_module_from_name(name)
File "/nix/store/sz0j8k8ljh7y8qgyfxgqb3ws11bcy4gs-python3-3.10.6/lib/python3.10/unittest/loader.py", line 377, in _get_module_from_name
__import__(name)
File "/build/source/tests/test_float16.py", line 7, in <module>
import onnxmltools
ModuleNotFoundError: No module named 'onnxmltools'
======================================================================
ERROR: test_onnx2py (test_onnx2py.Onnx2PyTests)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/build/source/tests/test_onnx2py.py", line 27, in test_onnx2py
sess1 = _ort.InferenceSession(onnx_model.SerializeToString())
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)
======================================================================
ERROR: test_onnx2py (test_onnx2py.Onnx2PyTests)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/build/source/tests/test_onnx2py.py", line 15, in tearDown
for f in os.listdir(tmp_path):
FileNotFoundError: [Errno 2] No such file or directory: '/build/source/tests/temp'
======================================================================
ERROR: test_core (test_onnxfx.ONNXFunctionTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/build/source/tests/test_onnxfx.py", line 34, in test_core
np.allclose(g([2.0], [-5.0]), np.array([2.0])))
File "/build/source/onnxconverter_common/onnx_fx.py", line 274, in __call__
return Graph.inference_runtime(self.oxml, kwargs)
File "/build/source/tests/test_onnxfx.py", line 12, in _ort_inference
sess = _ort.InferenceSession(mdl.SerializeToString())
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)
======================================================================
ERROR: test_loop (test_onnxfx.ONNXFunctionTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/build/source/tests/test_onnxfx.py", line 58, in test_loop
loop_test(np.array([16], dtype=np.int64))[2][4], 3.0)
File "/build/source/onnxconverter_common/onnx_fx.py", line 274, in __call__
return Graph.inference_runtime(self.oxml, kwargs)
File "/build/source/tests/test_onnxfx.py", line 12, in _ort_inference
sess = _ort.InferenceSession(mdl.SerializeToString())
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)
======================================================================
ERROR: test_matmul_opt (test_onnxfx.ONNXFunctionTest)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/build/source/tests/test_onnxfx.py", line 75, in test_matmul_opt
expected = transpose_n_matmul(m1)
File "/build/source/onnxconverter_common/onnx_fx.py", line 274, in __call__
return Graph.inference_runtime(self.oxml, kwargs)
File "/build/source/tests/test_onnxfx.py", line 12, in _ort_inference
sess = _ort.InferenceSession(mdl.SerializeToString())
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)
----------------------------------------------------------------------
Ran 27 tests in 1.060s
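Two of the failures above are not about providers: `test_float16` only fails because `onnxmltools` is not available in the test environment, and the second `test_onnx2py` error comes from `tearDown` assuming `tests/temp` exists. A defensive sketch for the latter (how `tmp_path` is computed is an assumption; only the `os.listdir(tmp_path)` line appears in the traceback):

```python
import os

def tearDown(self):
    # Method of the Onnx2PyTests test case. The temp directory may never be
    # created when a test fails early, so guard the cleanup instead of
    # letting os.listdir raise FileNotFoundError.
    tmp_path = os.path.join(os.path.dirname(__file__), 'temp')
    if os.path.isdir(tmp_path):
        for f in os.listdir(tmp_path):
            os.remove(os.path.join(tmp_path, f))
```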
If needed, I can provide the exact set of packages and build parameters, since this was built with Nix.