Failing tests #242

Closed
FRidh opened this issue Sep 27, 2022 · 3 comments

FRidh commented Sep 27, 2022

Tests fail with onnxruntime 1.12.1 built from source. It seems InferenceSession needs to be instantiated with an explicit providers argument; a sketch of the likely fix is below, followed by the logs.
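
A minimal sketch (not an actual patch) of how a test helper such as _ort_inference in tests/test_onnxfx.py could pass the providers explicitly; the sess.run call and the CPU-only provider list are assumptions:

import onnxruntime as _ort

def _ort_inference(mdl, inputs):
    # Assumption: since ORT 1.9 the providers must be passed explicitly;
    # CPUExecutionProvider should be available in any onnxruntime build.
    sess = _ort.InferenceSession(mdl.SerializeToString(),
                                 providers=['CPUExecutionProvider'])
    return sess.run(None, inputs)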

======================================================================
ERROR: test_auto_mixed_precision (test_auto_mixed_precision.AutoFloat16Test)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/build/source/tests/test_auto_mixed_precision.py", line 43, in test_auto_mixed_precision
    expected = transpose_n_matmul(m1)
  File "/build/source/onnxconverter_common/onnx_fx.py", line 274, in __call__
    return Graph.inference_runtime(self.oxml, kwargs)
  File "/build/source/tests/test_onnxfx.py", line 12, in _ort_inference
    sess = _ort.InferenceSession(mdl.SerializeToString())
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
    raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)

======================================================================
ERROR: test_auto_mixed_precision_model_path (test_auto_mixed_precision_model_path.AutoFloat16Test)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/build/source/tests/test_auto_mixed_precision_model_path.py", line 32, in test_auto_mixed_precision_model_path
    expected = _ort_inference(model32_path, {'modelInput': input_x})
  File "/build/source/tests/test_auto_mixed_precision_model_path.py", line 13, in _ort_inference
    sess = _ort.InferenceSession(model_path)
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
    raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)

======================================================================
ERROR: test_float16 (unittest.loader._FailedTest)
----------------------------------------------------------------------
ImportError: Failed to import test module: test_float16
Traceback (most recent call last):
  File "/nix/store/sz0j8k8ljh7y8qgyfxgqb3ws11bcy4gs-python3-3.10.6/lib/python3.10/unittest/loader.py", line 436, in _find_test_path
    module = self._get_module_from_name(name)
  File "/nix/store/sz0j8k8ljh7y8qgyfxgqb3ws11bcy4gs-python3-3.10.6/lib/python3.10/unittest/loader.py", line 377, in _get_module_from_name
    __import__(name)
  File "/build/source/tests/test_float16.py", line 7, in <module>
    import onnxmltools
ModuleNotFoundError: No module named 'onnxmltools'


======================================================================
ERROR: test_onnx2py (test_onnx2py.Onnx2PyTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/build/source/tests/test_onnx2py.py", line 27, in test_onnx2py
    sess1 = _ort.InferenceSession(onnx_model.SerializeToString())
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
    raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)

======================================================================
ERROR: test_onnx2py (test_onnx2py.Onnx2PyTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/build/source/tests/test_onnx2py.py", line 15, in tearDown
    for f in os.listdir(tmp_path):
FileNotFoundError: [Errno 2] No such file or directory: '/build/source/tests/temp'

======================================================================
ERROR: test_core (test_onnxfx.ONNXFunctionTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/build/source/tests/test_onnxfx.py", line 34, in test_core
    np.allclose(g([2.0], [-5.0]), np.array([2.0])))
  File "/build/source/onnxconverter_common/onnx_fx.py", line 274, in __call__
    return Graph.inference_runtime(self.oxml, kwargs)
  File "/build/source/tests/test_onnxfx.py", line 12, in _ort_inference
    sess = _ort.InferenceSession(mdl.SerializeToString())
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
    raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)

======================================================================
ERROR: test_loop (test_onnxfx.ONNXFunctionTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/build/source/tests/test_onnxfx.py", line 58, in test_loop
    loop_test(np.array([16], dtype=np.int64))[2][4], 3.0)
  File "/build/source/onnxconverter_common/onnx_fx.py", line 274, in __call__
    return Graph.inference_runtime(self.oxml, kwargs)
  File "/build/source/tests/test_onnxfx.py", line 12, in _ort_inference
    sess = _ort.InferenceSession(mdl.SerializeToString())
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
    raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)

======================================================================
ERROR: test_matmul_opt (test_onnxfx.ONNXFunctionTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/build/source/tests/test_onnxfx.py", line 75, in test_matmul_opt
    expected = transpose_n_matmul(m1)
  File "/build/source/onnxconverter_common/onnx_fx.py", line 274, in __call__
    return Graph.inference_runtime(self.oxml, kwargs)
  File "/build/source/tests/test_onnxfx.py", line 12, in _ort_inference
    sess = _ort.InferenceSession(mdl.SerializeToString())
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/nix/store/vy5fnk87f62mzqaaw3l208psi7lvpfns-onnxruntime-1.12.1-python/lib/python3.10/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 375, in _create_inference_session
    raise ValueError(
ValueError: This ORT build has ['DnnlExecutionProvider', 'CPUExecutionProvider'] enabled. Since ORT 1.9, you are required to explicitly set the providers parameter when instantiating InferenceSession. For example, onnxruntime.InferenceSession(..., providers=['DnnlExecutionProvider', 'CPUExecutionProvider'], ...)

----------------------------------------------------------------------
Ran 27 tests in 1.060s

If needed, I can provide the exact set of packages and build parameters, as this was built using Nix.

xiaowuhu (Collaborator) commented Nov 3, 2022

@FRidh Hi, does this issue still bother you? We don't have official testing on Nix.

xiaowuhu (Collaborator) commented Nov 4, 2022

If this still bothers you, you can reopen it.

xiaowuhu closed this as completed Nov 4, 2022

tobim commented Oct 17, 2023

This is still an issue.
