
OnnxRuntime csharp Inference example #462

Closed
Seth-Julien-de-Lampon opened this issue Feb 11, 2019 · 3 comments
Seth-Julien-de-Lampon commented Feb 11, 2019

Describe the bug

Following the steps of the C# inference example, I'm trying to run the ArcFace model.
The same approach works with the Emotion FerPlus model, but the ArcFace model fails with the following error and I can't figure out why:

' [ErrorCode:Fail] c:\agent_work\6\s\onnxruntime\core\providers\cpu\math\element_wise_ops.h:353 onnxruntime::BroadcastIterator::Init axis == 1 || axis == largest was false. Attempting to broadcast an axis by a dimension other than 1. 64 by 112 '

This error happens when I run an InferenceSession. On line 154 of InferenceSession.cs the returned status is non-zero (64, according to the error message):
IntPtr status = NativeMethods.OrtRun(
                    this._nativeHandle,
                    IntPtr.Zero,  // TODO: use Run options when Run options creation API is available
                                  // Passing null uses the default run options in the C-api
                    inputNames,
                    inputTensors,
                    (ulong)(inputTensors.Length),  /* TODO: size_t, make it portable for x86 and arm */
                    outputNamesArray,
                    (ulong)outputNames.Count,      /* TODO: size_t, make it portable for x86 and arm */
                    outputValueArray               /* An array of output value pointers. Array must be allocated by the caller */
                    );

I'm using a tensor of size [1, 3, 112, 112], with ONNX version 1.3 and opset version 8.
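For reference, this is roughly how the session is created and run from the managed API before it reaches the native call above. It's a minimal sketch rather than my exact code: the model file name and the input name "data" are assumptions (the real input name should be read from session.InputMetadata), and the tensor is filled with zeros as a placeholder for the preprocessed image.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Numerics.Tensors;
using Microsoft.ML.OnnxRuntime;

class ArcFaceRepro
{
    static void Main()
    {
        // Hypothetical path to the ArcFace model downloaded from the ONNX model zoo.
        using (var session = new InferenceSession("arcface.onnx"))
        {
            // NCHW float input: batch 1, 3 channels, 112x112 pixels (zeros as a placeholder).
            var input = new DenseTensor<float>(new[] { 1, 3, 112, 112 });

            var inputs = new List<NamedOnnxValue>
            {
                // "data" is assumed; the actual name comes from session.InputMetadata.
                NamedOnnxValue.CreateFromTensor("data", input)
            };

            // This call goes through NativeMethods.OrtRun and throws the broadcast error above.
            var results = session.Run(inputs);
            var embedding = results.First().AsTensor<float>();
            Console.WriteLine($"Embedding length: {embedding.Length}");
        }
    }
}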

System information

  • Windows
  • Microsoft.ML.OnnxRuntime version 0.2.1

snnn commented Feb 11, 2019

Where did you get the model?

@Seth-Julien-de-Lampon (Author)


snnn commented Feb 11, 2019

It's a known problem.

onnx/models#91

Closing this because it's an ONNX model zoo problem, not a runtime bug.

@snnn snnn closed this as completed Feb 11, 2019