I have converted a Keras model to ONNX using onnxmltools in Python. The summary of the model is below:
Model: "sequential"
Layer (type) Output Shape Param #
dense (Dense) (None, 32) 480
dense_1 (Dense) (None, 64) 2112
dense_2 (Dense) (None, 128) 8320
dense_3 (Dense) (None, 64) 8256
dropout (Dropout) (None, 64) 0
dense_4 (Dense) (None, 5) 325
Total params: 19,493
Trainable params: 19,493
Non-trainable params: 0
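For context, the Keras-to-ONNX step looks roughly like this. This is a minimal sketch: `keras_model` stands in for the trained Sequential model above (the model-building code is not shown in the issue), and `target_opset=13` is an assumption.

```python
# Minimal sketch of the Keras -> ONNX conversion with onnxmltools.
# `keras_model` is the trained Sequential model summarized above (assumed);
# target_opset=13 is an assumption, not stated in the issue.
import onnxmltools

onnx_model = onnxmltools.convert_keras(keras_model, target_opset=13)
onnxmltools.utils.save_model(onnx_model, "model.onnx")
```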
The converted model works fine when used in Python. In order to use the ONNX model in React Native, I converted it to ORT format with:
python -m onnxruntime.tools.convert_onnx_models_to_ort model.onnx
The converted ORT model works perfectly fine in Python, but when I use it in React Native, it throws an exception while loading the model. The exception is:
[Error: Can't load a model: Can't create InferenceSession]
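For what it's worth, a quick way to confirm the .ort file itself is valid is to load and run it in Python. This is a sketch: the input size of 14 is inferred from the first Dense layer's parameter count (32 × 14 + 32 = 480), not stated in the issue.

```python
# Sanity check: load and run the converted .ort model in Python.
import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession("model.ort", providers=["CPUExecutionProvider"])
x = np.random.rand(1, 14).astype(np.float32)  # input dim 14 inferred from the summary above
outputs = sess.run(None, {sess.get_inputs()[0].name: x})
print(outputs[0].shape)  # expected: (1, 5)
```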
Shahzaib-MST changed the title from "Can't load a model: Can't create InferenceSession" to "React Native: Can't load a model: Can't create InferenceSession" on Jan 17, 2022.
The ONNX Runtime React Native package supports a limited set of operators, listed here. Can you check whether all kernels in your model are supported and whether the opset version is either 12 or 13?
If your model is not using opset 12/13, you can either export an ONNX model with opset 12/13 or convert the opset version using this script. Then, convert to ORT again.
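For illustration, the operators and opset version can be inspected, and the opset bumped, with the onnx package. This is a sketch of that check and may not match exactly what the linked script does:

```python
# Inspect the ops and opset of the exported model, then re-target to opset 13.
import onnx
from onnx import version_converter

model = onnx.load("model.onnx")
print({node.op_type for node in model.graph.node})         # ops that must appear in the RN kernel list
print([(op.domain, op.version) for op in model.opset_import])  # current opset(s)

converted = version_converter.convert_version(model, 13)   # bump to opset 13
onnx.save(converted, "model_opset13.onnx")
```

After that, rerun convert_onnx_models_to_ort on the new file before bundling it into the React Native app.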
This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.
stalebot added the stale label (issues that have not been addressed in a while; categorized by a bot) on Apr 16, 2022.