React Native: Can't load a model: Can't create InferenceSession #10302

Open
Shahzaib-MST opened this issue Jan 17, 2022 · 3 comments
Labels: platform:web (issues related to ONNX Runtime web; typically submitted using template)

@Shahzaib-MST

I converted a Keras model to ONNX using onnxmltools in Python. The model summary is below:
```
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
dense (Dense)                (None, 32)                480
dense_1 (Dense)              (None, 64)                2112
dense_2 (Dense)              (None, 128)               8320
dense_3 (Dense)              (None, 64)                8256
dropout (Dropout)            (None, 64)                0
dense_4 (Dense)              (None, 5)                 325
=================================================================
Total params: 19,493
Trainable params: 19,493
Non-trainable params: 0
```
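
For reference, a minimal conversion sketch for a model shaped like this summary. The layer sizes come from the reported summary, and the 14-feature input is implied by the first Dense layer's 480 parameters (14 × 32 weights + 32 biases); the activations and dropout rate are assumptions:

```python
import tensorflow as tf
import onnxmltools

# Rebuild of the reported architecture; activations and dropout rate are guesses.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation="relu", input_shape=(14,)),  # 480 params
    tf.keras.layers.Dense(64, activation="relu"),                     # 2,112 params
    tf.keras.layers.Dense(128, activation="relu"),                    # 8,320 params
    tf.keras.layers.Dense(64, activation="relu"),                     # 8,256 params
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(5, activation="softmax"),                   # 325 params
])

# target_opset=13 keeps the export inside the opset range the
# React Native package supports (see the maintainer's reply below).
onnx_model = onnxmltools.convert_keras(model, target_opset=13)
onnxmltools.utils.save_model(onnx_model, "model.onnx")
```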

The converted model works fine in Python. To use the ONNX model in React Native, I converted it to ORT format with:

```
python -m onnxruntime.tools.convert_onnx_models_to_ort model.onnx
```

The converted ORT model also works perfectly fine in Python, but when I use it in React Native it throws an exception while loading the model:

```
[Error: Can't load a model: Can't create InferenceSession]
```

@Shahzaib-MST changed the title from "Can't load a model: Can't create InferenceSession" to "React Native: Can't load a model: Can't create InferenceSession" on Jan 17, 2022
@hanbitmyths (Contributor)

The ONNX Runtime React Native package supports a limited set of operators, listed here. Can you check that all kernels in your model are supported and that the opset version is either 12 or 13?

If your model is not using opset 12/13, you can either export an ONNX model with opset 12/13, or convert the opset version using this script and then convert to ORT.
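
A minimal sketch of that flow, assuming the filenames are placeholders; `onnx.version_converter` is the standard onnx utility for migrating a model between opsets:

```python
import onnx
from onnx import version_converter

model = onnx.load("model.onnx")
# Inspect the opsets the model currently imports.
print([(o.domain, o.version) for o in model.opset_import])

# Migrate the default-domain opset to 13, one of the supported versions.
converted = version_converter.convert_version(model, 13)
onnx.save(converted, "model_opset13.onnx")
```

After that, re-running `python -m onnxruntime.tools.convert_onnx_models_to_ort model_opset13.onnx` regenerates the ORT-format file.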

@hanbitmyths self-assigned this on Jan 18, 2022
@Shahzaib-MST (Author)

Hi,
Yes, I have checked that all of the operators required by my model appear in the supported-operators list.
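
A minimal sketch of one way to do that check, enumerating the distinct operators the model uses for cross-checking against the supported-operators list (`model.onnx` is a placeholder filename):

```python
import onnx

model = onnx.load("model.onnx")
# Collect the distinct op types in the graph; a stack of Dense layers
# typically lowers to ops such as Gemm or MatMul, Add, and Relu.
ops = sorted({node.op_type for node in model.graph.node})
print(ops)
```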

@stale

stale bot commented Apr 16, 2022

This issue has been automatically marked as stale due to inactivity and will be closed in 7 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

The stale bot added the "stale" label (issues that have not been addressed in a while; categorized by a bot) on Apr 16, 2022
@sophies927 added the "platform:web" label (issues related to ONNX Runtime web; typically submitted using template) and removed the "component:ort-web" label on Aug 12, 2022
The stale bot removed the "stale" label on Aug 12, 2022