
Some of the output is None after converting onnx model through quantize_static to int8 quanted onnx model. #14501

Closed
vonJJ opened this issue Jan 31, 2023 · 2 comments
Labels
quantization issues related to quantization

Comments

vonJJ commented Jan 31, 2023

Describe the issue

The original model was trained with TensorFlow 2.6.0, and I want to compare the run time of different formats of the same model: the original TF model (m1), the ONNX model (m2), and the int8 statically quantized ONNX model (m3).
I first converted the TF model to ONNX (m2); the whole process was smooth and m2 performs well.
Then I converted m2 to m3 using quantize_static from onnxruntime.quantization. No errors occurred during conversion, but when I ran inference on m3 with session.run(), the first 2 of the 7 output vectors were None. (The output names and shapes are still correct; only the values are None.)
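For context, the conversion step described above typically looks like the sketch below. This is a minimal illustration, not the exact script used in this issue: the input names, shapes, file names, and the random calibration data are assumptions. `quantize_static` expects a calibration data reader exposing `get_next()`, which yields input-feed dicts and returns `None` when calibration data is exhausted (in practice it should subclass `onnxruntime.quantization.CalibrationDataReader`).

```python
import numpy as np

class RandomCalibrationReader:
    """Minimal calibration reader: yields random float32 input feeds.

    quantize_static only calls get_next(); in real use, subclass
    onnxruntime.quantization.CalibrationDataReader and feed representative
    data, since random calibration gives poor quantization ranges.
    """
    def __init__(self, input_shapes, num_samples=8):
        self.input_shapes = input_shapes
        self.remaining = num_samples

    def get_next(self):
        if self.remaining == 0:
            return None  # signals the end of calibration data
        self.remaining -= 1
        return {name: np.random.rand(*shape).astype(np.float32)
                for name, shape in self.input_shapes.items()}

def convert(model_in="model.onnx", model_out="static_quant.onnx"):
    # Imported here so the reader above stays usable without onnxruntime.
    from onnxruntime.quantization import quantize_static, QuantType
    # Input names and shapes are assumptions matching this issue's model.
    reader = RandomCalibrationReader({
        "input_1": (1, 26, 26, 17),
        "input_2": (1, 31, 31, 27),
        "input_3": (1, 136),
    })
    quantize_static(model_in, model_out, reader,
                    activation_type=QuantType.QInt8,
                    weight_type=QuantType.QInt8)
```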

To reproduce

import onnxruntime as ort
import numpy as np

model_path = 'static_quant.onnx'

ort_session = ort.InferenceSession(model_path)
input_feature_name_1 = ort_session.get_inputs()[0].name
input_feature_name_2 = ort_session.get_inputs()[1].name
input_feature_name_3 = ort_session.get_inputs()[2].name

# Random inputs matching the model's three input shapes
feature1 = np.random.rand(1, 26, 26, 17).astype(np.float32)
feature2 = np.random.rand(1, 31, 31, 27).astype(np.float32)
feature3 = np.random.rand(1, 136).astype(np.float32)
feed_dict = {
    input_feature_name_1: feature1,
    input_feature_name_2: feature2,
    input_feature_name_3: feature3,
}

# Passing [] as the first argument requests all model outputs
result_prob = ort_session.run([], input_feed=feed_dict)
print(result_prob)

output:
[None, None, array([[ 23.827963, -643.355 , -2478.1082 , -3455.0547 ,
-2478.1082 , -2025.3768 , 833.9787 , -1572.6455 ]],
dtype=float32), array([[ -53.13701 , -39.852757, -46.49488 , -59.779137,
-79.70551 , -86.34764 , -79.70551 , -13.284252,
-26.568504, -39.852757, -33.210632, -33.210632,
-59.779137, -66.421265, -73.063385, -66.421265,
-92.98976 , -92.98976 , -92.98976 , -112.916145,
-106.27402 , -92.98976 , -112.916145, -99.63189 ,
-1554.2574 ]], dtype=float32), array([[-1600.4183, -2796.993 , -2557.678 , -2991.4363, -2497.849 ,
-2557.678 , -2497.849 , -2482.892 , -2497.849 , -2363.2346,
-2423.0632, -2393.149 , -2423.0632, -2378.192 , -2408.1062,
-2258.5344, -2079.048 , -2138.877 , -2079.048 , -2019.2195,
-2019.2195, -2049.1338, -2019.2195, -2079.048 , -2183.7485,
-2153.834 , -2228.6199, -2452.9778, -2168.7913, -2408.1062,
-2273.4915, -2452.9778, -2438.0205, -2737.164 , -2632.4639,
-2707.2498, -2841.8645, -2886.7358, -3021.3506, -3006.3933,
-3051.265 , -3081.1792, -2991.4363, -3111.0938, -3111.0938,
-3021.3506, -3036.3076, -2976.479 , -3141.008 , -3081.1792,
-2991.4363, -3006.3933, -2886.7358, -2886.7358, -2841.8645,
-2931.6074, -2901.693 , -3245.7083, -2796.993 , -2886.7358,
-2841.8645, -2931.6074, -2662.3782, -3006.3933, -2692.2925,
-2722.207 , -2722.207 , -2707.2498, -2707.2498, -2752.1213,
-2796.993 , -2767.0784, -2677.3354, -2647.4211, -2647.4211,
-2632.4639, -2542.7207, -2423.0632, -2348.2773, -2303.4058,
-2318.363 , -2408.1062, -2288.4487, -2363.2346, -2423.0632,
-2482.892 , -2348.2773, -2662.3782, -2423.0632, -2767.0784,
-2662.3782, -2647.4211, -2512.8064, -2931.6074, -2602.5496,
-2467.9348, -2602.5496, -2572.635 , -2602.5496, -2587.5923,
-2677.3354, -2632.4639, -2617.5066, -2647.4211, -2512.8064,
-2527.7637, -2542.7207, -2467.9348, -2497.849 , -2617.5066,
-2632.4639, -2692.2925, -2632.4639, -2677.3354, -2826.9072,
-2886.7358, -2841.8645, -3036.3076, -2617.5066, -2737.164 ]],
dtype=float32), array([[ -36489.945, -24326.629, -81737.48 , -80277.875, -51085.92 ,
-45734.062, -44761. , -68114.56 , -108496.766, -64708.836,
-86602.805, -22867.031, -97306.516, -47680.195, -42814.867,
-46707.13 , -49139.793, -41841.805, -35516.88 , -36976.477,
-91468.125, -70060.695, -87089.336, -49626.324, -44761. ,
-63735.77 , -68601.09 , -69574.164, -61303.105, -61303.105,
-32597.684, -32597.684, -102658.375, -77845.22 , -41355.27 ,
-66654.97 , -69087.625, -60330.043, -63735.77 , -67141.5 ,
-68601.09 , -54005.117, -29678.488, -60330.043, -47193.66 ,
-42328.336, -65195.367, -67141.5 , -60816.574, -54978.184,
-59356.977, -63735.77 , -62762.703, -34543.812, -39895.67 ,
-39895.67 , -44761. , -60330.043, -58383.91 , -53518.586,
-48166.727, -52545.52 , -58870.445, -68601.09 , -43787.934,
-38436.074, -43301.402, -38436.074, -59356.977, -62762.703,
-56437.78 , -54491.65 , -63735.77 , -57410.848, -67628.03 ,
-42328.336, -48166.727, -59843.508, -31624.62 , -54005.117,
-59356.977, -63735.77 , -63735.77 , -68601.09 , -61789.64 ,
-64708.836, -39895.67 , -75899.086, -97793.055, -34543.812,
-31624.62 , -52058.99 , -55464.715, -61789.64 , -69574.164,
-66654.97 , -43787.934, -47680.195, -85143.2 , -46220.598,
-91468.125, -36003.41 , -35516.88 , -40868.74 , -46707.13 ,
-45734.062, -43787.934, -49626.324, -99739.18 , -23840.098,
-102658.375, -59356.977, -104604.51 , -71520.29 , -46220.598,
-43787.934, -51085.92 , -81250.945, -91954.66 , -25299.695,
-36489.945]], dtype=float32), array([[-3733.8992, -3802.2021, -3733.8992, -4212.0205, -4052.6465,
-4029.879 , -3551.7576, -3574.5254, -3893.2727, -3665.596 ,
-3756.6667, -3711.1313, -3733.8992, -3620.0608, -4348.6265,
-3756.6667, -3506.2224, -3324.0808, -3369.6162, -3028.101 ,
-3597.293 , -3756.6667, -3711.1313, -3711.1313, -4986.121 ,
-3893.2727, -3164.707 , -3415.1516, -4098.182 , -4029.879 ,
-3756.6667, -2914.2627, -3984.3435, -4325.859 , -4530.7676,
-4325.859 , -3210.2424, -3437.9192, -3597.293 , -3620.0608,
-2914.2627, -3460.687 , -2595.5151, -2276.7678, -4758.4443,
-4075.4143, -3597.293 , -3733.8992, -4530.7676, -4371.394 ,
-3916.0405, -4303.091 , -4212.0205, -3278.5454, -2527.2122,
-4189.2524, -3893.2727, -4052.6465, -3756.6667, -4098.182 ,
-3688.3638, -4212.0205, -3278.5454, -3688.3638, -3642.8284,
-2868.7273, -4007.1113, -4166.485 , -4462.465 , -3551.7576,
-4075.4143, -4371.394 , -4530.7676, -4576.303 , -3620.0608,
-3255.7778, -2641.0505, -3984.3435, -4052.6465, -3756.6667,
-3096.404 , -3620.0608, -3597.293 , -3119.1719, -4120.9497,
-3210.2424, -2504.4446, -2481.6768, -4371.394 , -3233.0103,
-3938.808 , -3210.2424, -3688.3638, -3733.8992, -3642.8284,
-3278.5454, -3119.1719, -2458.9092, -3210.2424, -3688.3638,
-3779.4343, -2709.3535, -2458.9092, -2777.6567, -3141.9395,
-2845.9597, -2436.1414, -2185.697 , -2572.7476, -3528.99 ,
-3756.6667, -3756.6667, -3642.8284, -2549.98 , -2299.5354,
-2663.8184, -2413.3738, -2322.303 , -2959.798 , -3574.5254,
-3756.6667]], dtype=float32)]

m3(the_static_quanted_onnx_model).zip
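A quick way to flag the failing outputs programmatically: pair each `session.run` result with its output name and report the ones that came back as `None`. This is a hypothetical helper, not part of the ONNX Runtime API.

```python
def find_none_outputs(output_names, results):
    """Return the names of outputs whose value came back as None."""
    return [name for name, value in zip(output_names, results)
            if value is None]

# Usage with an InferenceSession (names via session.get_outputs()):
#   names = [o.name for o in ort_session.get_outputs()]
#   print(find_none_outputs(names, result_prob))
```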

Urgency

Ideally to be solved within 2 days.

Platform

Linux

OS Version

gcc version 9.4.0 (Ubuntu 9.4.0-1ubuntu1~20.04.1)

ONNX Runtime Installation

Released Package

ONNX Runtime Version or Commit ID

1.13.1

ONNX Runtime API

Python

Architecture

X86

Execution Provider

Default CPU

Execution Provider Library Version

Intel(R) Core(TM) i7-9700K CPU @ 3.60GHz

@github-actions github-actions bot added the quantization issues related to quantization label Jan 31, 2023
@vonJJ vonJJ changed the title Some of the output is None after converting onnx model to int8 quanted onnx model. Some of the output is None after converting onnx model through quantize_static to int8 quanted onnx model. Jan 31, 2023
@RyanUnderhill
Member

@vonJJ Looks like the fix is submitted, are you able to verify the fix against the main branch?

@vonJJ
Author

vonJJ commented Feb 2, 2023

@vonJJ Looks like the fix is submitted, are you able to verify the fix against the main branch?

yes, the problem has been solved!
