model.compile is failing when using inceptionV3 (irrespective of weights settings) #119
Comments
Thanks for submitting this issue @rajendra2
@rajendra2 I believe the problem is in the conv2d_bn() used in keras.applications.inception_v3. Could you try removing all the params from the BatchNormalization call at line 82, like this:
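The suggested edit can be sketched as follows. This is a simplified illustration, not the stock Keras source: FakeBN is a hypothetical stand-in for keras.layers.BatchNormalization, used only to show which constructor arguments the workaround drops.

```python
class FakeBN:
    """Stand-in for keras.layers.BatchNormalization (illustration only)."""
    def __init__(self, axis=-1, scale=True, name=None):
        self.axis, self.scale, self.name = axis, scale, name

    def __call__(self, x):
        return x  # identity; a real layer would normalize x

# Original call inside conv2d_bn() (simplified): scale=False disables the
# gamma parameter, which keras-mxnet then reports as a fixed parameter:
bn_before = FakeBN(axis=3, scale=False, name="conv2d_1_bn")

# Suggested workaround: drop all arguments so the layer is built with
# defaults (scale stays enabled, the backend picks the parameter names):
bn_after = FakeBN()

print(bn_before.scale, bn_after.scale)
```

After editing the installed inception_v3.py this way, the model has to be rebuilt for the change to take effect.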
It works on my end; you may want to re-install Keras from source after changing inception_v3.
I tried removing all the params from BatchNormalization (line 82 in /usr/local/lib/python3.6/dist-packages/keras/applications/inception_v3.py). I am looking forward to the next release. Any tentative date?
@rajendra2 The workaround works on my side. The next release is on June 22. Meanwhile you can try to clone the project, make the change on Inception, and install from source.
@roywei I just tried the new release (keras-mxnet==2.2.0). I still get the same error when I run my original code in Jupyter. I also tried removing all the params from BatchNormalization (/usr/local/lib/python3.6/dist-packages/keras_applications/inception_v3.py), but that is not helping. Does my code work in your environment with the PyPI 2.2.0 release?
@rajendra2 Sorry, unfortunately we were not able to change this in 2.2.0. We found out that in Keras 2.2.0 the applications module was moved out into a separate repo under keras-team, so we are not able to make any custom changes there for now. We will investigate a workaround to modify the applications module for MXNet.
Root cause found: #136
#137 should fix this, pending merge back to the master branch.
Closing as fixed in #148.
I am trying the following code and it is failing during model.compile. I am using:
keras-mxnet==2.1.6.1
mxnet-cu92==1.2.0
ValueError Traceback (most recent call last)
in ()
----> 1 model.compile(loss='categorical_crossentropy', optimizer='adadelta')
/usr/local/lib/python3.6/dist-packages/keras/backend/mxnet_backend.py in compile(self, optimizer, loss, metrics, loss_weights, sample_weight_mode, **kwargs)
4480 default_bucket_key='pred',
4481 context=self._context,
-> 4482 fixed_param_names=self._fixed_weights)
4483 set_model(self)
4484 self.compiled = True
/usr/local/lib/python3.6/dist-packages/mxnet/module/bucketing_module.py in __init__(self, sym_gen, default_bucket_key, logger, context, work_load_list, fixed_param_names, state_names, group2ctxs, compression_params)
82 _check_input_names(symbol, label_names, "label", False)
83 _check_input_names(symbol, state_names, "state", True)
---> 84 _check_input_names(symbol, fixed_param_names, "fixed_param", True)
85
86 self._compression_params = compression_params
/usr/local/lib/python3.6/dist-packages/mxnet/module/base_module.py in _check_input_names(symbol, names, typename, throw)
50 typename, str(names), name, '\n\t'.join(candidates))
51 if throw:
---> 52 raise ValueError(msg)
53 else:
54 warnings.warn(msg)
ValueError: You created Module with Module(..., fixed_param_names=['batchnorm248_gamma', 'batchnorm342_gamma', 'batchnorm212_gamma', 'batchnorm220_gamma', 'batchnorm252_gamma', 'batchnorm266_gamma', 'batchnorm232_gamma', 'batchnorm356_gamma', 'batchnorm192_gamma', 'batchnorm224_gamma', 'batchnorm204_gamma', 'batchnorm222_gamma', 'batchnorm250_gamma', 'batchnorm314_gamma', 'batchnorm270_gamma', 'batchnorm254_gamma', 'batchnorm304_gamma', 'batchnorm236_gamma', 'batchnorm272_gamma', 'batchnorm202_gamma', 'batchnorm298_gamma', 'batchnorm326_gamma', 'batchnorm216_gamma', 'batchnorm288_gamma', 'batchnorm346_gamma', 'batchnorm366_gamma', 'batchnorm300_gamma', 'batchnorm336_gamma', 'batchnorm260_gamma', 'batchnorm208_gamma', 'batchnorm350_gamma', 'batchnorm338_gamma', 'batchnorm258_gamma', 'batchnorm200_gamma', 'batchnorm296_gamma', 'batchnorm284_gamma', 'batchnorm280_gamma', 'batchnorm334_gamma', 'batchnorm322_gamma', 'batchnorm196_gamma', 'batchnorm262_gamma', 'batchnorm240_gamma', 'batchnorm308_gamma', 'batchnorm306_gamma', 'batchnorm206_gamma', 'batchnorm276_gamma', 'batchnorm190_gamma', 'batchnorm344_gamma', 'batchnorm234_gamma', 'batchnorm268_gamma', 'batchnorm242_gamma', 'batchnorm218_gamma', 'batchnorm274_gamma', 'batchnorm320_gamma', 'batchnorm290_gamma', 'batchnorm278_gamma', 'batchnorm286_gamma', 'batchnorm362_gamma', 'batchnorm256_gamma', 'batchnorm316_gamma', 'batchnorm244_gamma', 'batchnorm246_gamma', 'batchnorm340_gamma', 'batchnorm328_gamma', 'batchnorm374_gamma', 'batchnorm194_gamma', 'batchnorm368_gamma', 'batchnorm358_gamma', 'batchnorm302_gamma', 'batchnorm312_gamma', 'batchnorm332_gamma', 'batchnorm370_gamma', 'batchnorm364_gamma', 'batchnorm188_gamma', 'batchnorm324_gamma', 'batchnorm238_gamma', 'batchnorm292_gamma', 'batchnorm330_gamma', 'batchnorm294_gamma', 'batchnorm230_gamma', 'batchnorm354_gamma', 'batchnorm226_gamma', 'batchnorm228_gamma', 'batchnorm372_gamma', 'batchnorm348_gamma', 'batchnorm282_gamma', 'batchnorm214_gamma', 
'batchnorm264_gamma', 'batchnorm198_gamma', 'batchnorm318_gamma', 'batchnorm352_gamma', 'batchnorm210_gamma', 'batchnorm360_gamma', 'batchnorm310_gamma']) but input with name 'batchnorm248_gamma' is not found in symbol.list_arguments(). Did you mean one of:
/Input_11
conv2d_1/kernel1
batch_normalization_1/beta1
conv2d_2/kernel1
batch_normalization_2/beta1
conv2d_3/kernel1
batch_normalization_3/beta1
conv2d_4/kernel1
batch_normalization_4/beta1
conv2d_5/kernel1
batch_normalization_5/beta1
conv2d_6/kernel1
batch_normalization_6/beta1
conv2d_7/kernel1
batch_normalization_7/beta1
conv2d_8/kernel1
batch_normalization_8/beta1
conv2d_9/kernel1
batch_normalization_9/beta1
conv2d_10/kernel1
batch_normalization_10/beta1
conv2d_11/kernel1
batch_normalization_11/beta1
conv2d_12/kernel1
batch_normalization_12/beta1
conv2d_13/kernel1
batch_normalization_13/beta1
conv2d_14/kernel1
batch_normalization_14/beta1
conv2d_15/kernel1
batch_normalization_15/beta1
conv2d_16/kernel1
batch_normalization_16/beta1
conv2d_17/kernel1
batch_normalization_17/beta1
conv2d_18/kernel1
batch_normalization_18/beta1
conv2d_19/kernel1
batch_normalization_19/beta1
conv2d_20/kernel1
batch_normalization_20/beta1
conv2d_21/kernel1
batch_normalization_21/beta1
conv2d_22/kernel1
batch_normalization_22/beta1
conv2d_23/kernel1
batch_normalization_23/beta1
conv2d_24/kernel1
batch_normalization_24/beta1
conv2d_25/kernel1
batch_normalization_25/beta1
conv2d_26/kernel1
batch_normalization_26/beta1
conv2d_27/kernel1
batch_normalization_27/beta1
conv2d_28/kernel1
batch_normalization_28/beta1
conv2d_29/kernel1
batch_normalization_29/beta1
conv2d_30/kernel1
batch_normalization_30/beta1
conv2d_31/kernel1
batch_normalization_31/beta1
conv2d_32/kernel1
batch_normalization_32/beta1
conv2d_33/kernel1
batch_normalization_33/beta1
conv2d_34/kernel1
batch_normalization_34/beta1
conv2d_35/kernel1
batch_normalization_35/beta1
conv2d_36/kernel1
batch_normalization_36/beta1
conv2d_37/kernel1
batch_normalization_37/beta1
conv2d_38/kernel1
batch_normalization_38/beta1
conv2d_39/kernel1
batch_normalization_39/beta1
conv2d_40/kernel1
batch_normalization_40/beta1
conv2d_41/kernel1
batch_normalization_41/beta1
conv2d_42/kernel1
batch_normalization_42/beta1
conv2d_43/kernel1
batch_normalization_43/beta1
conv2d_44/kernel1
batch_normalization_44/beta1
conv2d_45/kernel1
batch_normalization_45/beta1
conv2d_46/kernel1
batch_normalization_46/beta1
conv2d_47/kernel1
batch_normalization_47/beta1
conv2d_48/kernel1
batch_normalization_48/beta1
conv2d_49/kernel1
batch_normalization_49/beta1
conv2d_50/kernel1
batch_normalization_50/beta1
conv2d_51/kernel1
batch_normalization_51/beta1
conv2d_52/kernel1
batch_normalization_52/beta1
conv2d_53/kernel1
batch_normalization_53/beta1
conv2d_54/kernel1
batch_normalization_54/beta1
conv2d_55/kernel1
batch_normalization_55/beta1
conv2d_56/kernel1
batch_normalization_56/beta1
conv2d_57/kernel1
batch_normalization_57/beta1
conv2d_58/kernel1
batch_normalization_58/beta1
conv2d_59/kernel1
batch_normalization_59/beta1
conv2d_60/kernel1
batch_normalization_60/beta1
conv2d_61/kernel1
batch_normalization_61/beta1
conv2d_62/kernel1
batch_normalization_62/beta1
conv2d_63/kernel1
batch_normalization_63/beta1
conv2d_64/kernel1
batch_normalization_64/beta1
conv2d_65/kernel1
batch_normalization_65/beta1
conv2d_66/kernel1
batch_normalization_66/beta1
conv2d_67/kernel1
batch_normalization_67/beta1
conv2d_68/kernel1
batch_normalization_68/beta1
conv2d_69/kernel1
batch_normalization_69/beta1
conv2d_70/kernel1
batch_normalization_70/beta1
conv2d_71/kernel1
batch_normalization_71/beta1
conv2d_72/kernel1
batch_normalization_72/beta1
conv2d_73/kernel1
batch_normalization_73/beta1
conv2d_74/kernel1
batch_normalization_74/beta1
conv2d_75/kernel1
batch_normalization_75/beta1
conv2d_76/kernel1
batch_normalization_76/beta1
conv2d_77/kernel1
batch_normalization_77/beta1
conv2d_78/kernel1
batch_normalization_78/beta1
conv2d_79/kernel1
batch_normalization_79/beta1
conv2d_80/kernel1
batch_normalization_80/beta1
conv2d_81/kernel1
batch_normalization_81/beta1
conv2d_82/kernel1
batch_normalization_82/beta1
conv2d_83/kernel1
batch_normalization_83/beta1
conv2d_84/kernel1
batch_normalization_84/beta1
conv2d_85/kernel1
batch_normalization_85/beta1
conv2d_86/kernel1
batch_normalization_86/beta1
conv2d_87/kernel1
batch_normalization_87/beta1
conv2d_88/kernel1
batch_normalization_88/beta1
conv2d_89/kernel1
batch_normalization_89/beta1
conv2d_90/kernel1
batch_normalization_90/beta1
conv2d_91/kernel1
batch_normalization_91/beta1
conv2d_92/kernel1
batch_normalization_92/beta1
conv2d_93/kernel1
batch_normalization_93/beta1
conv2d_94/kernel1
batch_normalization_94/beta1
Dense_1/kernel1
Dense_1/bias1
Dense_2/kernel1
Dense_2/bias1
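The failure above can be reproduced in isolation. MXNet's module setup checks every entry of fixed_param_names against symbol.list_arguments(), and the gamma names keras-mxnet generates for scale=False layers (batchnormNNN_gamma) never appear among the Keras-style argument names (batch_normalization_N/beta1, conv2d_N/kernel1). Below is a minimal sketch of that check, modeled only on the observable behavior in the traceback; it is not the actual mxnet.module.base_module._check_input_names code.

```python
def check_fixed_param_names(arg_names, fixed_param_names):
    """Mimic the check from the traceback: every fixed parameter name
    must be an argument of the symbol, otherwise raise ValueError."""
    for name in fixed_param_names:
        if name not in arg_names:
            raise ValueError(
                "input with name '%s' is not found in "
                "symbol.list_arguments()" % name)

# Argument names as Keras creates them (taken from the suggestion list
# in the error message above):
symbol_args = ["/Input_11", "conv2d_1/kernel1", "batch_normalization_1/beta1"]

# Fixed-parameter names as keras-mxnet generates them for scale=False
# BatchNormalization layers (from the error message above):
fixed = ["batchnorm248_gamma"]

try:
    check_fixed_param_names(symbol_args, fixed)
except ValueError as err:
    print(err)
```

The mismatch is purely one of naming conventions, which is why dropping the BatchNormalization arguments (so no gamma ends up in fixed_param_names) sidesteps the error.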