This repository has been archived by the owner on Jul 1, 2024. It is now read-only.

model.compile is failing when using inceptionV3 (irrespective of weights settings) #119

Closed
rajendra2 opened this issue Jun 18, 2018 · 9 comments

Comments

@rajendra2

I am trying the following code and it fails during model.compile. I am using:

keras-mxnet==2.1.6.1
mxnet-cu92==1.2.0

import keras
from keras.applications.inception_v3 import InceptionV3
from keras.layers.pooling import GlobalMaxPooling2D
from keras.layers.core import Dense
from keras.layers.core import Dropout
from keras.layers import Input
from keras.models import Model
from keras.regularizers import *
from keras.preprocessing.image import ImageDataGenerator

def get_model():
	aliases = {}
	Input_1 = Input(shape=(3, 256, 256), name='Input_1')
	InceptionV3_1_model = InceptionV3(include_top=False, weights=None, input_tensor=Input_1)
	InceptionV3_1 = InceptionV3_1_model(Input_1)
	GlobalMaxPooling2D_1 = GlobalMaxPooling2D(name='GlobalMaxPooling2D_1')(InceptionV3_1)
	Dense_1 = Dense(name='Dense_1', units=512, activation='relu')(GlobalMaxPooling2D_1)
	Dropout_1 = Dropout(name='Dropout_1', rate=0.5)(Dense_1)
	Dense_2 = Dense(name='Dense_2', units=2, activation='softmax')(Dropout_1)

	model = Model([Input_1], [Dense_2])
	return model

model = get_model()
model.compile(loss='categorical_crossentropy', optimizer='adadelta')

ValueError Traceback (most recent call last)
in <module>()
----> 1 model.compile(loss='categorical_crossentropy', optimizer='adadelta')

/usr/local/lib/python3.6/dist-packages/keras/backend/mxnet_backend.py in compile(self, optimizer, loss, metrics, loss_weights, sample_weight_mode, **kwargs)
4480 default_bucket_key='pred',
4481 context=self._context,
-> 4482 fixed_param_names=self._fixed_weights)
4483 set_model(self)
4484 self.compiled = True

/usr/local/lib/python3.6/dist-packages/mxnet/module/bucketing_module.py in __init__(self, sym_gen, default_bucket_key, logger, context, work_load_list, fixed_param_names, state_names, group2ctxs, compression_params)
82 _check_input_names(symbol, label_names, "label", False)
83 _check_input_names(symbol, state_names, "state", True)
---> 84 _check_input_names(symbol, fixed_param_names, "fixed_param", True)
85
86 self._compression_params = compression_params

/usr/local/lib/python3.6/dist-packages/mxnet/module/base_module.py in _check_input_names(symbol, names, typename, throw)
50 typename, str(names), name, '\n\t'.join(candidates))
51 if throw:
---> 52 raise ValueError(msg)
53 else:
54 warnings.warn(msg)

ValueError: You created Module with Module(..., fixed_param_names=['batchnorm248_gamma', 'batchnorm342_gamma', 'batchnorm212_gamma', 'batchnorm220_gamma', 'batchnorm252_gamma', 'batchnorm266_gamma', 'batchnorm232_gamma', 'batchnorm356_gamma', 'batchnorm192_gamma', 'batchnorm224_gamma', 'batchnorm204_gamma', 'batchnorm222_gamma', 'batchnorm250_gamma', 'batchnorm314_gamma', 'batchnorm270_gamma', 'batchnorm254_gamma', 'batchnorm304_gamma', 'batchnorm236_gamma', 'batchnorm272_gamma', 'batchnorm202_gamma', 'batchnorm298_gamma', 'batchnorm326_gamma', 'batchnorm216_gamma', 'batchnorm288_gamma', 'batchnorm346_gamma', 'batchnorm366_gamma', 'batchnorm300_gamma', 'batchnorm336_gamma', 'batchnorm260_gamma', 'batchnorm208_gamma', 'batchnorm350_gamma', 'batchnorm338_gamma', 'batchnorm258_gamma', 'batchnorm200_gamma', 'batchnorm296_gamma', 'batchnorm284_gamma', 'batchnorm280_gamma', 'batchnorm334_gamma', 'batchnorm322_gamma', 'batchnorm196_gamma', 'batchnorm262_gamma', 'batchnorm240_gamma', 'batchnorm308_gamma', 'batchnorm306_gamma', 'batchnorm206_gamma', 'batchnorm276_gamma', 'batchnorm190_gamma', 'batchnorm344_gamma', 'batchnorm234_gamma', 'batchnorm268_gamma', 'batchnorm242_gamma', 'batchnorm218_gamma', 'batchnorm274_gamma', 'batchnorm320_gamma', 'batchnorm290_gamma', 'batchnorm278_gamma', 'batchnorm286_gamma', 'batchnorm362_gamma', 'batchnorm256_gamma', 'batchnorm316_gamma', 'batchnorm244_gamma', 'batchnorm246_gamma', 'batchnorm340_gamma', 'batchnorm328_gamma', 'batchnorm374_gamma', 'batchnorm194_gamma', 'batchnorm368_gamma', 'batchnorm358_gamma', 'batchnorm302_gamma', 'batchnorm312_gamma', 'batchnorm332_gamma', 'batchnorm370_gamma', 'batchnorm364_gamma', 'batchnorm188_gamma', 'batchnorm324_gamma', 'batchnorm238_gamma', 'batchnorm292_gamma', 'batchnorm330_gamma', 'batchnorm294_gamma', 'batchnorm230_gamma', 'batchnorm354_gamma', 'batchnorm226_gamma', 'batchnorm228_gamma', 'batchnorm372_gamma', 'batchnorm348_gamma', 'batchnorm282_gamma', 'batchnorm214_gamma', 
'batchnorm264_gamma', 'batchnorm198_gamma', 'batchnorm318_gamma', 'batchnorm352_gamma', 'batchnorm210_gamma', 'batchnorm360_gamma', 'batchnorm310_gamma']) but input with name 'batchnorm248_gamma' is not found in symbol.list_arguments(). Did you mean one of:
/Input_11
conv2d_1/kernel1
batch_normalization_1/beta1
conv2d_2/kernel1
batch_normalization_2/beta1
conv2d_3/kernel1
batch_normalization_3/beta1
conv2d_4/kernel1
batch_normalization_4/beta1
conv2d_5/kernel1
batch_normalization_5/beta1
conv2d_6/kernel1
batch_normalization_6/beta1
conv2d_7/kernel1
batch_normalization_7/beta1
conv2d_8/kernel1
batch_normalization_8/beta1
conv2d_9/kernel1
batch_normalization_9/beta1
conv2d_10/kernel1
batch_normalization_10/beta1
conv2d_11/kernel1
batch_normalization_11/beta1
conv2d_12/kernel1
batch_normalization_12/beta1
conv2d_13/kernel1
batch_normalization_13/beta1
conv2d_14/kernel1
batch_normalization_14/beta1
conv2d_15/kernel1
batch_normalization_15/beta1
conv2d_16/kernel1
batch_normalization_16/beta1
conv2d_17/kernel1
batch_normalization_17/beta1
conv2d_18/kernel1
batch_normalization_18/beta1
conv2d_19/kernel1
batch_normalization_19/beta1
conv2d_20/kernel1
batch_normalization_20/beta1
conv2d_21/kernel1
batch_normalization_21/beta1
conv2d_22/kernel1
batch_normalization_22/beta1
conv2d_23/kernel1
batch_normalization_23/beta1
conv2d_24/kernel1
batch_normalization_24/beta1
conv2d_25/kernel1
batch_normalization_25/beta1
conv2d_26/kernel1
batch_normalization_26/beta1
conv2d_27/kernel1
batch_normalization_27/beta1
conv2d_28/kernel1
batch_normalization_28/beta1
conv2d_29/kernel1
batch_normalization_29/beta1
conv2d_30/kernel1
batch_normalization_30/beta1
conv2d_31/kernel1
batch_normalization_31/beta1
conv2d_32/kernel1
batch_normalization_32/beta1
conv2d_33/kernel1
batch_normalization_33/beta1
conv2d_34/kernel1
batch_normalization_34/beta1
conv2d_35/kernel1
batch_normalization_35/beta1
conv2d_36/kernel1
batch_normalization_36/beta1
conv2d_37/kernel1
batch_normalization_37/beta1
conv2d_38/kernel1
batch_normalization_38/beta1
conv2d_39/kernel1
batch_normalization_39/beta1
conv2d_40/kernel1
batch_normalization_40/beta1
conv2d_41/kernel1
batch_normalization_41/beta1
conv2d_42/kernel1
batch_normalization_42/beta1
conv2d_43/kernel1
batch_normalization_43/beta1
conv2d_44/kernel1
batch_normalization_44/beta1
conv2d_45/kernel1
batch_normalization_45/beta1
conv2d_46/kernel1
batch_normalization_46/beta1
conv2d_47/kernel1
batch_normalization_47/beta1
conv2d_48/kernel1
batch_normalization_48/beta1
conv2d_49/kernel1
batch_normalization_49/beta1
conv2d_50/kernel1
batch_normalization_50/beta1
conv2d_51/kernel1
batch_normalization_51/beta1
conv2d_52/kernel1
batch_normalization_52/beta1
conv2d_53/kernel1
batch_normalization_53/beta1
conv2d_54/kernel1
batch_normalization_54/beta1
conv2d_55/kernel1
batch_normalization_55/beta1
conv2d_56/kernel1
batch_normalization_56/beta1
conv2d_57/kernel1
batch_normalization_57/beta1
conv2d_58/kernel1
batch_normalization_58/beta1
conv2d_59/kernel1
batch_normalization_59/beta1
conv2d_60/kernel1
batch_normalization_60/beta1
conv2d_61/kernel1
batch_normalization_61/beta1
conv2d_62/kernel1
batch_normalization_62/beta1
conv2d_63/kernel1
batch_normalization_63/beta1
conv2d_64/kernel1
batch_normalization_64/beta1
conv2d_65/kernel1
batch_normalization_65/beta1
conv2d_66/kernel1
batch_normalization_66/beta1
conv2d_67/kernel1
batch_normalization_67/beta1
conv2d_68/kernel1
batch_normalization_68/beta1
conv2d_69/kernel1
batch_normalization_69/beta1
conv2d_70/kernel1
batch_normalization_70/beta1
conv2d_71/kernel1
batch_normalization_71/beta1
conv2d_72/kernel1
batch_normalization_72/beta1
conv2d_73/kernel1
batch_normalization_73/beta1
conv2d_74/kernel1
batch_normalization_74/beta1
conv2d_75/kernel1
batch_normalization_75/beta1
conv2d_76/kernel1
batch_normalization_76/beta1
conv2d_77/kernel1
batch_normalization_77/beta1
conv2d_78/kernel1
batch_normalization_78/beta1
conv2d_79/kernel1
batch_normalization_79/beta1
conv2d_80/kernel1
batch_normalization_80/beta1
conv2d_81/kernel1
batch_normalization_81/beta1
conv2d_82/kernel1
batch_normalization_82/beta1
conv2d_83/kernel1
batch_normalization_83/beta1
conv2d_84/kernel1
batch_normalization_84/beta1
conv2d_85/kernel1
batch_normalization_85/beta1
conv2d_86/kernel1
batch_normalization_86/beta1
conv2d_87/kernel1
batch_normalization_87/beta1
conv2d_88/kernel1
batch_normalization_88/beta1
conv2d_89/kernel1
batch_normalization_89/beta1
conv2d_90/kernel1
batch_normalization_90/beta1
conv2d_91/kernel1
batch_normalization_91/beta1
conv2d_92/kernel1
batch_normalization_92/beta1
conv2d_93/kernel1
batch_normalization_93/beta1
conv2d_94/kernel1
batch_normalization_94/beta1
Dense_1/kernel1
Dense_1/bias1
Dense_2/kernel1
Dense_2/bias1
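To make the failure mode concrete, here is a minimal sketch (my own reconstruction for illustration, not MXNet's actual source) of the check in mxnet/module/base_module.py that raises this ValueError: every name passed as fixed_param_names must appear among the symbol's argument names, and here the fixed BatchNorm gammas carry auto-generated MXNet names while the symbol's arguments carry the Keras layer names, so the very first lookup fails.

```python
def check_input_names(arg_names, names):
    """Rough mimic of _check_input_names(..., throw=True):
    every requested name must be a known symbol argument."""
    for name in names:
        if name not in arg_names:
            raise ValueError(
                "input with name '%s' is not found in symbol.list_arguments(). "
                "Did you mean one of:\n\t%s" % (name, "\n\t".join(arg_names)))

# The mismatch behind this issue: keras-mxnet collects the fixed
# (scale=False) BatchNorm gammas under generated MXNet names, but the
# compiled symbol lists its arguments under the Keras layer names.
symbol_args = ["conv2d_1/kernel1", "batch_normalization_1/beta1"]
try:
    check_input_names(symbol_args, ["batchnorm248_gamma"])
except ValueError as err:
    print(err)  # same "is not found in symbol.list_arguments()" message
```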

@kalyc

kalyc commented Jun 18, 2018

Thanks for submitting this issue @rajendra2.
@roywei could you add the labels "Bug" and "Training" to this?

@roywei

roywei commented Jun 18, 2018

@rajendra2 I believe the culprit is the conv2d_bn() helper used in keras.applications.inception_v3.

Could you try removing all the arguments from the BatchNormalization call at line 82, like this:

x = BatchNormalization()(x)

That works on my end; you may need to re-install Keras from source after changing inception_v3.py.
We will fix this in our next release.

@rajendra2
Author

I tried removing all the params from BatchNormalization (line 82 in /usr/local/lib/python3.6/dist-packages/keras/applications/inception_v3.py). It didn't make any difference.

I am looking forward to the next release. Any tentative date?

@roywei

roywei commented Jun 19, 2018

@rajendra2 Using the workaround works on my side. The next release is on June 22. Meanwhile, you can clone the project, make the change to inception_v3.py, and install from source with python setup.py install.

@rajendra2
Author

@roywei I just tried the new release (keras-mxnet==2.2.0). I still get the same error when I run my original code in Jupyter. I also tried removing all the params from BatchNormalization (/usr/local/lib/python3.6/dist-packages/keras_applications/inception_v3.py), but that is not helping.

Does my code work in your environment with the pypi 2.2.0 release ?

@roywei

roywei commented Jun 23, 2018

@rajendra2 Sorry, unfortunately we were not able to change this in 2.2.0. We found out that in Keras 2.2.0 the applications module was moved out into a separate repo under keras-team, so we are not able to make any custom changes to it for now. We will investigate a workaround to modify the applications modules for MXNet.
However, re-installing from source with the change I mentioned should work for now.

@roywei

roywei commented Jul 12, 2018

Root cause found: #136

@roywei

roywei commented Jul 23, 2018

#137 should fix this, pending merge back to the master branch.

@roywei

roywei commented Aug 20, 2018

Closing as fixed in #148.

@roywei roywei closed this as completed Aug 20, 2018