ValueError: Unknown layer: BilinearUpSampling2D #19
Hi @LetsExplore11, that is because BilinearUpSampling2D is a custom layer. You need to pass it to `load_model` via the `custom_objects` argument. I've updated the infer.py script on master, and now it should work.
Hi @JihongJu, thank you. I am completely new to Keras, so I have some basic questions.
@LetsExplore11 I haven't tested the metrics in score.py, and they were implemented with numpy.
Hi @JihongJu, thanks for your quick reply. As I am new to Keras, I do not fully understand your explanation; I beg your pardon. Could you please suggest a way forward?
@LetsExplore11 Ok. The current mean_IU has nothing to do with Keras; it is a function that manipulates numpy arrays. You can use a trained model to predict the segmentation. Once you have the segmentation, you can use `mean_IU` to compute the score.
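The flow described above, predict first, then score the label maps with a plain-numpy metric, can be sketched like this. `mean_iu` here is an illustrative reimplementation, not the repository's exact score.py code:

```python
import numpy as np

def mean_iu(y_true, y_pred, num_classes):
    """Mean intersection-over-union over the classes present.

    y_true and y_pred are integer label maps of the same shape,
    e.g. y_pred = model.predict(x).argmax(axis=-1) for one image.
    """
    ious = []
    for c in range(num_classes):
        t = (y_true == c)
        p = (y_pred == c)
        union = np.logical_or(t, p).sum()
        if union == 0:
            # Class absent from both maps: skip it rather than
            # counting it as a perfect (or zero) score.
            continue
        inter = np.logical_and(t, p).sum()
        ious.append(inter / union)
    return float(np.mean(ious))
```

Since the metric only sees label maps, remember to take `argmax` over the class axis of the network output before scoring.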
Dear @JihongJu, thank you very much for sharing your code. train.py runs smoothly, but when I then run infer.py, the following line raises "ValueError: Unknown layer: BilinearUpSampling2D":
```python
model = load_model('output/fcn_vgg16_weights.h5',
                   custom_objects={'CroppingLike2D': CroppingLike2D,
                                   # 'mean_categorical_crossentropy': mean_categorical_crossentropy,
                                   'flatten_categorical_crossentropy': flatten_categorical_crossentropy(classes=21)})
```
I have tried to find a solution but cannot make it work. Could you please suggest a way to resolve this error?
One more thing: do I need to write a new Python file to call the methods in score.py, or something else?
Thanks in advance.