After pruning the model, is model type always NoneType? #164
Comments
The return type should be an instance of tf.keras.Model, the same as the model passed to prune_low_magnitude. If you update your issue with the information from New Issue > Bug Report, I can reproduce the problem and see what's happening.
Hi @alanchiao, I'm experiencing the same issue when pruning a MaskRCNN model from this repository: https://github.com/matterport/Mask_RCNN/. Note the requirements for MaskRCNN: "Python 3.4, TensorFlow 1.3, Keras 2.0.8". Judging from this response from @s36srini, the issue could potentially be the Keras version, though I do not believe the MaskRCNN model is Sequential. It does however use
Here is a completely reproducible Colab notebook that builds the MaskRCNN model and prunes it (returning a NoneType object). I'd appreciate feedback on this: https://colab.research.google.com/drive/1UyM6T4UXXM5G8w3-p1U2yN2iZ_wcTDJj
@gabrielibagon: see the discussion in this issue. Similarly, this Mask_RCNN is not using tf.keras.
Hi @alanchiao, how do I convert my model into the required type?
@bothrasumit: the main question is whether your model was built with the Keras-team keras project or with the TensorFlow Keras API. The pruning API is only compatible with the latter. If it's the former, I'm not aware of a way to convert it to a tf.keras model; you will need to work with whoever trained the original model to move them to tf.keras and retrain, or use one of the readily available models from the TensorFlow ecosystem. Please see this overview page for your question on during-training vs. post-training: pruning happens during training.
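In the TF 1.x / early TF 2.x setting of this thread, one quick diagnostic is an isinstance check against tf.keras.Model; a model built with the standalone keras-team package fails it. A hedged sketch (the helper name is mine, not from the API):

```python
import tensorflow as tf

def built_with_tf_keras(model) -> bool:
    # The pruning API requires tf.keras models. A model constructed with the
    # standalone `keras` package is a different class and fails this check.
    return isinstance(model, tf.keras.Model)

model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(3,))])
print(built_with_tf_keras(model))
```

If the check returns False, the fix is to rebuild the model using tf.keras layers and reload or retrain the weights, as suggested above.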
Thank you @alanchiao. I was training the model with the former; I am trying the latter now.
@bothrasumit: as the docs suggest, the current implementation only results in model size reduction. I created #173 as an issue you can follow. We have achieved significant latency improvements with internal usage at Google and have deployed them in products, but I won't put out any numbers until we have something ready for everyone to use.
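The size reduction from pruning comes from compressibility: zeroed-out weights compress very well with standard tools like gzip. A rough illustration of the effect using only numpy and gzip (this simulates magnitude pruning on a random tensor; it is not the tfmot API itself):

```python
import gzip
import numpy as np

rng = np.random.default_rng(0)
dense = rng.standard_normal(100_000).astype(np.float32)

# Simulate magnitude pruning: zero out the 90% smallest-magnitude weights.
sparse = dense.copy()
threshold = np.quantile(np.abs(sparse), 0.9)
sparse[np.abs(sparse) < threshold] = 0.0

# Random floats barely compress; long runs of zero bytes compress well.
dense_gz = len(gzip.compress(dense.tobytes()))
sparse_gz = len(gzip.compress(sparse.tobytes()))
print(sparse_gz < dense_gz)  # True: the pruned tensor compresses far smaller
```

Latency gains, by contrast, require a runtime that can exploit sparsity, which is why they are tracked separately in #173.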
@alanchiao Thank you for the update. |
@alanchiao
There is some training time difference, though it should vary with the model type: less for convolutions, due to the smaller weight-to-activation ratio, and more for LSTMs.
I am trying to prune a pre-trained custom model.
After calling prune_low_magnitude, I see that the pruned model has type NoneType.
Is this wrong?