
After pruning the model, is model type always NoneType? #164

Closed
bothrasumit opened this issue Nov 27, 2019 · 10 comments
@bothrasumit

I am trying to prune a pre-trained custom model.
After calling prune_low_magnitude, I see that the pruned model has type "NoneType".
Is this wrong?

@alanchiao

The model return type should be an instance of tf.keras.Model, the same as the input to prune_low_magnitude.

If you update your issue to provide the information from New Issue > Bug Report, I can reproduce your issue and see what's happening.

@gabrielibagon

Hi @alanchiao, I'm experiencing the same issue with pruning a MaskRCNN model from this repository: https://github.com/matterport/Mask_RCNN/

Note the requirements for MaskRCNN: "Python 3.4, TensorFlow 1.3, Keras 2.0.8"

Judging from this response from @s36srini, the issue could be the Keras version, though I do not believe the MaskRCNN model is Sequential. It does, however, use keras rather than tensorflow.keras, and converting the codebase does not seem straightforward. I'm hoping there is a way to support the keras model as-is or with minimal changes.

Here is a completely reproducible Colab notebook of building the MaskRCNN and pruning it (returning a NoneType object). I'd appreciate feedback on this:

https://colab.research.google.com/drive/1UyM6T4UXXM5G8w3-p1U2yN2iZ_wcTDJj

@alanchiao

@gabrielibagon : see the discussion in this issue. Similarly, this Mask_RCNN is not using tf.keras.

@bothrasumit
Author

> The model return type should be an instance of tf.keras.Model, the same as the input to prune_low_magnitude.
>
> If you update your issue to provide the information from New Issue > Bug Report, I can reproduce your issue and see what's happening.

Hi @alanchiao ,
The model which is passed to pruning API has type "keras.engine.training.Model" which is not tf.keras.Model.

How do I convert my model into the required type?
I have another question: can pruning be done both during training and post-training? I am trying to prune post-training.

@alanchiao

@bothrasumit : the main question I have is: was your model built via the Keras-team keras project or the TensorFlow Keras API (tf.keras)? The pruning API is only compatible with the latter. If it's the former, I'm not aware of a way to convert it to a tf.keras model. You will need to work with whoever trained the original model to move them to tf.keras and retrain, or use one of the readily available models from the TensorFlow ecosystem.
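A quick way to tell the two apart is an isinstance check on the object you are passing in (a sketch; `is_tf_keras_model` is a hypothetical helper name, and in the TF versions discussed here standalone-keras models do not pass this check):

```python
import tensorflow as tf

def is_tf_keras_model(model):
    # In the TF versions discussed in this thread, standalone-keras models
    # live under the `keras.*` modules and are NOT instances of
    # tf.keras.Model, while tf.keras-built models are.
    return isinstance(model, tf.keras.Model)

# A tf.keras model passes the check; a standalone-keras one would not.
print(is_tf_keras_model(tf.keras.Sequential()))
```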

Please see this overview page to answer your question on during training vs post training. It happens during training.

@bothrasumit
Author

bothrasumit commented Dec 6, 2019

> @bothrasumit : the main question I have is: was your model built via the Keras-team keras project or the TensorFlow Keras API. The pruning API is only compatible with the latter. If it's the former, I'm not aware of a way to convert it to a tf.keras model. You will need to work with whoever trained the original model to move them to tf.keras and retrain, or use one of the readily available models from the TensorFlow ecosystem.
>
> Please see this overview page to answer your question on during training vs post training. It happens during training.

Thank you @alanchiao, I was training the model with the former. I am trying with the latter now.
Is there any data on by how much the pruning API can reduce inference time?

@alanchiao

@bothrasumit: as the docs suggest, the current implementation only results in model size reduction.

I created #173 as an issue that you can follow. We have achieved significant latency improvements with usage within Google and have deployed the latency improvements in products. However, I won't put out any numbers until we have something ready for everyone to use.

@alanchiao alanchiao self-assigned this Dec 11, 2019
@bothrasumit
Author

@alanchiao Thank you for the update.
I am able to prune the model now.

@bothrasumit
Author

@alanchiao
Is there usually any difference in training time while pruning?

@alanchiao

There is some difference in training time, though it varies with the model type (less for convolutional models, due to their smaller weight-to-activation ratio, and more for LSTMs).

@alanchiao alanchiao added the technique:pruning Regarding tfmot.sparsity.keras APIs and docs label Feb 6, 2020