Releases: amarczew/pytorch_model_summary
0.1.2: Fixing minor bugs
- Fixing double counting of parameters (issue and PR #3)
- Submodule can be None and should be ignored by hierarchical_summary (#4)
Thank you so much @daniel347x, @TortoiseHam and @jonashaag for the feedback
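The fix in #4 can be pictured with a minimal sketch. This is not the library's actual code: the nested-dict "model" and the helper below are hypothetical stand-ins showing how a hierarchical traversal can skip submodules that are None.

```python
# Illustrative sketch (NOT pytorch_model_summary's implementation):
# a hierarchical walk that ignores None submodules, as the #4 fix
# describes. Modules are modeled as nested dicts for simplicity.
def hierarchical_names(module, prefix=""):
    """Recursively collect submodule names, skipping None children."""
    names = []
    for name, child in module.items():
        if child is None:  # a submodule may be None -> ignore it
            continue
        full = prefix + name
        names.append(full)
        if isinstance(child, dict):
            names.extend(hierarchical_names(child, full + "."))
    return names

# 'proj' is None and must not appear in the hierarchy
model = {"encoder": {"rnn": {}, "proj": None}, "head": {}}
print(hierarchical_names(model))  # ['encoder', 'encoder.rnn', 'head']
```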
0.1.1
First improved version of the `model summary` library.
This is an improved PyTorch library based on modelsummary. Like modelsummary, it works regardless of the number of input parameters.
Improvements:
- For user-defined PyTorch layers, `summary` can now show the layers inside them. One assumption: a user-defined layer is treated as trainable if any of its weights/params/biases is trainable (but only the trainable parameters are counted in the Tr. Params # column)
- Added a column counting only trainable parameters (useful when there are user-defined layers)
- Showing all input/output shapes, instead of only the first one. Example: an LSTM layer returns a Tensor and a tuple (Tensor, Tensor), so output_shape contains three sets of values
- Printing: table width is defined dynamically
- Added an option to include a hierarchical summary in the output
- Added the batch_size value (when provided) to the table footer
- Bug fixes
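Two of the improvements above can be sketched with small stand-ins. This is not the library's actual code: `FakeTensor` and `FakeParam` are hypothetical substitutes for torch objects, used only to show how a nested output (like an LSTM's) yields multiple shape entries, and how the "trainable if any param is trainable" assumption can be applied.

```python
# Illustrative sketch (NOT pytorch_model_summary's implementation).
from dataclasses import dataclass

@dataclass
class FakeTensor:
    shape: tuple  # stand-in for torch.Tensor.shape

def collect_shapes(output):
    """Flatten a nested output (tensor or tuple) into a list of
    shapes; an LSTM-like (out, (h_n, c_n)) gives three entries."""
    if isinstance(output, tuple):
        shapes = []
        for item in output:
            shapes.extend(collect_shapes(item))
        return shapes
    return [output.shape]

@dataclass
class FakeParam:
    numel: int          # number of elements in the parameter
    requires_grad: bool # whether the parameter is trainable

def layer_param_counts(params):
    """Total vs. trainable counts; the layer is treated as trainable
    if ANY of its parameters is trainable (the assumption above)."""
    total = sum(p.numel for p in params)
    trainable = sum(p.numel for p in params if p.requires_grad)
    return total, trainable, trainable > 0

# LSTM-like output: a tensor plus an (h_n, c_n) tuple -> three shapes
lstm_out = (FakeTensor((8, 10, 32)),
            (FakeTensor((1, 8, 32)), FakeTensor((1, 8, 32))))
print(collect_shapes(lstm_out))  # [(8, 10, 32), (1, 8, 32), (1, 8, 32)]

params = [FakeParam(4096, True), FakeParam(128, False)]
print(layer_param_counts(params))  # (4224, 4096, True)
```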