
Releases: amarczew/pytorch_model_summary

0.1.2: Fixing minor bugs

30 Aug 18:28

Fixes issue Double Counting Parameters #3 and merges PR Submodule can be None and should be ignored by hierarchical_summary #4.

Thank you so much @daniel347x, @TortoiseHam and @jonashaag for the feedback.

0.1.1

24 Dec 16:31
Making sure that weights and BN params won't be changed during net str…

First improved version of `model summary` library

24 Dec 16:14

This is an improved PyTorch version of the modelsummary library. Like modelsummary, it does not care about the number of input parameters!

Improvements:

  • For user-defined PyTorch layers, the summary can now show the layers inside them

    • Assumption: for a user-defined layer, if any weight/param/bias is trainable, the whole layer is assumed to be trainable (but only the trainable params are counted in Tr. Params #)
  • Adding a column that counts only trainable parameters (useful when there are user-defined layers)

  • Showing all input/output shapes, instead of only the first one

    • Example: an LSTM layer returns a Tensor and a tuple (Tensor, Tensor), so output_shape has three sets of values
  • Printing: table width is defined dynamically

  • Adding an option to include a hierarchical summary in the output (see the usage sketch after this list)

  • Adding the batch_size value (when provided) to the table footer

  • Fixing bugs
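
A minimal usage sketch tying the options above together, assuming the package exposes a `summary` function accepting dummy input tensors plus `show_hierarchical` and `batch_size` keyword arguments (names inferred from the feature list above, not confirmed by these notes):

```python
import torch
import torch.nn as nn
from pytorch_model_summary import summary


class SmallNet(nn.Module):
    """A tiny model mixing built-in and user-defined submodules."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(16 * 16 * 16, 10)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))


# The dummy input is used only to trace shapes; its values are unused.
print(summary(
    SmallNet(),
    torch.zeros(1, 3, 32, 32),
    show_hierarchical=True,  # assumed flag for the hierarchical summary option
    batch_size=1,            # assumed argument echoed in the table footer
))
```

With a model like this, the summary table should list the layers inside `SmallNet`, report total and trainable parameter counts, and (with the assumed flags) append a hierarchical view and the provided batch size in the footer.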