Releases: eagomez2/moduleprofiler

moduleprofiler v0.0.4

25 Sep 23:20
f3cfd52
  • Update documentation.
  • Add ops estimation for torch.nn.BatchNorm1d and torch.nn.BatchNorm2d.
  • Add exclude_from_ops option to ModuleProfiler.
  • Fix NaN bit sizes and data types for modules without trainable or nontrainable parameters.

moduleprofiler v0.0.3

25 Sep 20:50
f1d606b
  • Added torch.nn.LayerNorm reference.
  • Updated torch.nn.LSTM documentation.
  • Fixed torch.nn.LayerNorm estimation.

moduleprofiler v0.0.2

11 Sep 21:03
  • Updated documentation.
  • Fixed ConvTranspose1d and ConvTranspose2d overestimation of additions per filter.

moduleprofiler v0.0.1

09 Sep 21:26

First release of moduleprofiler, a free open-source package to profile torch.nn.Module instances and obtain useful information for designing a model that fits your needs and constraints at development time.

With moduleprofiler you can:

  • Calculate the number of parameters of your model.
  • Trace the input and output sizes of each component of your model.
  • Estimate the number of operations your model performs in a forward pass.
  • Calculate per module and total inference time.
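To illustrate the first capability, the snippet below counts a model's parameters with plain PyTorch. This is only a conceptual sketch of what moduleprofiler automates per module; it does not use moduleprofiler's own API, and the toy model is made up for the example.

```python
import torch

# Hypothetical toy model (illustrative only).
model = torch.nn.Sequential(
    torch.nn.Linear(10, 5),
    torch.nn.ReLU(),
    torch.nn.Linear(5, 1),
)

# Trainable parameters: weights and biases of both Linear layers.
# Linear(10, 5) -> 10*5 + 5 = 55, Linear(5, 1) -> 5*1 + 1 = 6.
num_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(num_params)  # 61
```

moduleprofiler extends this idea by reporting such figures per module, alongside input/output sizes, operation counts, and timing.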

All results can be obtained in one of the following formats:

  • dict (default output format)
  • pandas.DataFrame (to perform further calculations or filtering in your code)
  • html (to export as webpage)
  • LaTeX (to include in your publications)

[Online documentation | Tutorial]