
Releases: utterworks/fast-bert

Added LR Finder for Text classification task

09 Jul 12:05

Release 1.8.0: a new learning-rate finder integrated with the learner object.
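The finder follows the standard learning-rate range test: run a short training sweep while the learning rate grows exponentially, record the loss at each step, and pick a rate just before the loss diverges. A minimal, library-free sketch of the exponential sweep (the function name and bounds here are illustrative, not fast-bert's actual API):

```python
def lr_schedule(start_lr, end_lr, num_steps):
    """Exponentially interpolate from start_lr to end_lr over num_steps.

    This is the schedule an LR range test sweeps while logging the
    training loss at each step.
    """
    ratio = end_lr / start_lr
    return [start_lr * ratio ** (i / (num_steps - 1)) for i in range(num_steps)]

# Sweep from a very small to a very large rate over 100 mini-batches.
lrs = lr_schedule(1e-7, 1.0, 100)
```

In practice you would step the optimizer through these rates, log the loss after each mini-batch, and choose a learning rate roughly an order of magnitude below the point where the loss starts to climb.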

Switch text classification to AutoModel

14 Apr 21:58
53773ab

We have switched to AutoModel for multi-class classification. This lets you train any pretrained model architecture for text classification.
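The benefit of AutoModel-style loading is that one entry point dispatches to the right architecture-specific class from a model-type key, so no per-architecture code is needed. A conceptual sketch of that dispatch (class and registry names are hypothetical, not transformers' or fast-bert's internals):

```python
# Illustrative only: the real transformers.AutoModel resolves the class
# from the checkpoint's config file rather than a hand-built registry.
class BertForClassification:
    model_type = "bert"

class RobertaForClassification:
    model_type = "roberta"

class DistilBertForClassification:
    model_type = "distilbert"

_MODEL_REGISTRY = {cls.model_type: cls for cls in (
    BertForClassification, RobertaForClassification, DistilBertForClassification)}

def auto_model_for_classification(model_type: str):
    """Return the classifier class registered for model_type."""
    try:
        return _MODEL_REGISTRY[model_type]
    except KeyError:
        raise ValueError(f"Unknown model type: {model_type!r}")

model_cls = auto_model_for_classification("roberta")
```

Adding support for a new architecture then amounts to registering one more class; callers keep using the same entry point.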

Includes Abstractive summarisation

22 Dec 14:27
10ee7a0

Now supports an initial version of abstractive summarisation inference, fast-bert style.

In an upcoming release, you will be able to use a custom language model, fine-tuned on your own corpus, as the encoder model.

Bug fixes

14 Dec 13:10

Fixed several bugs related to fastai dependencies.

New model architectures - ALBERT, CamemBERT, DistilRoberta

28 Nov 23:57

Three new models have been added in v1.5.0:

    • ALBERT (PyTorch) (from Google Research and the Toyota Technological Institute at Chicago), released with the paper ALBERT: A Lite BERT for Self-supervised Learning of Language Representations by Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, and Radu Soricut.
    • CamemBERT (PyTorch) (from Facebook AI Research, INRIA, and La Sorbonne Université), the first large-scale Transformer language model for French. Released alongside the paper CamemBERT: a Tasty French Language Model by Louis Martin, Benjamin Muller, Pedro Javier Ortiz Suarez, Yoann Dupont, Laurent Romary, Eric Villemonte de la Clergerie, Djame Seddah, and Benoît Sagot. It was added by @louismartin with the help of @julien-c.
    • DistilRoberta (PyTorch) from @VictorSanh, the third distilled model after DistilBERT and DistilGPT-2.