- The purpose of this project is to compare text summarization using the T5 and BART transformers.
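A minimal sketch of what such a comparison can look like with the Hugging Face pipeline API is shown below; the checkpoint names `t5-small` and `facebook/bart-large-cnn` are assumptions for illustration, not necessarily the ones used in the notebook.

```python
# Minimal sketch: summarize the same review with a T5 and a BART checkpoint.
# The model names below are assumptions; the notebook may use different checkpoints.
from transformers import pipeline

review = (
    "The hotel was clean and the staff were friendly, but the room was small "
    "and the street noise made it hard to sleep. Breakfast was excellent."
)

# Both pipelines download pretrained checkpoints on first use.
t5_summarizer = pipeline("summarization", model="t5-small")
bart_summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

print("T5:  ", t5_summarizer(review, max_length=40, min_length=5)[0]["summary_text"])
print("BART:", bart_summarizer(review, max_length=40, min_length=5)[0]["summary_text"])
```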
- The dataset for this project consists of 76,000+ hotel reviews, taken from here:
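For context, loading the reviews might look like the sketch below; the filename `hotel_reviews.csv` and the `Review` column name are hypothetical, so adapt them to the downloaded dataset.

```python
# Sketch of loading the review dataset; filename and column name are hypothetical.
import pandas as pd

df = pd.read_csv("hotel_reviews.csv")
reviews = df["Review"].dropna().tolist()
print(f"Loaded {len(reviews)} reviews")
```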
To run this project, you need Anaconda installed on your machine, along with PyTorch and Hugging Face Transformers.
Below is the link to the installation instructions for Hugging Face Transformers:
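Once the environment is set up, a quick sanity check like the one below (assuming a standard PyTorch and Transformers install) confirms that the imports work:

```python
# Quick sanity check that PyTorch and Transformers are importable after installation.
import torch
import transformers

print("PyTorch:", torch.__version__)
print("Transformers:", transformers.__version__)
print("CUDA available:", torch.cuda.is_available())
```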
You can run this project in a Jupyter notebook; I used Google Colaboratory.
Some of the paths in the code point to my Google Drive, so change them as required.
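If you run the notebook in Google Colaboratory, those Drive paths become available after mounting your Drive; the data path in the sketch below is hypothetical and should be changed to your own layout.

```python
# Mount Google Drive in Colab so that Drive paths used in the notebook resolve.
from google.colab import drive

drive.mount("/content/drive")
data_path = "/content/drive/MyDrive/hotel_reviews.csv"  # hypothetical path, change as required
```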
- ROUGE-L, used as the evaluation metric, is taken from this GitHub repository:
Thanks to Pengcheng YIN for this implementation of ROUGE-L.
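For illustration only, here is a generic ROUGE-L F-score sketch based on the longest common subsequence; it is not the implementation from the referenced repository.

```python
# Generic ROUGE-L F-score based on the longest common subsequence (LCS) of tokens.
# Illustration only; not the implementation from the referenced repository.
def rouge_l(candidate: str, reference: str, beta: float = 1.2) -> float:
    cand, ref = candidate.split(), reference.split()
    # Dynamic-programming table for the LCS length of the two token sequences.
    dp = [[0] * (len(ref) + 1) for _ in range(len(cand) + 1)]
    for i, c in enumerate(cand, 1):
        for j, r in enumerate(ref, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if c == r else max(dp[i - 1][j], dp[i][j - 1])
    lcs = dp[len(cand)][len(ref)]
    if lcs == 0:
        return 0.0
    precision = lcs / len(cand)
    recall = lcs / len(ref)
    # F-score weighted by beta, as in the standard ROUGE-L definition.
    return ((1 + beta**2) * precision * recall) / (recall + beta**2 * precision)

print(rouge_l("the room was clean and quiet", "the room was very clean"))
```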