
AttributeError: can't set attribute when using pre-trained extractive summariser #46

Closed
bmanczak opened this issue May 11, 2021 · 5 comments

Comments

@bmanczak

Hi,

Great work on the repo. I followed the Getting Started page and tried to run the mobilebert-uncased-ext-sum model. Here is the simple code snippet I used:

import os
import sys

sys.path.insert(0, os.path.abspath("./src"))
from extractive import ExtractiveSummarizer

model = ExtractiveSummarizer.load_from_checkpoint("mobilebert-uncased-ext-sum.ckpt")

text_to_summarize = "my long text."

model.predict(text_to_summarize)

However, I get the following traceback:

Traceback (most recent call last):
  File "/Users/blazejmanczak/Desktop/Projects/Artemos/ext_summarization/transformersum/testing_extractive.py", line 8, in <module>
    model = ExtractiveSummarizer.load_from_checkpoint("mobilebert-uncased-ext-sum.ckpt")
  File "/opt/miniconda3/envs/transformersum/lib/python3.9/site-packages/pytorch_lightning/core/saving.py", line 157, in load_from_checkpoint
    model = cls._load_model_state(checkpoint, strict=strict, **kwargs)
  File "/opt/miniconda3/envs/transformersum/lib/python3.9/site-packages/pytorch_lightning/core/saving.py", line 199, in _load_model_state
    model = cls(**_cls_kwargs)
  File "/Users/blazejmanczak/Desktop/Projects/Artemos/ext_summarization/transformersum/src/extractive.py", line 109, in __init__
    self.hparams = hparams
  File "/opt/miniconda3/envs/transformersum/lib/python3.9/site-packages/torch/nn/modules/module.py", line 995, in __setattr__
    object.__setattr__(self, name, value)
AttributeError: can't set attribute

Any tips on how to solve that?

@HHousen
Owner

HHousen commented May 11, 2021

I've narrowed the problem down to this commit in pytorch-lightning: Lightning-AI/pytorch-lightning#6207. Try again after pinning the previous version of pytorch-lightning with pip install -U pytorch_lightning==1.2.10. I'm working on updating the code to work with the latest version of pytorch-lightning.
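For readers hitting the same traceback, the failure mode can be reproduced in plain Python without pytorch-lightning at all. This is only an analogy, not the library's actual code: in newer pytorch-lightning releases, `hparams` is exposed as a read-only property, so the plain assignment `self.hparams = hparams` in `extractive.py` raises.

```python
# Minimal analogy (hypothetical class, not pytorch-lightning code):
# assigning to a property that defines no setter raises AttributeError,
# which is what `self.hparams = hparams` triggers against newer
# pytorch-lightning versions where `hparams` became read-only.
class Module:
    @property
    def hparams(self):
        # Read-only view; no corresponding setter is defined.
        return getattr(self, "_hparams", {})


m = Module()
try:
    m.hparams = {"lr": 1e-3}
except AttributeError as e:
    # On Python <= 3.10 the message is exactly "can't set attribute";
    # newer Pythons word it differently but still raise AttributeError.
    print(type(e).__name__)
```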

@HHousen
Owner

HHousen commented May 11, 2021

For reference, I found an open issue with this exact problem: Lightning-AI/pytorch-lightning#7443 (comment)

@HHousen
Owner

HHousen commented May 11, 2021

By the way, you're going to need to set strict=False when calling load_from_checkpoint with that model.

@HHousen
Owner

HHousen commented May 11, 2021

Solved in e36e033

@HHousen HHousen closed this as completed May 11, 2021
@bmanczak
Author

Thank you for the swift patch!

To make it work I had to make one small change: in src/extractive.py, change nlp.add_pipe(sentencizer) to nlp.add_pipe("sentencizer").
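For anyone else on spaCy v3: `add_pipe` now takes the registered component name as a string rather than a component instance. A minimal sketch of the new call (using a blank English pipeline as a stand-in, not the project's actual pipeline setup):

```python
import spacy

# spaCy v3: pass the registered component name, not an instance.
nlp = spacy.blank("en")
nlp.add_pipe("sentencizer")

# The sentencizer splits on punctuation, so doc.sents now works.
doc = nlp("First sentence. Second sentence.")
print([sent.text for sent in doc.sents])
```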
