It would be good to support encoder-decoder models. BART and T5 currently don't work:

```python
from datasets import load_dataset
from transformer_ranker import TransformerRanker

dataset = load_dataset('conll2003')
language_models = ["facebook/bart-base", "google-t5/t5-base"]

ranker = TransformerRanker(dataset, dataset_downsample=0.2)
results = ranker.run(language_models, batch_size=64)
```
Error message:
```
transformer_ranker:Models found in cache: ['facebook/bart-base', 'google-t5/t5-base']
Retrieving Embeddings:   0%|          | 0/65 [00:00<?, ?it/s]
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-5-8f99370783b7> in <cell line: 16>()
     14     dataset_downsample=0.2)
     15
---> 16 results = ranker.run(language_models, batch_size=64)

2 frames
/usr/local/lib/python3.10/dist-packages/transformer_ranker/embedder.py in embed_batch(self, sentences, move_embeddings_to_cpu)
    141         hidden_states = self.model(
    142             input_ids, attention_mask=attention_mask, output_hidden_states=True
--> 143         ).hidden_states
    144
    145         # Exclude the embedding layer (index 0)

AttributeError: 'Seq2SeqModelOutput' object has no attribute 'hidden_states'
```
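The cause: when `output_hidden_states=True`, encoder-only models (BERT-style) return an output object with a `.hidden_states` tuple, while encoder-decoder models like BART and T5 return a `Seq2SeqModelOutput`, which exposes `.encoder_hidden_states` and `.decoder_hidden_states` instead. A minimal sketch of a possible fix for `embed_batch` (the helper name `extract_hidden_states` and the stub objects are hypothetical, for illustration only):

```python
from types import SimpleNamespace

def extract_hidden_states(output):
    """Return the tuple of per-layer hidden states from a model output.

    Encoder-only models expose `.hidden_states`; encoder-decoder models
    (BART, T5) return a Seq2SeqModelOutput without that attribute, so we
    fall back to the encoder's hidden states.
    """
    hidden_states = getattr(output, "hidden_states", None)
    if hidden_states is None:
        hidden_states = output.encoder_hidden_states
    return hidden_states

# Minimal stand-ins for the two output shapes (not real transformers objects):
encoder_only = SimpleNamespace(hidden_states=("embeddings", "layer1"))
seq2seq = SimpleNamespace(encoder_hidden_states=("embeddings", "encoder_layer1"))

print(extract_hidden_states(encoder_only))  # ('embeddings', 'layer1')
print(extract_hidden_states(seq2seq))       # ('embeddings', 'encoder_layer1')
```

Using the encoder's hidden states seems reasonable for ranking, since the embedder only needs token representations, not generation; alternatively the embedder could call `model.get_encoder()` directly for encoder-decoder models.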