
Format fixes, add Attanasio et al. (2023) to readme
gsarti committed Oct 19, 2023
1 parent d53be07 commit a1bb881
Showing 3 changed files with 9 additions and 9 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -268,6 +268,7 @@ Inseq has been used in various research projects. A list of known publications t
 <li> <a href="https://arxiv.org/abs/2302.14220">Are Character-level Translations Worth the Wait? Comparing Character- and Subword-level Models for Machine Translation</a> (Edman et al., 2023) </li>
 <li> <a href="https://aclanthology.org/2023.nlp4convai-1.1/">Response Generation in Longitudinal Dialogues: Which Knowledge Representation Helps?</a> (Mousavi et al., 2023) </li>
 <li> <a href="https://arxiv.org/abs/2310.01188">Quantifying the Plausibility of Context Reliance in Neural Machine Translation</a> (Sarti et al., 2023)</li>
+<li> <a href="https://arxiv.org/abs/2310.12127">A Tale of Pronouns: Interpretability Informs Gender Bias Mitigation for Fairer Instruction-Tuned Machine Translation</a> (Attanasio et al., 2023)</li>
 </ol>

 </details>
9 changes: 4 additions & 5 deletions inseq/models/attribution_model.py
@@ -404,11 +404,10 @@ def attribute(
             generated_texts = self.generate(
                 encoded_input, return_generation_output=False, batch_size=batch_size, **generation_args
             )
-        else:
-            if generation_args:
-                logger.warning(
-                    f"Generation arguments {generation_args} are provided, but will be ignored (constrained decoding)."
-                )
+        elif generation_args:
+            logger.warning(
+                f"Generation arguments {generation_args} are provided, but will be ignored (constrained decoding)."
+            )
         logger.debug(f"reference_texts={generated_texts}")
         attribution_method = self.get_attribution_method(method, override_default_attribution)
         attributed_fn = self.get_attributed_fn(attributed_fn)
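The change above collapses a nested `if` inside an `else` branch into a single `elif`, which preserves behavior while removing one indentation level. A minimal standalone sketch of the resulting control flow (the function name, the stubbed generation step, and the return values are illustrative, not Inseq's actual API):

```python
import logging

logger = logging.getLogger(__name__)


def resolve_reference_texts(reference_texts, generation_args):
    """Illustrative stand-in for the branch logic in AttributionModel.attribute."""
    if reference_texts is None:
        # No references given: texts would be generated by the model
        # (stubbed here with a placeholder).
        generated_texts = ["<generated>"]
    elif generation_args:
        # References are provided, so decoding is constrained and any
        # generation settings are irrelevant: warn and ignore them.
        logger.warning(
            f"Generation arguments {generation_args} are provided, "
            "but will be ignored (constrained decoding)."
        )
        generated_texts = reference_texts
    else:
        generated_texts = reference_texts
    return generated_texts
```

The `elif` form makes explicit that the warning only applies when references are supplied alongside generation settings, which is exactly the case the original nested `if` guarded.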
8 changes: 4 additions & 4 deletions inseq/utils/alignment_utils.py
@@ -74,10 +74,10 @@ def _get_aligner_subword_aligns(
 ) -> torch.Tensor:
     aligner = get_aligner_model()
     tokenizer = get_aligner_tokenizer()
-    tok_aenized = [tokenizer.tokenize(word) for word in src]
-    tok_benized = [tokenizer.tokenize(word) for word in tgt]
-    ids_src, sub2word_map_src = _preprocess_sequence_for_alignment(tok_aenized)
-    ids_tgt, sub2word_map_tgt = _preprocess_sequence_for_alignment(tok_benized)
+    tokenized_src = [tokenizer.tokenize(word) for word in src]
+    tokenized_tgt = [tokenizer.tokenize(word) for word in tgt]
+    ids_src, sub2word_map_src = _preprocess_sequence_for_alignment(tokenized_src)
+    ids_tgt, sub2word_map_tgt = _preprocess_sequence_for_alignment(tokenized_tgt)
     aligner.eval()
     with torch.no_grad():
         out_src = aligner(ids_src.unsqueeze(0), output_hidden_states=True)[2][align_layer][0, 1:-1]
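For context, the renamed variables hold per-word subword tokenizations that a subword-to-word map is later built from. A minimal sketch of that general pattern with a toy tokenizer (the helper name and the toy tokenizer are illustrative, not Inseq's `_preprocess_sequence_for_alignment`):

```python
def tokenize_per_word(words, tokenize):
    """Tokenize each word independently and record which word each subword came from."""
    subwords, sub2word = [], []
    for word_idx, word in enumerate(words):
        pieces = tokenize(word)
        subwords.extend(pieces)
        sub2word.extend([word_idx] * len(pieces))
    return subwords, sub2word


# Toy tokenizer that splits a word into 2-character pieces.
toy = lambda w: [w[i:i + 2] for i in range(0, len(w), 2)]
subs, s2w = tokenize_per_word(["hello", "world"], toy)
# subs groups all subword pieces; s2w maps each piece back to its word index.
```

The subword-to-word map is what lets aligner output over subword positions be aggregated back to word-level alignments.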
