
[BUG] pad_sequence with return_mask=True depends on padding_value (incorrect behavior for torch.nan) #1183

Closed
rwicl opened this issue Jan 15, 2025 · 2 comments

rwicl commented Jan 15, 2025

Describe the bug

pad_sequence, when called with padding_value=torch.nan and return_mask=True, produces masks in the output tensordict that contain only True values, even though the padded tensordict contains invalid (padded) entries.

To Reproduce

import torch
from tensordict import pad_sequence, TensorDict

# Two tensordicts whose 'a' entries have different lengths (3 vs. 2)
tensordict1 = TensorDict({'a': torch.tensor([1., 2., 3.])})
tensordict2 = TensorDict({'a': torch.tensor([4., 5.])})

# Pad to a common length, once with NaN and once with 0. as the padding value
batch_nan = pad_sequence([tensordict1, tensordict2], padding_value=torch.nan, return_mask=True)
batch_0 = pad_sequence([tensordict1, tensordict2], padding_value=0., return_mask=True)

print(batch_nan['masks']['a'].all())
print(batch_0['masks']['a'].all())

prints tensor(True) and tensor(False).
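
A likely explanation (an assumption about the implementation, not confirmed here): if the mask is derived by comparing the padded values against padding_value, a NaN padding value defeats that comparison, because NaN compares unequal to everything, including itself:

import torch

padded_row = torch.tensor([4., 5., float('nan')])  # a row padded with NaN
print(padded_row != float('nan'))                   # tensor([True, True, True]): the NaN entry is never flagged as padding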

Expected behavior

I expect tensor(False) to be printed in both cases.
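
Until the fix lands, a minimal workaround sketch (plain torch broadcasting, not part of the tensordict API) is to build the mask from the original sequence lengths instead of relying on the masks returned by pad_sequence:

import torch
from tensordict import pad_sequence, TensorDict

tds = [
    TensorDict({'a': torch.tensor([1., 2., 3.])}),
    TensorDict({'a': torch.tensor([4., 5.])}),
]
batch = pad_sequence(tds, padding_value=torch.nan, return_mask=True)

# Build the mask manually: mask[i, j] is True for real entries, False for padding
lengths = torch.tensor([td['a'].shape[0] for td in tds])
mask = torch.arange(int(lengths.max()))[None, :] < lengths[:, None]
print(mask)        # tensor([[ True,  True,  True],
                   #         [ True,  True, False]])
print(mask.all())  # tensor(False), as expected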

rwicl added the bug label on Jan 15, 2025

rwicl commented Jan 15, 2025

Maybe this behavior actually makes sense? One could argue that all entries are valid, in the sense that the NaNs themselves directly reveal which entries are padding.
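
For illustration (reusing batch_nan from the snippet above, and assuming the real data contains no NaNs), the mask can indeed be recovered from the values alone:

recovered_mask = ~torch.isnan(batch_nan['a'])
print(recovered_mask)
# tensor([[ True,  True,  True],
#         [ True,  True, False]])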


vmoens commented Jan 15, 2025

You're absolutely correct: this is a bug, and it should be fixed by #1185.

vmoens closed this as completed on Jan 15, 2025