
fixing name position_embeddings to object_queries #24652

Merged

Conversation

Lorenzobattistela
Contributor

What does this PR do?

This PR refers to #19833, and it just updates some variable/docstring names. Quoting the issue, the paper mentions that the position_embeddings argument of the cross-attention layer corresponds to the input embeddings called object queries, and key_value_position_embeddings is referred to as spatial_position_embeddings.

Reopening PR #23091

This PR is limited to the DETR model.
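For context, here is a minimal sketch (illustrative only, not the actual transformers implementation; class and argument names beyond the renamed ones are made up) of how the renamed arguments flow through a DETR-style cross-attention layer:

```python
import torch
from torch import nn


class DetrAttentionSketch(nn.Module):
    """Toy layer illustrating the rename: what used to be passed as
    `position_embeddings` is now called `object_queries`, and
    `key_value_position_embeddings` becomes `spatial_position_embeddings`."""

    def __init__(self, embed_dim: int):
        super().__init__()
        self.q_proj = nn.Linear(embed_dim, embed_dim)
        self.k_proj = nn.Linear(embed_dim, embed_dim)

    @staticmethod
    def with_pos_embed(tensor, pos):
        # The embeddings are simply added to the hidden states before projection.
        return tensor if pos is None else tensor + pos

    def forward(self, hidden_states, object_queries=None,
                key_value_states=None, spatial_position_embeddings=None):
        queries = self.q_proj(self.with_pos_embed(hidden_states, object_queries))
        keys = self.k_proj(self.with_pos_embed(key_value_states, spatial_position_embeddings))
        return queries, keys
```

Only the names change; the tensors play exactly the same role as before.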

Notes

This is my first contribution, so I'm happy to adjust anything in this PR. I ran all tests and style checks, and everything passed except for one: make fixup. I got the following output:

[screenshot: make fixup output]

Reading the output, I assume it is about another file using classes from modeling_detr. I'll wait for updates, and for review guidance on the doc updates.

Fixes # (issue)

Before submitting

Who can review?

@NielsRogge
@amyeroberts

@amyeroberts
Collaborator

@Lorenzobattistela For the repo consistency and quality checks, you'll need to run make fix-copies and then make style, and push any changes made.

@Lorenzobattistela
Contributor Author

@amyeroberts Done, just updated with the changes for repo consistency and quality. I don't know why, but the testing pipelines and torch tests are failing within the installation step (though I did not change anything related to it), and test_workflow also failed, just for torch. I'll wait for next instructions. Thanks!

@amyeroberts
Collaborator

@Lorenzobattistela hmmmm, interesting. Could you try rebasing on main?

Some of the tests are failing because of the changes in this PR: https://app.circleci.com/pipelines/github/huggingface/transformers/67974/workflows/6a69bd9f-d35a-4964-868b-14fdd921d813/jobs/850696

Once these are resolved, ping me again and I can review :)

@Lorenzobattistela Lorenzobattistela force-pushed the position-embedding-detr branch 2 times, most recently from df63b45 to 7af5048 Compare July 13, 2023 18:52
@Lorenzobattistela
Contributor Author

@amyeroberts Sorry for bothering, but I'm having a hard time with the CircleCI testing. I'm having problems with repo consistency (as you mentioned before), but if I run make fix-copies it changes other model files (3 of them), and I think this would be escaping the issue's scope.

About the tests, I'm getting the following output:

FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_attention_outputs - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_determinism - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_detr_model - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_detr_no_timm_backbone - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_detr_object_detection_head_model - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_different_timm_backbone - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_feed_forward_chunking - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_greyscale_images - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_hidden_states_output - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_retain_grad_hidden_states_attentions - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_save_load - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
FAILED tests/models/detr/test_modeling_detr.py::DetrModelTest::test_training - RuntimeError: The size of tensor a (12) must match the size of tensor b (49) at non-singleton dimension 1
=== 12 failed, 1419 passed, 2461 skipped, 144 warnings in 163.96s (0:02:43) 

The funny thing is that I did not change anything related to tensor sizes, since it was just a naming change.
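The RuntimeError in the log above is PyTorch's standard broadcasting failure when two tensors whose sizes disagree at a non-singleton dimension are added (here, sequence lengths 12 vs. 49). A minimal reproduction, unrelated to the actual DETR code:

```python
import torch

hidden_states = torch.zeros(2, 12, 32)  # sequence length 12
embeddings = torch.zeros(2, 49, 32)     # sequence length 49, e.g. a flattened 7x7 feature map

try:
    hidden_states + embeddings
except RuntimeError as err:
    # Something like: "The size of tensor a (12) must match the size of
    # tensor b (49) at non-singleton dimension 1"
    print(err)
```

This matches the later commit message "[fix] wrong assignment causing wrong tensor shapes": during the rename, an embedding tensor from the wrong source ended up being added to the hidden states.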

@amyeroberts
Collaborator

@Lorenzobattistela No worries, you're not bothering at all :)

if I run make fix-copies it changes other model files (3 of them), and I think this would be escaping the issue's scope.

It's OK, we do want the changes made by make fix-copies included in this PR. make fix-copies makes sure that changes to the code are propagated to all parts of the codebase where the logic has been copied, without the tedium or risk of doing it manually. This allows us to keep the one-file-per-model pattern in the library.

The funny thing is that I did not change anything related to tensor sizes, since it was just a naming change.

Hmmm, funny. It might be that there's a variable somewhere still needing its name changed, or it could be how the model's being called in the tests. I'd suggest picking just one test and running it with the debugger to find where the issue is coming from, i.e.

pytest tests/models/detr/test_modeling_detr.py::DetrModelTest::test_attention_outputs --pdb

and comparing the tensor shapes with and without the changes in this PR to track where they're coming from.

@Lorenzobattistela
Contributor Author

@amyeroberts Got it working! It was a problem with make fix-copies: some other files had to change to keep consistency and pass the tests. Now it's all set!


@amyeroberts amyeroberts left a comment


Thanks for your work adding this!

Main comment is about making sure the public methods are backwards compatible. Other than that the PR is looking v. nice and clean 🤗

Review comment on src/transformers/models/detr/modeling_detr.py (resolved)
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@Lorenzobattistela
Contributor Author

Lorenzobattistela commented Jul 25, 2023

@amyeroberts finished doing what was discussed. I think we could also refactor this into a function, something like check_kwargs(), since the logic was mostly duplicated across all the files. What do you think?

Weird, the error on CI has nothing to do with the files changed; it's on another model.
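The suggested helper could look like the following sketch. The name check_kwargs and its signature are hypothetical, following the contributor's suggestion above; the PR itself inlines equivalent checks in each model file. The idea is to accept the deprecated keyword names for backwards compatibility, warn the caller, and map them onto the new arguments:

```python
import warnings


def check_kwargs(kwargs, object_queries=None, spatial_position_embeddings=None):
    """Hypothetical helper mapping deprecated DETR keyword arguments onto
    their new names, so old call sites keep working with a warning."""
    if "position_embeddings" in kwargs:
        if object_queries is not None:
            raise ValueError(
                "Cannot specify both position_embeddings and object_queries"
            )
        warnings.warn(
            "position_embeddings has been deprecated, use object_queries instead",
            FutureWarning,
        )
        object_queries = kwargs.pop("position_embeddings")

    if "key_value_position_embeddings" in kwargs:
        if spatial_position_embeddings is not None:
            raise ValueError(
                "Cannot specify both key_value_position_embeddings and "
                "spatial_position_embeddings"
            )
        warnings.warn(
            "key_value_position_embeddings has been deprecated, use "
            "spatial_position_embeddings instead",
            FutureWarning,
        )
        spatial_position_embeddings = kwargs.pop("key_value_position_embeddings")

    return object_queries, spatial_position_embeddings
```

A shared helper like this would remove the duplication, at the cost of every model file depending on one more shared utility.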


@amyeroberts amyeroberts left a comment


Thanks for all your work on this and iterating. Looks great!

@huggingface huggingface deleted a comment from github-actions bot Aug 28, 2023
@amyeroberts amyeroberts merged commit 99c3d44 into huggingface:main Aug 29, 2023
3 checks passed
@ydshieh ydshieh mentioned this pull request Sep 5, 2023
parambharat pushed a commit to parambharat/transformers that referenced this pull request Sep 26, 2023
* fixing name position_embeddings to object_queries

* [fix] renaming variable and docstring do object queries

* [fix] comment position_embedding to object queries

* [feat] changes from make-fix-copies to keep consistency

* Revert "[feat] changes from make-fix-copies to keep consistency"

This reverts commit 56e3e9e.

* [tests] fix wrong expected score

* [fix] wrong assignment causing wrong tensor shapes

* [fix] fixing position_embeddings to object queries to keep consistency (make fix copies)

* [fix] make fix copies, renaming position_embeddings to object_queries

* [fix] positional_embeddingss to object queries, fixes from make fix copies

* [fix] comments frmo make fix copies

* [fix] adding args validation to keep version support

* [fix] adding args validation to keep version support -conditional detr

* [fix] adding args validation to keep version support - maskformer

* [style] make fixup style fixes

* [feat] adding args checking

* [feat] fixcopies and args checking

* make fixup

* make fixup

---------

Co-authored-by: Lorenzobattistela <lorenzobattistela@gmail.com>
blbadger pushed a commit to blbadger/transformers that referenced this pull request Nov 8, 2023
EduardoPach pushed a commit to EduardoPach/transformers that referenced this pull request Nov 18, 2023