FIX Raise an error when performing mixed adapter inference and passing non-existing adapter names #2090

Conversation

BenjaminBossan
Member

PEFT allows mixed adapter batch inference, i.e. when predicting, different samples in the same batch can use different adapters by passing the adapter_names argument. However, when users pass an adapter name that does not correspond to any of the existing adapters, those samples are currently silently ignored (i.e. just the base model output is used for them). This is unexpected and can easily lead to errors, e.g. when users mistype the name of an adapter.
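
For illustration, mixed-batch inference looks roughly like the sketch below (the base model ID, adapter paths, and adapter names are placeholders, not part of this PR):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")

# Load two LoRA adapters (paths and names are placeholders).
model = PeftModel.from_pretrained(base, "path/to/adapter0", adapter_name="adapter0")
model.load_adapter("path/to/adapter1", adapter_name="adapter1")

inputs = tokenizer(["Hello", "Bonjour", "Hallo"], return_tensors="pt", padding=True)

# One adapter name per sample; "__base__" selects the plain base model.
adapter_names = ["adapter0", "adapter1", "__base__"]
outputs = model.generate(**inputs, adapter_names=adapter_names, max_new_tokens=20)

# Before this fix, a typo such as "adapter11" would silently produce base
# model output for that sample; with this PR, it raises an error instead.
```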

This PR fixes the issue by first collecting all existing adapter names and comparing them to the adapter_names passed by the user. If there are unexpected entries, an error is raised.
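
Conceptually, the check works along these lines (a minimal sketch, not the exact code added by the PR; the helper name and error message are illustrative):

```python
def _check_adapter_names(existing_adapter_names, adapter_names):
    # Names requested by the user, ignoring the special base-model marker.
    requested = {name for name in adapter_names if name != "__base__"}
    unexpected = requested - set(existing_adapter_names)
    if unexpected:
        raise ValueError(
            "Trying to infer with non-existing adapter(s): "
            + ", ".join(sorted(unexpected))
        )
```

The check runs before the batch is dispatched to the individual adapters, so a mistyped name fails fast instead of silently falling back to the base model.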

Due to this fix, an error in the test test_mixed_adapter_batches_lora_merged_raises was discovered and promptly fixed.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@BenjaminBossan
Member Author

Ping @SunMarc: could I get a review, please? It should hopefully not take long.

@SunMarc
Member

LGTM! Thanks for the ping, I completely missed this PR.

@BenjaminBossan
Member Author

No worries, that happens sometimes. Thanks a lot for the review.

@BenjaminBossan BenjaminBossan merged commit 8efa0cb into huggingface:main Oct 9, 2024
14 checks passed
@BenjaminBossan BenjaminBossan deleted the fix-raise-error-mixed-batch-inference-with-non-existing-adapters branch October 9, 2024 13:53
BenjaminBossan added a commit to BenjaminBossan/peft that referenced this pull request Oct 22, 2024