
[Bugfix] Fix BNB loader target_modules #10720

Merged
5 commits merged into vllm-project:main from jeejeelee:fix-bnb-target-modules on Dec 5, 2024

Conversation

Collaborator

@jeejeelee jeejeelee commented Nov 27, 2024

Motivation

While addressing the following TODO, I found that target_modules was being misinterpreted: it specifies which layers should have LoRA adapters added, not which layers should be quantized.

# TODO: target_modules could be either a list or a regex string.

Reproduce Code

from transformers import AutoModelForCausalLM
from peft import get_peft_model, LoraConfig, TaskType

model_name_or_path = "Llama-2-7b-chat-hf"
peft_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    inference_mode=True,
    r=8,
    lora_alpha=32,
    target_modules=[ # just add some layers to verify the effect of target_modules
        "gate_proj",
        "up_proj",
    ],
    use_rslora=False,
    use_dora=False,
)
model = AutoModelForCausalLM.from_pretrained(
    model_name_or_path, load_in_4bit=True
).eval()
model = get_peft_model(model, peft_config)
model.print_trainable_parameters()
print(model)
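Running the script above shows the decoupling in question: load_in_4bit quantizes every linear layer, while LoRA adapters are attached only to the layers named in target_modules. A toy illustration of that relationship (the module names below are hypothetical stand-ins, not real model output):

```python
# Hypothetical linear-layer names in a Llama-style decoder block.
linear_layers = [
    "model.layers.0.self_attn.q_proj",
    "model.layers.0.mlp.gate_proj",
    "model.layers.0.mlp.up_proj",
    "model.layers.0.mlp.down_proj",
]
target_modules = ["gate_proj", "up_proj"]

# load_in_4bit quantizes all linear layers, regardless of target_modules.
quantized = set(linear_layers)

# Only layers matching target_modules receive LoRA adapters.
with_lora = {
    name for name in linear_layers
    if any(name.endswith("." + t) for t in target_modules)
}

# The LoRA targets are a strict subset of the quantized layers.
assert with_lora < quantized
```

This is exactly why a BNB loader must not read target_modules as "layers to quantize".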

Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>
@jeejeelee jeejeelee requested a review from mgoin November 27, 2024 16:54
👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, covering a small, essential subset of tests to catch errors quickly. You can run additional CI tests on top of those by going to your fastcheck build in the Buildkite UI (linked in the PR checks section) and unblocking them. If you do not have permission to unblock, ping simon-mo or khluu to add you to our Buildkite org.

Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.

To run CI, PR reviewers can do one of these:

  • Add ready label to the PR
  • Enable auto-merge.

🚀

@jeejeelee
Collaborator Author

jeejeelee commented Nov 30, 2024

@mgoin Sorry to bother you. If you have bandwidth, please take a look at this PR. Thanks!

@mgoin mgoin added the ready ONLY add when PR is ready to merge/full CI is needed label Dec 4, 2024
@mgoin
Member

mgoin commented Dec 4, 2024

Apologies for missing this; it looks reasonable to me if CI is green!

@jeejeelee jeejeelee requested a review from mgoin December 4, 2024 05:43
@jeejeelee jeejeelee merged commit 1f958a7 into vllm-project:main Dec 5, 2024
49 checks passed
@jeejeelee jeejeelee deleted the fix-bnb-target-modules branch December 5, 2024 05:20
sleepwalker2017 pushed a commit to sleepwalker2017/vllm that referenced this pull request Dec 13, 2024
Signed-off-by: Jee Jee Li <pandaleefree@gmail.com>