Fix the way modules are added #16

Closed
hwijeen opened this issue Nov 24, 2023 · 0 comments
Labels
bug Something isn't working

Comments

@hwijeen
Collaborator

hwijeen commented Nov 24, 2023

Transformer models store their modules in a nested way, addressed through a mix of attribute names and list indices (e.g. model.model.bert.encoder.layer[0].attention.self.query).

Currently, the code erroneously adds the LoRA module with setattr(model, name, lora_module), as in the screenshot below, so the added module is never used in the forward pass.
[screenshot: the current setattr(model, name, lora_module) call]
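A minimal sketch of one possible fix (not necessarily the exact change made in this repo), assuming the target is identified by its dotted name as reported by model.named_modules(); replace_module and lora_module are illustrative names. Calling setattr(model, "bert.encoder.layer.0...", lora_module) only creates a new top-level attribute with a dotted name, while the original submodule stays wired into forward(); the attribute has to be set on the immediate parent module instead:

```python
import torch.nn as nn

def replace_module(model: nn.Module, name: str, new_module: nn.Module) -> None:
    """Replace the submodule at a dotted path, e.g.
    'bert.encoder.layer.0.attention.self.query'.

    Setting the attribute on the *parent* module replaces the module that
    forward() actually calls; numeric path components (ModuleList indices)
    are handled because nn.Module registers them by their string key.
    """
    parent_name, _, child_name = name.rpartition(".")
    parent = model.get_submodule(parent_name) if parent_name else model
    setattr(parent, child_name, new_module)

# Illustrative usage: wrap every Linear named '...query' with a (hypothetical) LoRA module.
# for name, module in list(model.named_modules()):
#     if name.endswith("query") and isinstance(module, nn.Linear):
#         replace_module(model, name, lora_module)
```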

@hwijeen added the enhancement (New feature or request) and bug (Something isn't working) labels Nov 24, 2023
@hwijeen changed the title from "Improve the way add modules are added" to "Fix the way add modules are added" Nov 24, 2023
@hwijeen changed the title from "Fix the way add modules are added" to "Fix the way modules are added" Nov 24, 2023
@hwijeen removed the enhancement (New feature or request) label Nov 24, 2023
@hwijeen closed this as completed Nov 24, 2023