Transformer models store modules in a peculiar way, using a mix of attribute access and list indexing (e.g. `model.model.bert.encoder.layer[0].attention.self.query`). Currently, the code erroneously attaches the LoRA module with `setattr(model, name, lora_module)`, where `name` is the full dotted path. The module ends up registered under that dotted name on the top-level model, so it is never used in the `forward` function.
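A minimal sketch of one possible fix, assuming PyTorch's `nn.Module.get_submodule` (available since PyTorch 1.9); `replace_module` and the commented usage are hypothetical names for illustration, not from the original report. The idea is to resolve the *parent* of the target submodule and call `setattr` on it, so the replacement actually participates in the forward pass:

```python
import torch.nn as nn


def replace_module(model: nn.Module, name: str, new_module: nn.Module) -> None:
    """Swap the submodule at a dotted path such as
    'model.bert.encoder.layer.0.attention.self.query' (the form yielded
    by model.named_modules(), where list indexing like layer[0] shows
    up as 'layer.0').

    Plain setattr(model, name, new_module) with the full dotted name
    silently registers a module under that literal dotted key on the
    top-level module; the original submodule stays in place and keeps
    being called in forward(). Setting the attribute on the immediate
    parent instead makes the forward pass use the replacement.
    """
    parent_name, _, child_name = name.rpartition(".")
    # get_submodule walks the dotted path, including numeric indices
    # into nn.ModuleList / nn.Sequential containers.
    parent = model.get_submodule(parent_name) if parent_name else model
    setattr(parent, child_name, new_module)


# Hypothetical usage: inject a LoRA module in place of the query
# projection of the first BERT layer.
# replace_module(model, "model.bert.encoder.layer.0.attention.self.query", lora_module)
```

On older PyTorch versions without `get_submodule`, the parent can be resolved with `functools.reduce(getattr, parent_name.split("."), model)`, since `nn.ModuleList` exposes its children as string-keyed attributes (`"0"`, `"1"`, ...).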