finetune : zero the loraB initial vectors #4082

Merged (3 commits) on Nov 17, 2023

Commits on Nov 15, 2023

  1. finetune : zero the loraB initial vectors

    Without this, the first iteration starts out far from the base model instead of exactly on it.
    Zeroing loraB is what the paper recommends. loralib also zeroes at least one of each pair of
    init vectors (though in some cases it departs from the paper by using a different distribution
    for the other vector). A sketch of this initialization appears after the commit list below.
    AndrewGodfrey committed Nov 15, 2023 (91eb335)
  2. tabs to spaces

    AndrewGodfrey committed Nov 15, 2023 (c72c1b3)
  3. 4571bcc
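
For context, here is a minimal sketch of the initialization scheme commit 1 describes. This is not
the actual llama.cpp finetune code; the struct name, function name, and the distribution and scale
used for loraA are illustrative assumptions. The point it demonstrates is that zeroing loraB makes
the initial delta (loraB x loraA) exactly zero, so training starts exactly on the base model.

```cpp
#include <cstddef>
#include <random>
#include <vector>

// Hypothetical container for one LoRA adapter pair (names are not from llama.cpp).
struct lora_pair {
    std::vector<float> A; // r x n, random init (nonzero so gradients can flow)
    std::vector<float> B; // m x r, zeroed, as the commit does for loraB
};

// Initialize a LoRA pair for an m x n weight matrix with rank r.
lora_pair init_lora(size_t m, size_t n, size_t r, std::mt19937 & rng) {
    lora_pair p;
    p.A.resize(r * n);
    p.B.assign(m * r, 0.0f); // B == 0  =>  B*A == 0  =>  first iteration sits on the base model

    // The exact distribution is an assumption; the paper uses a random Gaussian for A.
    std::normal_distribution<float> dist(0.0f, 0.02f);
    for (float & a : p.A) {
        a = dist(rng);
    }
    return p;
}
```

Zeroing only one side of the pair is enough: the update B*A stays zero until the optimizer moves B,
while the random A keeps the gradient with respect to B nonzero at the first step.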