
Added use_liger flag to Trainer #32889

Closed · wants to merge 12 commits

Conversation

shimizust (Contributor)

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

docs/source/en/trainer.md (outdated review comment, resolved)
src/transformers/training_args.py (2 outdated review comments, resolved)
@amyeroberts (Collaborator)

cc @muellerzr @SunMarc

@SunMarc (Member) left a comment


Nice work @shimizust! Very excited to have this in the transformers Trainer API! Could you fix the CI by running `make style` and/or rebasing from main, and add a few simple tests to check that the layers were indeed replaced and that we are able to train without errors (see the TrainerIntegrationTest class in the test_trainer.py file)?
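For reference, the kind of integration test being requested might look roughly like the sketch below. This is illustrative only: the test name, tiny checkpoint id, and toy dataset are placeholders, and it assumes the patch is applied when the `Trainer` is constructed and that `LigerRMSNorm` is importable from the liger-kernel package.

```python
from datasets import Dataset
from transformers import AutoModelForCausalLM, Trainer, TrainingArguments


def test_use_liger_patches_and_trains(tmp_path):
    # Placeholder tiny checkpoint; any small supported architecture would do.
    model = AutoModelForCausalLM.from_pretrained("hf-internal-testing/tiny-random-LlamaForCausalLM")

    # A toy dataset of a few short sequences is enough to exercise a couple of steps.
    dataset = Dataset.from_dict(
        {
            "input_ids": [[1, 2, 3, 4]] * 8,
            "attention_mask": [[1, 1, 1, 1]] * 8,
            "labels": [[1, 2, 3, 4]] * 8,
        }
    )

    args = TrainingArguments(
        output_dir=str(tmp_path),
        use_liger=True,  # the flag added by this PR
        max_steps=2,
        report_to="none",
    )
    trainer = Trainer(model=model, args=args, train_dataset=dataset)

    # Check that a Liger module replaced the stock implementation
    # (assuming the patch is applied when the Trainer is constructed).
    from liger_kernel.transformers.rms_norm import LigerRMSNorm

    assert isinstance(trainer.model.model.norm, LigerRMSNorm)

    # And that a short training run completes without errors.
    trainer.train()
```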

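The documentation snippet under review was truncated in this excerpt; a minimal sketch of the usage it illustrates might look like the following. The checkpoint name and `train_dataset` are placeholders, and `use_liger` is the flag this PR adds to `TrainingArguments`.

```python
from transformers import AutoModelForCausalLM, Trainer, TrainingArguments

# Any supported architecture works; a Llama checkpoint is used here as an example.
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

# use_liger=True asks the Trainer to monkey patch the model's layers with
# Liger's fused kernels before training; no other code change is needed.
args = TrainingArguments(output_dir="liger-output", use_liger=True)

# train_dataset is a placeholder for your tokenized training dataset.
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```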
Currently, the kernel supports the Llama, Gemma, Mistral, and Mixtral model architectures. A list of supported models is defined [here](https://github.com/linkedin/Liger-Kernel/blob/main/src/liger_kernel/transformers/monkey_patch.py#L7). When `use_liger` is set to `True`, the corresponding layers in the original model will be patched with Liger's efficient implementation, so you don't need to do anything extra other than setting the argument value.
Member


Maybe it is better to link the README and list the supported models there? Right now you are linking to a specific line in the monkey_patch file, and I'm not sure whether that will change in the future.

docs/source/en/trainer.md (2 outdated review comments, resolved)
src/transformers/trainer.py (outdated review comment, resolved)
```python
model_type = getattr(model, "config", None) and getattr(model.config, "model_type", None)
if model_type:
    # Monkey patch the model with liger kernels. Use the default kernel configurations.
    _apply_liger_kernel(model_type=model_type)
```
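For readers unfamiliar with the helper, `_apply_liger_kernel` dispatches on `model_type` to the per-architecture patch functions exposed by the liger-kernel package. A rough sketch of that dispatch idea, illustrative only and not the PR's actual implementation:

```python
from liger_kernel.transformers import (
    apply_liger_kernel_to_gemma,
    apply_liger_kernel_to_llama,
    apply_liger_kernel_to_mistral,
    apply_liger_kernel_to_mixtral,
)

# Map each supported model_type string to the Liger patch function for that
# architecture. (Illustrative; the real mapping lives in the integration code.)
_LIGER_PATCHERS = {
    "gemma": apply_liger_kernel_to_gemma,
    "llama": apply_liger_kernel_to_llama,
    "mistral": apply_liger_kernel_to_mistral,
    "mixtral": apply_liger_kernel_to_mixtral,
}


def _apply_liger_kernel(model_type: str, **kwargs) -> None:
    """Patch the given architecture with Liger's fused kernels, if supported."""
    patcher = _LIGER_PATCHERS.get(model_type)
    if patcher is None:
        raise ValueError(f"Liger kernels do not support model_type={model_type!r}")
    # Each patch function monkey patches the corresponding modeling classes
    # (e.g. RMSNorm, RoPE, SwiGLU, cross entropy) in place.
    patcher(**kwargs)
```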
Member

Nice!

@SunMarc (Member) commented on Aug 20, 2024

Quick question @shimizust: I see that there is already another PR opened here. Which one should we focus on?

shimizust and others added 6 commits August 20, 2024 09:35
Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
Co-authored-by: Byron Hsu <byronhsu1230@gmail.com>
Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
Co-authored-by: Byron Hsu <byronhsu1230@gmail.com>
Co-authored-by: Byron Hsu <byronhsu1230@gmail.com>
@shimizust (Contributor, Author)

> Quick question @shimizust: I see that there is already another PR opened here. Which one should we focus on?

Hi @SunMarc, thanks for the review! I'm going to incorporate your suggestions, merge them into the original PR, and close this one.

@shimizust (Contributor, Author)

Using this PR instead: #32860
