
Added use_liger flag to Trainer #32889

Closed

shimizust wants to merge 12 commits into huggingface:main from shimizust:sshimizu/liger-kernel

Conversation

@shimizust
Contributor

What does this PR do?

Fixes # (issue)

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a Github issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

Anyone in the community is free to review the PR once the tests have passed. Feel free to tag
members/contributors who may be interested in your PR.

Comment thread docs/source/en/trainer.md Outdated
Comment thread src/transformers/training_args.py Outdated
Comment thread src/transformers/training_args.py Outdated
@amyeroberts
Contributor

cc @muellerzr @SunMarc

Member

@SunMarc left a comment


Nice work @shimizust! Very excited to have this in the transformers Trainer API! Could you fix the CI tests by running `make style` and/or rebasing from `main`, and add a few simple tests to check that the layers were indeed replaced and that we are able to train without errors (see `TrainerIntegrationTest` in the `test_trainer.py` file)?

Comment thread docs/source/en/trainer.md Outdated

Currently, the kernel supports the Llama, Gemma, Mistral, and Mixtral model architectures. A list of supported models is defined [here](https://github.com/linkedin/Liger-Kernel/blob/main/src/liger_kernel/transformers/monkey_patch.py#L7). When `use_liger` is set to `True`, the corresponding layers in the original model are patched with Liger's efficient implementation, so you don't need to do anything beyond setting the argument value.
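The patching described above boils down to attribute replacement on the model's module. A minimal, self-contained sketch of that idea (the `SlowRMSNorm`/`FastRMSNorm` classes and `apply_patch` helper below are hypothetical stand-ins, not the actual Liger kernels):

```python
import types


class SlowRMSNorm:
    """Stand-in for an original model layer."""

    def forward(self, x):
        return sum(x) / len(x)  # placeholder math


class FastRMSNorm(SlowRMSNorm):
    """Drop-in replacement; imagine a fused GPU kernel here."""

    def forward(self, x):
        return sum(x) / len(x)  # same result, faster implementation in reality


# Pretend this is the module defining the model's layers.
model_module = types.SimpleNamespace(RMSNorm=SlowRMSNorm)


def apply_patch(module):
    # Monkey patch: any layer constructed after this call uses the fast class.
    module.RMSNorm = FastRMSNorm


apply_patch(model_module)
print(model_module.RMSNorm is FastRMSNorm)  # True
```

Because `FastRMSNorm` subclasses the original and keeps the same `forward` signature, code that builds layers through the module attribute picks up the replacement transparently.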
Member


Maybe it is better to link the README and list the supported models there? Right now you are linking to a specific line in the monkey_patch file, and I'm not sure whether that will change in the future.

Comment thread docs/source/en/trainer.md Outdated
Comment thread docs/source/en/trainer.md Outdated
Comment thread src/transformers/trainer.py Outdated
model_type = getattr(model, "config", None) and getattr(model.config, "model_type", None)
if model_type:
    # Monkey patch the model with liger kernels. Use the default kernel configurations.
    _apply_liger_kernel(model_type=model_type)
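A quick illustration of the `getattr`-chaining idiom in the snippet above: the `and` short-circuits, so a model without a `config` attribute (or a config without `model_type`) safely yields a falsy value instead of raising `AttributeError`. (The stand-in objects and the `detect_model_type` helper here are illustrative, not part of the PR.)

```python
import types

# Stand-in objects mimicking a model with and without a usable config.
model_with_type = types.SimpleNamespace(
    config=types.SimpleNamespace(model_type="llama")
)
model_without_config = types.SimpleNamespace()


def detect_model_type(model):
    # `and` short-circuits: if the first getattr returns None,
    # the second one is never evaluated.
    return getattr(model, "config", None) and getattr(model.config, "model_type", None)


print(detect_model_type(model_with_type))       # llama
print(detect_model_type(model_without_config))  # None
```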
Copy link
Copy Markdown
Member


Nice!

@SunMarc
Member

SunMarc commented Aug 20, 2024

Quick question @shimizust , I see that there is already another PR opened here. Which one should we focus on ?

shimizust and others added 6 commits August 20, 2024 09:35
Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
Co-authored-by: Byron Hsu <byronhsu1230@gmail.com>
Co-authored-by: Marc Sun <57196510+SunMarc@users.noreply.github.com>
Co-authored-by: Byron Hsu <byronhsu1230@gmail.com>
Co-authored-by: Byron Hsu <byronhsu1230@gmail.com>
@shimizust
Contributor Author

Quick question @shimizust , I see that there is already another PR opened here. Which one should we focus on ?

Hi @SunMarc, thanks for the review! I'm going to incorporate your suggestions into the original PR and close this one.

@shimizust
Contributor Author

Using this PR instead: #32860

5 participants