[bf16] implement mixed precision mixin and add bf16 support for low level zero #3869

Merged
FrankLeeeee merged 4 commits into hpcaitech:feature/bf16 from ver217:feature/mixed-precision
May 31, 2023

Conversation

@ver217 ver217 (Contributor) commented May 30, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number


Closes #3863

📝 What does this PR do?


  1. Implement the mixed precision mixin class described in [FEATURE]: mixed precision mixin #3863; FP16 and BF16 are now supported (see the sketch after this list).
  2. Add bf16 support for the low level zero optimizer.
  3. Update related tests.
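
To make the design concrete, here is a minimal sketch of how such a mixin could separate the fp16 and bf16 code paths: the optimizer talks only to the mixin interface, the bf16 variant is essentially a no-op (no loss scaling, no overflow-driven step skipping), and the fp16 variant carries the scaler. The class and method names below (`MixedPrecisionMixin`, `pre_backward`, `should_skip_step`) are illustrative assumptions, not the exact API in `colossalai/amp/naive_amp/mixed_precision_mixin`.

```python
# Hedged sketch of a mixed precision mixin, loosely modeled on the design
# discussed in issue #3863. All names here are illustrative, not the
# verbatim Colossal-AI API.
from abc import ABC, abstractmethod

import torch
from torch import Tensor


class MixedPrecisionMixin(ABC):
    """Hooks an optimizer calls around backward/step when its working
    parameters are kept in a reduced-precision dtype."""

    dtype: torch.dtype  # dtype of the working (forward/backward) copy

    @abstractmethod
    def pre_backward(self, loss: Tensor) -> Tensor:
        """Transform the loss before backward, e.g. apply loss scaling."""

    @abstractmethod
    def should_skip_step(self) -> bool:
        """Whether to skip this optimizer step, e.g. on fp16 overflow."""


class BF16MixedPrecisionMixin(MixedPrecisionMixin):
    # bf16 shares fp32's exponent range, so no loss scaling is needed and
    # overflow-driven step skipping never triggers.
    dtype = torch.bfloat16

    def pre_backward(self, loss: Tensor) -> Tensor:
        return loss

    def should_skip_step(self) -> bool:
        return False


class FP16MixedPrecisionMixin(MixedPrecisionMixin):
    # fp16 needs loss scaling; a static scale stands in here for the
    # dynamic scaler a real implementation would use.
    dtype = torch.float16

    def __init__(self, initial_scale: float = 2.0**16) -> None:
        self.loss_scale = initial_scale

    def pre_backward(self, loss: Tensor) -> Tensor:
        return loss * self.loss_scale

    def should_skip_step(self) -> bool:
        # A full version would scan gradients for inf/nan after backward
        # and shrink the scale on overflow; elided for brevity.
        return False
```

With this split, an optimizer such as the low level zero optimizer can hold a single mixin instance, calling `pre_backward` before `loss.backward()` and `should_skip_step` before stepping, so adding bf16 support reduces to plugging in the bf16 mixin.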

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests.
  • I have added docstrings for all the functions/methods I implemented

@ver217 ver217 added the enhancement (New feature or request) label May 30, 2023
@ver217 ver217 requested a review from FrankLeeeee May 30, 2023 10:07
@github-actions github-actions (Contributor)

The code coverage for the changed files is 95%.

Complete report:
Name                                                         Stmts   Miss  Cover
--------------------------------------------------------------------------------
colossalai/amp/naive_amp/mixed_precision_mixin/__init__.py       4      0   100%
colossalai/amp/naive_amp/mixed_precision_mixin/base.py          20      5    75%
colossalai/amp/naive_amp/mixed_precision_mixin/bf16.py          15      1    93%
colossalai/amp/naive_amp/mixed_precision_mixin/fp16.py          49      3    94%
colossalai/nn/optimizer/cpu_adam.py                             68      3    96%
colossalai/nn/optimizer/fused_adam.py                           55     10    82%
colossalai/nn/optimizer/hybrid_adam.py                          63      3    95%
colossalai/zero/low_level/low_level_optim.py                   314     20    94%
tests/test_optimizer/test_adam_kernel.py                        87      1    99%
tests/test_optimizer/test_adam_optim.py                         49      0   100%
tests/test_zero/test_low_level/test_grad_acc.py                 88      1    99%
tests/test_zero/test_low_level/test_zero1_2.py                  90      1    99%
--------------------------------------------------------------------------------
TOTAL                                                          902     48    95%

@FrankLeeeee FrankLeeeee merged commit d175d92 into hpcaitech:feature/bf16 May 31, 2023
@ver217 ver217 deleted the feature/mixed-precision branch May 31, 2023 05:35
FrankLeeeee pushed a commit that referenced this pull request Jun 5, 2023
* [bf16] add bf16 support for fused adam (#3844)

* [bf16] fused adam kernel support bf16

* [test] update fused adam kernel test

* [test] update fused adam test

* [bf16] cpu adam and hybrid adam optimizers support bf16 (#3860)

* [bf16] implement mixed precision mixin and add bf16 support for low level zero (#3869)

* [bf16] add mixed precision mixin

* [bf16] low level zero optim support bf16

* [test] update low level zero test

* [test] fix low level zero grad acc test

* [bf16] add bf16 support for gemini (#3872)

* [bf16] gemini support bf16

* [test] update gemini bf16 test

* [doc] update gemini docstring

* [bf16] add bf16 support for plugins (#3877)

* [bf16] add bf16 support for legacy zero (#3879)

* [zero] init context support bf16

* [zero] legacy zero support bf16

* [test] add zero bf16 test

* [doc] add bf16 related docstring for legacy zero
FrankLeeeee pushed a commit to FrankLeeeee/ColossalAI that referenced this pull request Jun 7, 2023