[booster] make optimizer argument optional for boost#3993

Merged
ver217 merged 2 commits into hpcaitech:develop from cwher:feat/inference_only_boost
Jun 15, 2023

Conversation

@cwher
Contributor

@cwher cwher commented Jun 15, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with keywords like `fixed` to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

Close #3992.

📝 What does this PR do?

This PR makes the `optimizer` argument of `Booster.boost` optional, so a model can be boosted for inference without constructing an optimizer. It also skips the unet autochunk test when the installed diffusers version is newer than 0.10.2.
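The API change named in the PR title can be sketched as follows. This is a minimal illustration, not the real `colossalai.booster.Booster` implementation: the class body, the `criterion` parameter, and the placeholder wrapping logic are assumptions; only the pattern (defaulting `optimizer` to `None` so inference-only boosting works) comes from the PR.

```python
from typing import Any, Callable, Optional, Tuple


class Booster:
    """Minimal sketch (NOT the real colossalai.booster.Booster) showing
    the change in this PR: `optimizer` defaults to None, so a model can
    be boosted for inference without any training components."""

    def boost(
        self,
        model: Any,
        optimizer: Optional[Any] = None,
        criterion: Optional[Callable] = None,
    ) -> Tuple[Any, Optional[Any], Optional[Callable]]:
        # A real plugin (DDP/FSDP/Gemini) would wrap the model here.
        boosted_model = model
        if optimizer is not None:
            # Optimizer-side setup runs only when an optimizer is passed;
            # skipping it is what enables inference-only boosting.
            pass
        return boosted_model, optimizer, criterion


# Inference-only usage: no optimizer required after this change.
booster = Booster()
model, optim, _ = booster.boost(model="my_model")
assert optim is None
```

Previously a caller had to supply an optimizer even when only running inference; with the optional argument, `booster.boost(model)` alone is a valid call.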

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@cwher cwher requested a review from ver217 June 15, 2023 04:04
@cwher cwher added the API related to API changes label Jun 15, 2023
@github-actions
Contributor

The code coverage for the changed files is 88%.

Click me to view the complete report
Name                                                                  Stmts   Miss  Cover
-----------------------------------------------------------------------------------------
colossalai/booster/booster.py                                            63     10    84%
colossalai/booster/mixed_precision/fp16_torch.py                         46      2    96%
colossalai/booster/mixed_precision/mixed_precision_base.py                9      1    89%
colossalai/booster/plugin/gemini_plugin.py                              109      9    92%
colossalai/booster/plugin/low_level_zero_plugin.py                       94      8    91%
colossalai/booster/plugin/plugin_base.py                                 40     10    75%
colossalai/booster/plugin/torch_ddp_plugin.py                            63      3    95%
colossalai/booster/plugin/torch_fsdp_plugin.py                           97     13    87%
tests/test_autochunk/test_autochunk_diffuser/test_autochunk_unet.py      36     10    72%
-----------------------------------------------------------------------------------------
TOTAL                                                                   557     66    88%

@cwher cwher changed the base branch from main to develop June 15, 2023 07:38
@cwher cwher force-pushed the feat/inference_only_boost branch from afc2420 to 6c0859c on June 15, 2023 07:54
@cwher cwher force-pushed the feat/inference_only_boost branch from 6c0859c to b737c5f on June 15, 2023 08:03
@github-actions
Contributor

The code coverage for the changed files is 86%.

Click me to view the complete report
Name                                                                  Stmts   Miss  Cover
-----------------------------------------------------------------------------------------
colossalai/booster/booster.py                                            63     10    84%
colossalai/booster/mixed_precision/fp16_torch.py                         46      2    96%
colossalai/booster/mixed_precision/mixed_precision_base.py                9      1    89%
colossalai/booster/plugin/gemini_plugin.py                              110      9    92%
colossalai/booster/plugin/low_level_zero_plugin.py                       94      8    91%
colossalai/booster/plugin/plugin_base.py                                 40     10    75%
colossalai/booster/plugin/torch_ddp_plugin.py                            66      5    92%
colossalai/booster/plugin/torch_fsdp_plugin.py                           97     13    87%
colossalai/checkpoint_io/checkpoint_io_base.py                           66      9    86%
colossalai/checkpoint_io/general_checkpoint_io.py                       103      8    92%
colossalai/checkpoint_io/index_file.py                                   70     17    76%
colossalai/checkpoint_io/utils.py                                       241     45    81%
colossalai/zero/gemini/gemini_ddp.py                                    402     70    83%
tests/test_autochunk/test_autochunk_diffuser/test_autochunk_unet.py      36     10    72%
tests/test_checkpoint_io/test_general_checkpoint_io.py                  108      0   100%
-----------------------------------------------------------------------------------------
TOTAL                                                                  1551    217    86%

@github-actions
Contributor

The code coverage for the changed files is 14%.

Click me to view the complete report
Name                                                                  Stmts   Miss  Cover
-----------------------------------------------------------------------------------------
colossalai/booster/booster.py                                            63     63     0%
colossalai/booster/mixed_precision/fp16_torch.py                         46     46     0%
colossalai/booster/mixed_precision/mixed_precision_base.py                9      9     0%
colossalai/booster/plugin/gemini_plugin.py                              110    110     0%
colossalai/booster/plugin/low_level_zero_plugin.py                       94     94     0%
colossalai/booster/plugin/plugin_base.py                                 40     40     0%
colossalai/booster/plugin/torch_ddp_plugin.py                            66     66     0%
colossalai/booster/plugin/torch_fsdp_plugin.py                           97     97     0%
colossalai/checkpoint_io/checkpoint_io_base.py                           66     33    50%
colossalai/checkpoint_io/general_checkpoint_io.py                       103     82    20%
colossalai/checkpoint_io/index_file.py                                   70     48    31%
colossalai/checkpoint_io/utils.py                                       241    200    17%
colossalai/zero/gemini/gemini_ddp.py                                    402    348    13%
tests/test_autochunk/test_autochunk_diffuser/test_autochunk_unet.py      36     10    72%
-----------------------------------------------------------------------------------------
TOTAL                                                                  1443   1246    14%

@ver217 ver217 merged commit 725af3e into hpcaitech:develop Jun 15, 2023
@cwher cwher deleted the feat/inference_only_boost branch June 27, 2023 05:41
ver217 pushed a commit to ver217/ColossalAI that referenced this pull request Jul 13, 2023
* feat: make optimizer optional in Booster.boost

* test: skip unet test if diffusers version > 0.10.2
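The second commit's version guard can be sketched like this. The real repository uses its own test setup; the `version_tuple` helper, the `unittest` style, and the test-class name here are illustrative assumptions — only the idea (skip the unet test when diffusers is newer than 0.10.2) comes from the commit message.

```python
import unittest


def version_tuple(v: str):
    """Naive version parse: '0.10.2' -> (0, 10, 2); pre-release tags ignored."""
    return tuple(int(p) for p in v.split(".") if p.isdigit())


try:
    import diffusers
    # Skip when the installed diffusers is newer than the last supported 0.10.2.
    NEW_DIFFUSERS = version_tuple(diffusers.__version__) > (0, 10, 2)
except ImportError:
    NEW_DIFFUSERS = True  # treat missing diffusers as "skip"


class TestAutochunkUnet(unittest.TestCase):
    @unittest.skipIf(NEW_DIFFUSERS, "diffusers > 0.10.2 is not supported by this test")
    def test_unet(self):
        ...  # the actual autochunk unet test body would go here
```

Guarding on the library version keeps CI green when an incompatible diffusers release lands, without deleting the test.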

Labels

API related to API changes

Development

Successfully merging this pull request may close these issues.

[booster] make optimizer argument optional for boost

2 participants