
[plugin] torch ddp plugin supports sharded model checkpoint#3775

Merged
ver217 merged 13 commits into hpcaitech:main from ver217:feature/ddp-sharded-ckpt
May 18, 2023

Conversation

@ver217
Contributor

@ver217 ver217 commented May 18, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

Closes #3773

📝 What does this PR do?


  1. The torch DDP plugin now supports saving sharded model checkpoints
  2. Updated the related unit tests accordingly
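For context, the core idea behind a sharded model checkpoint is to split the state dict into several size-capped files plus an index mapping each parameter key to its shard. The sketch below illustrates that packing scheme in plain Python; the function name `shard_state_dict`, the byte-size inputs, and the shard-file naming pattern are illustrative assumptions, not Colossal-AI's actual implementation.

```python
# Illustrative sketch (NOT Colossal-AI's API): greedily pack state-dict
# entries into shards of at most `size_per_shard` bytes, and build an
# index mapping each key to the shard file that holds it.

def shard_state_dict(state_dict, size_per_shard):
    """state_dict maps key -> entry size in bytes.

    Returns (shards, index), where shards is a list of dicts and index
    maps each key to a shard filename. A single entry larger than
    size_per_shard still gets its own shard.
    """
    shards, current, current_size = [], {}, 0
    for key, nbytes in state_dict.items():
        # Close the current shard if adding this entry would exceed the cap.
        if current and current_size + nbytes > size_per_shard:
            shards.append(current)
            current, current_size = {}, 0
        current[key] = nbytes
        current_size += nbytes
    if current:
        shards.append(current)

    # Index entries look like "weight" -> "model-00001-of-00002.bin".
    total = len(shards)
    index = {}
    for i, shard in enumerate(shards, start=1):
        fname = f"model-{i:05d}-of-{total:05d}.bin"
        for key in shard:
            index[key] = fname
    return shards, index
```

In a real plugin, each shard would be serialized to its own file and the index written alongside, so that loading can fetch only the shards it needs.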

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@ver217 ver217 added the API related to API changes label May 18, 2023
@ver217 ver217 requested a review from FrankLeeeee May 18, 2023 08:49
@ver217 ver217 self-assigned this May 18, 2023
@ver217 ver217 requested a review from flybird11111 May 18, 2023 08:49
@github-actions
Contributor

The code coverage for the changed files is 86%.

Click me to view the complete report
Name                                                           Stmts   Miss  Cover
----------------------------------------------------------------------------------
colossalai/booster/plugin/torch_ddp_plugin.py                     63      5    92%
colossalai/checkpoint_io/checkpoint_io_base.py                    66     11    83%
colossalai/checkpoint_io/utils.py                                158     34    78%
tests/test_booster/test_plugin/test_low_level_zero_plugin.py      60      6    90%
tests/test_checkpoint_io/test_torch_ddp_checkpoint_io.py          60      0   100%
----------------------------------------------------------------------------------
TOTAL                                                            407     56    86%

@ver217 ver217 merged commit 5452df6 into hpcaitech:main May 18, 2023
@ver217 ver217 deleted the feature/ddp-sharded-ckpt branch May 18, 2023 12:06
FrankLeeeee pushed a commit that referenced this pull request May 24, 2023
* [plugin] torch ddp plugin add save sharded model

* [test] fix torch ddp ckpt io test

* [test] fix torch ddp ckpt io test

* [test] fix low level zero plugin test

* [test] fix low level zero plugin test

* [test] add debug info

* [test] add debug info

* [test] add debug info

* [test] add debug info

* [test] add debug info

* [test] fix low level zero plugin test

* [test] fix low level zero plugin test

* [test] remove debug info

Labels

API related to API changes

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[plugin] torch ddp plugin supports sharded model checkpoint

3 participants