
fz#62

Merged

jamesthesnake merged 45 commits into jamesthesnake:l from hpcaitech:main on Jun 13, 2023
Conversation

@jamesthesnake
Owner

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

📝 What does this PR do?

Summarize your work here.
If you have any plots/diagrams/screenshots/tables, please attach them here.

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

ver217 and others added 30 commits May 25, 2023 13:09
* [dtensor] polish sharding spec docstring

* [dtensor] polish sharding spec example docstring
[sync] sync feature/dtensor with develop
* [doc] add lazy init en doc

* [doc] add lazy init zh doc

* [doc] add lazy init doc in sidebar

* [doc] add lazy init doc test

* [doc] fix lazy init doc link
[feature] updated device mesh and dtensor

* init shardformer code structure

* add implement of sharder (inject and replace)

* add implement of replace layer to colossal layer

* separate different layer policy, add some notion

* implement 1d and 2d slicer, can tell col or row

* fix bug when slicing and inject model

* fix some bug; add inference test example

* add share weight and train example

* add train

* add docstring and readme

* add docstring for other files

* pre-commit
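The commits above mention a 1-D/2-D slicer that "can tell col or row", i.e. whether a layer's weight is sharded row-wise or column-wise across tensor-parallel ranks. As a rough illustration only (a minimal NumPy sketch with hypothetical names, not the PR's actual implementation):

```python
import numpy as np

def slice_1d(weight, rank, world_size, dim):
    # Split a 2-D weight evenly along `dim` and return this rank's shard.
    # dim=0 slices rows; dim=1 slices columns. A real sharder would also
    # handle uneven divisions and higher-dimensional device meshes.
    return np.split(weight, world_size, axis=dim)[rank]

weight = np.arange(16).reshape(4, 4)
row_shard = slice_1d(weight, rank=0, world_size=2, dim=0)  # shape (2, 4)
col_shard = slice_1d(weight, rank=1, world_size=2, dim=1)  # shape (4, 2)
```

The choice of axis is what distinguishes a column-parallel from a row-parallel linear layer: one keeps the full input dimension per rank, the other the full output dimension.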
FrankLeeeee and others added 15 commits June 8, 2023 15:01
* [shardformer] refactored the user api

* polish code
* update readme with modules content

* remove img

* add dropout layer, add dropout test

* modify seed manager as context manager

* add a copy of col_nn.layer

* add dist_crossentropy loss; separate module test

* polish the code

* fix dist crossentropy loss
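One commit above reworks the seed manager into a context manager, so dropout RNG state can be swapped in and out per parallel mode without disturbing the caller's RNG. A minimal stdlib sketch of that pattern (the names `SeedManager`, `add_seed`, and `seed` are illustrative assumptions, not the PR's API):

```python
import contextlib
import random

class SeedManager:
    """Keep a separate RNG state per parallel mode so dropout masks
    can be made identical (or deliberately different) across ranks."""
    def __init__(self):
        self._states = {}

    def add_seed(self, mode, seed):
        random.seed(seed)
        self._states[mode] = random.getstate()

    @contextlib.contextmanager
    def seed(self, mode):
        # Swap in the tracked state, run the block, then save the
        # advanced state and restore the caller's RNG untouched.
        outer = random.getstate()
        random.setstate(self._states[mode])
        try:
            yield
        finally:
            self._states[mode] = random.getstate()
            random.setstate(outer)

mgr = SeedManager()
mgr.add_seed("tensor_parallel", 1234)
with mgr.seed("tensor_parallel"):
    a = random.random()

# A second manager seeded identically reproduces the same draw.
other = SeedManager()
other.add_seed("tensor_parallel", 1234)
with other.seed("tensor_parallel"):
    b = random.random()
```

Saving the advanced state on exit is what lets repeated `with mgr.seed(...)` blocks continue one reproducible stream rather than restarting from the seed each time.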
…3883)

* add gpt2 policy and modify shard and slicer to support

* remove unused code

* polish code
Co-authored-by: Yuanchen Xu <yuanchen.xu00@gmail.com>
[example] Adding an example of training dreambooth with the new booster API
[sync] sync feature/shardformer with develop
Revert "[sync] sync feature/shardformer with develop"
@jamesthesnake jamesthesnake merged commit eabae7a into jamesthesnake:l Jun 13, 2023


8 participants