Feature/shardformer support GPT-J model auto sharding#4676

Merged
FrankLeeeee merged 13 commits into hpcaitech:feature/shardformer from ppt0011:feature/shardformer
Sep 13, 2023
Conversation

Contributor

@ppt0011 ppt0011 commented Sep 11, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

#4675

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234

📝 What does this PR do?

Add GPT-J model support for ShardFormer.


💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@ppt0011 ppt0011 changed the title Feature/shardformer Feature/shardformer support GPT-J model auto sharding Sep 11, 2023
@FrankLeeeee FrankLeeeee merged commit eb0a068 into hpcaitech:feature/shardformer Sep 13, 2023
Contributor Author

ppt0011 commented Sep 27, 2023

This merge was reverted due to an upstream branch reset; the work is now part of #4825.
