[shardformer] integrated linear 1D with dtensor#3996

Merged
FrankLeeeee merged 2 commits into hpcaitech:refactor/shardformer from
FrankLeeeee:refactor/dist-linear-1d
Jun 15, 2023
Conversation

@FrankLeeeee
Contributor

@FrankLeeeee FrankLeeeee commented Jun 15, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number


Part of the fix for #3997

📝 What does this PR do?


This PR integrates DTensor into the Linear 1D Column module and adds tests for it.

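For readers unfamiliar with the technique, a column-parallel ("Linear 1D Column") layer shards the weight matrix along its output dimension so each rank computes an independent slice of the output; gathering those slices reproduces the full result. The sketch below illustrates this numerically with NumPy — it is not ColossalAI's DTensor API, and the function name and shapes are invented for illustration:

```python
import numpy as np

def column_parallel_linear(x, weight, world_size):
    """Simulate a column-parallel linear forward pass on a single process."""
    # Shard the weight along its output (column) dimension, one chunk per rank.
    shards = np.array_split(weight, world_size, axis=1)
    # Each "rank" computes its partial output independently; no communication
    # is needed in the forward pass for a column split.
    partial_outputs = [x @ w_shard for w_shard in shards]
    # Gathering (concatenating) the partials along the last dim recovers Y = X @ W.
    return np.concatenate(partial_outputs, axis=-1)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
w = rng.standard_normal((8, 16))

full = x @ w
sharded = column_parallel_linear(x, w, world_size=2)
assert np.allclose(full, sharded)
```

In the real module, each rank holds only its own weight shard as a distributed tensor, which is what integrating DTensor here manages.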
💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests.
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@FrankLeeeee FrankLeeeee linked an issue Jun 15, 2023 that may be closed by this pull request
@FrankLeeeee
Contributor Author

Test screenshot:

[Screenshot: 2023-06-15 at 16:25:12]

@FrankLeeeee FrankLeeeee force-pushed the refactor/dist-linear-1d branch from 2e4a1a3 to c567a83 on June 15, 2023 09:35
@FrankLeeeee FrankLeeeee changed the title from "[shardformer] integrated linear 1d col with dtensor" to "[shardformer] integrated linear 1D with dtensor" on Jun 15, 2023
@FrankLeeeee FrankLeeeee merged commit 09659e9 into hpcaitech:refactor/shardformer Jun 15, 2023
@FrankLeeeee FrankLeeeee deleted the refactor/dist-linear-1d branch June 15, 2023 10:03
FrankLeeeee added a commit to FrankLeeeee/ColossalAI that referenced this pull request Jun 16, 2023
* [shardformer] integrated linear 1D with dtensor

* polish code
FrankLeeeee added a commit that referenced this pull request Jun 16, 2023
* [shardformer] integrated linear 1D with dtensor

* polish code
FrankLeeeee added a commit that referenced this pull request Jun 26, 2023
* [shardformer] integrated linear 1D with dtensor

* polish code
flybird11111 pushed a commit to flybird11111/ColossalAI that referenced this pull request Jul 3, 2023
* [shardformer] integrated linear 1D with dtensor

* polish code
FrankLeeeee added a commit that referenced this pull request Jul 4, 2023
* [shardformer] integrated linear 1D with dtensor

* polish code
ver217 pushed a commit to ver217/ColossalAI that referenced this pull request Jul 13, 2023
* [shardformer] integrated linear 1D with dtensor

* polish code
Successfully merging this pull request may close these issues.

[shardforme] integrate dtensor into shardformer

2 participants