
[shardformer] Add layernorm #4072

Merged
FrankLeeeee merged 7 commits into hpcaitech:feature/shardformer from FoolPlayer:add-layernorm on Jun 23, 2023

Conversation

@FoolPlayer (Contributor)

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Link this PR to your issue with words like fixed to automatically close the linked issue upon merge

e.g. fixed #1234, closed #1234, resolved #1234
fixed #4071

📝 What does this PR do?

Summarize your work here.
If you have any plots/diagrams/screenshots/tables, please attach them here.

Add layernorm and fit it to all supported models.
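For reference, the computation being sharded here is standard LayerNorm: normalize over the last dimension, then apply a learned affine transform. A minimal NumPy sketch (not the actual fused kernel, which performs this in a single GPU pass) of what any replacement implementation must compute:

```python
import numpy as np

def layer_norm(x, weight, bias, eps=1e-5):
    # Normalize each row over the last dimension, then scale and shift.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps) * weight + bias

hidden = 4
x = np.array([[1.0, 2.0, 3.0, 4.0]])
y = layer_norm(x, np.ones(hidden), np.zeros(hidden))
# With unit weight and zero bias, each output row has ~zero mean
# and ~unit variance.
print(y.mean(axis=-1), y.var(axis=-1))
```

Any fused variant (e.g. an Apex-style fused LayerNorm) must be numerically equivalent to this reference, which is what the state-dict loading test below checks indirectly.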

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@FoolPlayer FoolPlayer linked an issue Jun 22, 2023 that may be closed by this pull request
@FoolPlayer (Contributor, Author)

(screenshot attached)

@FrankLeeeee FrankLeeeee merged commit 21bd48b into hpcaitech:feature/shardformer Jun 23, 2023
FrankLeeeee pushed a commit that referenced this pull request Jun 26, 2023
* add layernorm to bert

* add layernorm test

* add layernorm test with load state dict

* add use_mixedfusedLN in shard config

* refactor policy to support fused_layernorm
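The commits above describe the overall shape of the change: a flag in the shard config selects a fused LayerNorm implementation, and the model policy substitutes the layernorm modules accordingly. A hypothetical sketch of that policy idea, with stand-in classes and an illustrative flag name (not ColossalAI's actual API):

```python
from dataclasses import dataclass

class LayerNorm:
    """Stand-in for torch.nn.LayerNorm."""

class FusedLayerNorm(LayerNorm):
    """Stand-in for a fused-kernel LayerNorm wrapper."""

@dataclass
class ShardConfig:
    # Illustrative flag; the commit above calls its flag use_mixedfusedLN.
    enable_fused_layernorm: bool = False

def pick_norm_cls(config: ShardConfig):
    # The policy decides which implementation to substitute in.
    return FusedLayerNorm if config.enable_fused_layernorm else LayerNorm

class BertLayerStub:
    def __init__(self):
        self.layernorm = LayerNorm()

def apply_policy(layer, config):
    # Replace the layer's norm module according to the policy.
    layer.layernorm = pick_norm_cls(config)()
    return layer

layer = apply_policy(BertLayerStub(), ShardConfig(enable_fused_layernorm=True))
print(type(layer.layernorm).__name__)  # FusedLayerNorm
```

Keeping the choice of implementation inside the policy (rather than in each model) is what lets one flag fit all supported models, as the PR summary describes.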
flybird11111 pushed a commit to flybird11111/ColossalAI that referenced this pull request Jul 3, 2023
FrankLeeeee pushed a commit that referenced this pull request Jul 4, 2023
ver217 pushed a commit to ver217/ColossalAI that referenced this pull request Jul 13, 2023
@FoolPlayer FoolPlayer deleted the add-layernorm branch September 7, 2023 11:10


Development

Successfully merging this pull request may close these issues.

[shardformer] add layernorm

2 participants