
fix typo colossalai/auto_parallel autochunk fx/passes etc.#3808

Merged
FrankLeeeee merged 5 commits into hpcaitech:main from digger-yu:patch1
May 24, 2023
Conversation

Contributor

digger-yu commented May 23, 2023

📌 Checklist before creating the PR

  • I have created an issue for this PR for traceability
  • The title follows the standard format: [doc/gemini/tensor/...]: A concise description
  • I have added relevant tags if possible for us to better distinguish different PRs

🚨 Issue number

Not involved

📝 What does this PR do?

Fix typos in colossalai/auto_parallel, autochunk, fx/passes, etc.
Modified files:
modified: .github/workflows/README.md
modified: colossalai/auto_parallel/passes/meta_info_prop.py
modified: colossalai/auto_parallel/tensor_shard/node_handler/strategy/batch_norm_generator.py
modified: colossalai/auto_parallel/tensor_shard/node_handler/strategy/conv_strategy_generator.py
modified: colossalai/auto_parallel/tensor_shard/node_handler/strategy/layer_norm_generator.py
modified: colossalai/auto_parallel/tensor_shard/node_handler/strategy/normal_pooling_generator.py
modified: colossalai/autochunk/trace_flow.py
modified: colossalai/autochunk/trace_indice.py
modified: colossalai/booster/plugin/gemini_plugin.py
modified: colossalai/cluster/dist_coordinator.py
modified: colossalai/device/alpha_beta_profiler.py
modified: colossalai/engine/schedule/_pipeline_schedule.py
modified: colossalai/engine/schedule/_pipeline_schedule_v2.py
modified: colossalai/fx/codegen/activation_checkpoint_codegen.py
modified: colossalai/fx/passes/adding_split_node_pass.py
modified: colossalai/fx/passes/experimental/adding_shape_consistency_pass.py
modified: colossalai/fx/passes/meta_info_prop.py
modified: colossalai/fx/passes/passes_for_gpt2_test.py
modified: colossalai/fx/passes/split_module.py

💥 Checklist before requesting a review

  • I have linked my PR to an issue (instruction)
  • My issue clearly describes the problem/feature/proposal, with diagrams/charts/table/code if possible
  • I have performed a self-review of my code
  • I have added thorough tests.
  • I have added docstrings for all the functions/methods I implemented

⭐️ Do you enjoy contributing to Colossal-AI?

  • 🌝 Yes, I do.
  • 🌚 No, I don't.

Tell us more if you don't enjoy contributing to Colossal-AI.

@github-actions
Contributor

The code coverage for the changed files is 40%.

Complete report:
Name                                                                                      Stmts   Miss  Cover
-------------------------------------------------------------------------------------------------------------
colossalai/auto_parallel/passes/meta_info_prop.py                                            94     94     0%
colossalai/auto_parallel/tensor_shard/node_handler/strategy/batch_norm_generator.py         115     89    23%
colossalai/auto_parallel/tensor_shard/node_handler/strategy/conv_strategy_generator.py      206    172    17%
colossalai/auto_parallel/tensor_shard/node_handler/strategy/layer_norm_generator.py          99     79    20%
colossalai/auto_parallel/tensor_shard/node_handler/strategy/normal_pooling_generator.py      58     43    26%
colossalai/autochunk/trace_flow.py                                                          265     19    93%
colossalai/autochunk/trace_indice.py                                                        494     94    81%
colossalai/booster/plugin/gemini_plugin.py                                                  106      9    92%
colossalai/cluster/dist_coordinator.py                                                       63     31    51%
colossalai/device/alpha_beta_profiler.py                                                    209    191     9%
colossalai/engine/schedule/_pipeline_schedule.py                                            430    261    39%
colossalai/engine/schedule/_pipeline_schedule_v2.py                                          78     11    86%
colossalai/fx/codegen/activation_checkpoint_codegen.py                                      582    399    31%
colossalai/fx/passes/adding_split_node_pass.py                                              258    238     8%
colossalai/fx/passes/meta_info_prop.py                                                      101     29    71%
colossalai/fx/passes/passes_for_gpt2_test.py                                                246    246     0%
colossalai/fx/passes/split_module.py                                                        146    134     8%
-------------------------------------------------------------------------------------------------------------
TOTAL                                                                                      3550   2139    40%
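For readers checking the report above: the Cover column appears to be the fraction of statements that were executed, (Stmts − Miss) / Stmts, rounded to a whole percent. A minimal sketch (an assumption about the report's arithmetic, not the actual internals of the coverage tool):

```python
def cover_pct(stmts: int, miss: int) -> int:
    """Percent of statements covered, rounded to a whole percent."""
    return round(100 * (stmts - miss) / stmts)

# Checked against a few rows of the report above:
print(cover_pct(265, 19))     # trace_flow.py -> 93
print(cover_pct(106, 9))      # gemini_plugin.py -> 92
print(cover_pct(3550, 2139))  # TOTAL -> 40
```

Every row in the table is consistent with this formula, including files with 0% coverage, where Miss equals Stmts.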

FrankLeeeee merged commit 7f8203a into hpcaitech:main on May 24, 2023
digger-yu deleted the patch1 branch on May 24, 2023 at 01:47
FrankLeeeee pushed a commit to FrankLeeeee/ColossalAI that referenced this pull request May 24, 2023
FrankLeeeee pushed a commit to FrankLeeeee/ColossalAI that referenced this pull request May 25, 2023
FrankLeeeee pushed a commit to FrankLeeeee/ColossalAI that referenced this pull request Jun 7, 2023
