
[moe] support overlap for expert tp #4851

Merged
oahzxl merged 6 commits into hpcaitech:feature/MoE from oahzxl:overlap on Oct 4, 2023

Conversation

@oahzxl (Contributor) commented Oct 3, 2023

  • support overlap for expert tp
[image attachment]
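The PR title refers to overlapping communication with expert computation in tensor-parallel MoE; the actual implementation lives in the changed files (e.g. `colossalai/moe/_operation.py`). As a rough illustration of the general technique only, the sketch below uses a Python worker thread as a stand-in for an asynchronous all-to-all collective, dispatching the next chunk while experts compute on the current one. All function names here are hypothetical and not from the PR.

```python
from concurrent.futures import ThreadPoolExecutor

def fake_all_to_all(chunk):
    # Stand-in for an async all-to-all dispatch (e.g. a non-blocking
    # collective); here it simply returns the chunk unchanged.
    return chunk

def expert_forward(tokens):
    # Stand-in for a local expert MLP: double each token value.
    return [t * 2 for t in tokens]

def moe_forward_overlapped(chunks):
    """Dispatch chunk i+1 on a comm thread while computing experts on chunk i."""
    results = []
    with ThreadPoolExecutor(max_workers=1) as comm:
        pending = comm.submit(fake_all_to_all, chunks[0])
        for i in range(len(chunks)):
            received = pending.result()  # wait for this chunk's dispatch
            if i + 1 < len(chunks):
                # Kick off the next dispatch before computing on this chunk,
                # so communication and computation overlap.
                pending = comm.submit(fake_all_to_all, chunks[i + 1])
            results.append(expert_forward(received))
    return results

print(moe_forward_overlapped([[1, 2], [3, 4]]))  # → [[2, 4], [6, 8]]
```

In a real tensor-parallel setting the `fake_all_to_all` slot would be an asynchronous collective whose handle is waited on just before the dependent computation, which is what hides communication latency behind the expert GEMMs.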

@github-actions github-actions bot commented Oct 3, 2023

The code coverage for the changed files is 36%.

Complete report:
Name                                                                 Stmts   Miss  Cover
----------------------------------------------------------------------------------------
colossalai/booster/plugin/hybrid_parallel_plugin.py                    214     15    93%
colossalai/booster/plugin/moe_hybrid_parallel_plugin.py                 62     62     0%
colossalai/context/__init__.py                                           6      0   100%
colossalai/context/random/__init__.py                                    2      0   100%
colossalai/context/random/_helper.py                                    46      9    80%
colossalai/initialize.py                                               180    133    26%
colossalai/kernel/triton/__init__.py                                    14      3    79%
colossalai/kernel/triton/llama_act_combine_kernel.py                    89     67    25%
colossalai/legacy/engine/gradient_handler/__init__.py                    6      0   100%
colossalai/legacy/engine/gradient_handler/_moe_gradient_handler.py      20     20     0%
colossalai/moe/__init__.py                                               6      6     0%
colossalai/moe/_operation.py                                           159    159     0%
colossalai/moe/checkpoint.py                                           133    133     0%
colossalai/moe/experts.py                                               90     90     0%
colossalai/moe/layers.py                                               104    104     0%
colossalai/moe/loss.py                                                  21     21     0%
colossalai/moe/manager.py                                               78     78     0%
colossalai/moe/routers.py                                              176    176     0%
colossalai/moe/utils.py                                                 81     81     0%
colossalai/nn/layer/moe/__init__.py                                     12      0   100%
colossalai/nn/loss/__init__.py                                           0      0   100%
colossalai/tensor/moe_tensor/__init__.py                                 0      0   100%
colossalai/tensor/moe_tensor/api.py                                     24      9    62%
colossalai/tensor/moe_tensor/moe_info.py                                13     10    23%
colossalai/zero/low_level/low_level_optim.py                           355     34    90%
tests/test_infer_ops/triton/test_llama_act_combine.py                   40     23    42%
----------------------------------------------------------------------------------------
TOTAL                                                                 1931   1233    36%

@oahzxl oahzxl merged commit 343f92e into hpcaitech:feature/MoE Oct 4, 2023
@oahzxl oahzxl deleted the overlap branch October 10, 2023 07:23
cwher pushed a commit to cwher/ColossalAI that referenced this pull request Oct 11, 2023
* overlap comm

* fix typo

* update bench script

* add option

* update script

* update bench
oahzxl added a commit to oahzxl/ColossalAI that referenced this pull request Oct 26, 2023
* overlap comm

* fix typo

* update bench script

* add option

* update script

* update bench
