[moe] fix ci#4772

Merged
ver217 merged 32 commits into hpcaitech:feature/MoE from oahzxl:ci
Sep 22, 2023

Conversation

@oahzxl
Contributor

@oahzxl commented Sep 21, 2023

Fix CI by changing the kernel init.

@github-actions
Contributor

The code coverage for the changed files is 36%.

Complete report:
Name                                                                 Stmts   Miss  Cover
----------------------------------------------------------------------------------------
colossalai/booster/plugin/hybrid_parallel_plugin.py                    214     15    93%
colossalai/booster/plugin/moe_hybrid_parallel_plugin.py                 45     45     0%
colossalai/context/__init__.py                                           6      0   100%
colossalai/context/random/__init__.py                                    2      0   100%
colossalai/context/random/_helper.py                                    46     25    46%
colossalai/initialize.py                                               180    134    26%
colossalai/kernel/triton/__init__.py                                    14     14     0%
colossalai/kernel/triton/llama_act_combine_kernel.py                    89     89     0%
colossalai/legacy/engine/gradient_handler/__init__.py                    6      0   100%
colossalai/legacy/engine/gradient_handler/_moe_gradient_handler.py      20     20     0%
colossalai/moe/__init__.py                                               6      6     0%
colossalai/moe/_operation.py                                           139    139     0%
colossalai/moe/checkpoint.py                                            56     56     0%
colossalai/moe/experts.py                                               90     90     0%
colossalai/moe/layers.py                                                75     75     0%
colossalai/moe/loss.py                                                  21     21     0%
colossalai/moe/manager.py                                               78     78     0%
colossalai/moe/routers.py                                              176    176     0%
colossalai/moe/utils.py                                                 81     81     0%
colossalai/nn/layer/moe/__init__.py                                     12      0   100%
colossalai/nn/loss/__init__.py                                           0      0   100%
colossalai/tensor/moe_tensor/__init__.py                                 0      0   100%
colossalai/tensor/moe_tensor/api.py                                     24      9    62%
colossalai/tensor/moe_tensor/moe_info.py                                13     10    23%
colossalai/zero/low_level/low_level_optim.py                           355     34    90%
----------------------------------------------------------------------------------------
TOTAL                                                                 1748   1117    36%

@ver217 ver217 merged commit c54746d into hpcaitech:feature/MoE Sep 22, 2023
@oahzxl oahzxl deleted the ci branch September 25, 2023 02:55
oahzxl added a commit to oahzxl/ColossalAI that referenced this pull request Oct 26, 2023
* init policy

* rename

* update pp

* finish pp

* update script

* update plugin

* finish pp

* update setup for different plugin

* update ci

* update ci

* update ci

* support ep inside or dp inside

* update arg for kernel

* disable ci

* update train script

* fsdp

* update train

* update train

* fsdp benchmark

* rename

* update fsdp bench

* fix plugin

* update benchmark

* fix ci

* fix ci

* rename

* update ci

* update test

* update vocab

* update chunk head
