What does this PR do?
Main PR: #3431
PR Design / Implementation Doc
TL;DR
This PR refactors MoE metric logging from ad-hoc global state into a single structured tracker, making metric collection, distributed reduction, aggregation, and logging easier to reason about and safer to extend.
Problem Statement
Before this change, MoE logging logic was split across utility functions and mutable global dictionaries, which made behavior hard to follow and easy to break when adding new metrics or reduction rules.
Goals
Non-Goals
High-Level Design
A new singleton tracker `MoEMetricsTracker` in `megatron/core/transformer/moe/moe_logging.py` manages all MoE metric handling end-to-end:

- `record(...)` stores per-layer metric values.
- `_reduce_across_ranks(...)` performs distributed reductions.
- `_aggregate_metrics(...)` computes scalar summaries.
- `write(...)` emits scalars to TensorBoard / W&B.
- `get_log_string(...)` formats console output.
- `clear()` resets metric buffers each logging cycle.

A minimal sketch of this shape is shown below.
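The following sketch illustrates that shape only; the singleton accessor and the method roles come from the list above, while the signatures and internal storage are assumptions, not the PR's actual code.

```python
# Minimal sketch of the tracker shape (signatures and storage are
# assumptions, not the PR's actual implementation).
from collections import defaultdict


class MoEMetricsTracker:
    """Process-wide singleton that buffers MoE metrics for one logging cycle."""

    _instance = None

    @classmethod
    def get_instance(cls):
        # Lazily create the single process-wide tracker.
        if cls._instance is None:
            cls._instance = cls()
        return cls._instance

    def __init__(self):
        # metric name -> list of (layer_number, value) recorded this cycle
        self._metrics = defaultdict(list)

    def record(self, name, value, layer_number=None, **metadata):
        # Buffer one per-layer value; reduction and aggregation happen
        # later, at log time. Per-metric metadata such as needs_dp_avg
        # would be stored alongside the values (omitted in this sketch).
        self._metrics[name].append((layer_number, value))

    def clear(self):
        # Reset buffers at the end of each logging cycle.
        self._metrics.clear()
```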
Per-metric metadata is stored in `_MetricEntry`:

- `values`
- `reduce_group`
- `avg_group`
- `needs_dp_avg`
- `layer_percentiles`

A hedged sketch of this record follows.
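The field names below are taken from the list above; the types and defaults are assumptions.

```python
# Field names follow the PR description; types and defaults are assumptions.
from dataclasses import dataclass, field
from typing import Any, List, Optional


@dataclass
class _MetricEntry:
    values: List[Any] = field(default_factory=list)  # per-layer recorded values
    reduce_group: Optional[Any] = None  # process group for the first all-reduce
    avg_group: Optional[Any] = None     # process group for the AVG all-reduce
    needs_dp_avg: bool = True           # whether to average across data-parallel ranks
    layer_percentiles: Optional[List[float]] = None  # percentiles to report per layer
```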
Key Implementation Changes

- `megatron/core/transformer/moe/moe_logging.py`: new module containing the tracker.
- `megatron/core/transformer/moe/router.py`: records via `MoEMetricsTracker.get_instance().record(...)`; sets `needs_dp_avg=False` for `global_load_balancing_loss`.
- `megatron/training/training.py`: calls `track(...)` and appends the returned MoE log string to the stdout log.
- `megatron/core/transformer/cuda_graphs.py`: calls `MoEMetricsTracker.get_instance().clear()`.
- `megatron/core/transformer/moe/moe_utils.py`: `save_to_aux_losses_tracker(...)` forwards into the tracker; `track_moe_metrics(...)` remains a deprecated wrapper.

An illustrative call site is sketched below.
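For illustration, the `router.py` change might look like the following. The `needs_dp_avg=False` intent for `global_load_balancing_loss` is stated in the PR; the exact argument names and the loss tensor are assumptions.

```python
# Hypothetical call site mirroring the router.py change described above;
# argument names are assumptions beyond what the PR states.
import torch

from megatron.core.transformer.moe.moe_logging import MoEMetricsTracker

aux_loss = torch.tensor(0.1)  # placeholder for the router's load-balancing loss

MoEMetricsTracker.get_instance().record(
    "global_load_balancing_loss",
    value=aux_loss.detach(),
    needs_dp_avg=False,  # per the PR: skip DP averaging for this metric
)
```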
Reduction Semantics (Important)

For each metric, the tracker's reduction order is:

1. All-reduce over `reduce_group`.
2. AVG all-reduce over `avg_group`.
3. Data-parallel averaging, applied only when `needs_dp_avg` is set.

This replaces implicit/fragile logic and makes the DP-averaging intent explicit per metric. A hedged sketch of the ordering follows.
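In `torch.distributed` terms the three steps could look like this; the helper name, the SUM op for step 1, and the data-parallel group lookup are hypothetical, while the ordering mirrors the list above.

```python
# Illustrative only: the helper name and the data-parallel group lookup
# are hypothetical; the three-step order mirrors the PR description.
import torch
import torch.distributed as dist


def _reduce_metric(values: torch.Tensor, entry: "_MetricEntry") -> torch.Tensor:
    # Step 1: all-reduce over the metric's reduce_group, if one is set
    # (SUM assumed here, the default all-reduce op).
    if entry.reduce_group is not None:
        dist.all_reduce(values, op=dist.ReduceOp.SUM, group=entry.reduce_group)
    # Step 2: AVG all-reduce over the metric's avg_group, if one is set.
    if entry.avg_group is not None:
        dist.all_reduce(values, op=dist.ReduceOp.AVG, group=entry.avg_group)
    # Step 3: average across data-parallel ranks only when the metric opts in.
    if entry.needs_dp_avg:
        dp_group = None  # hypothetical: a real implementation would pass
        # the data-parallel process group here (None means the default group).
        dist.all_reduce(values, op=dist.ReduceOp.AVG, group=dp_group)
    return values
```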
Contribution process
```mermaid
flowchart LR
    A[Pre-checks] --> B[PR Tests]
    subgraph Code Review/Approval
        C1[Expert Review] --> C2[Final Review]
    end
    B --> C1
    C2 --> D[Merge]
```

Pre-checks
Code review
The following process is enforced via the CODEOWNERS file for changes into `megatron/core`. For changes outside of `megatron/core`, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch
(Step 1): Add PR label

Attach the `Expert Review` label when your PR is ready for review.

(Step 2): Collect the expert reviewers' reviews

Final Review might get declined if these requirements are not fulfilled.

(Step 3): Final Review

Add the `Final Review` label.

(Optional Step 4): Cherry-pick into release branch
If this PR also needs to be merged into `core_r*` release branches, then after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch
The proposed review process for the `dev` branch is under active discussion. MRs are mergeable after one approval by either eharper@nvidia.com or zijiey@nvidia.com.

Merging your PR
Any member of `core-adlr` and `core-nemo` will be able to merge your PR.