Fused dLN + add in backwards pass #3384

Merged
asolergi-nv merged 32 commits into NVIDIA:main from CarlosGomes98:cgomes/ds_fuse_dLN_add on Mar 9, 2026

Conversation


@CarlosGomes98 CarlosGomes98 commented Feb 12, 2026

What does this PR do?

dev branch: #3383
Fuse the backwards pass of the residual fork + LN pattern in transformer_layer.py.
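To illustrate the pattern being fused: transformer_layer.py forks the hidden states into a residual branch and then normalizes them, so in the backward pass the gradient flowing out of the norm must be added to the residual gradient. Folding that add into the norm's backward pass saves a separate elementwise traversal of the full activation tensor. A minimal NumPy sketch of the idea using RMSNorm follows; the function names, shapes, and eps handling are illustrative assumptions, not Megatron-LM's actual kernels:

```python
import numpy as np

def rmsnorm_fwd(x, g, eps=1e-6):
    # Row-wise RMSNorm: y = x / rms(x) * g, with r kept for the backward pass.
    r = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return x / r * g, r

def rmsnorm_bwd_fused_add(dy, x, g, r, d_residual):
    # Backward of RMSNorm with the residual gradient folded into the same
    # elementwise pass, instead of computing dx and launching a separate add.
    d = x.shape[-1]
    dyg = dy * g
    dx = dyg / r - x * np.sum(dyg * x, axis=-1, keepdims=True) / (d * r**3)
    return dx + d_residual  # fused: one traversal over the tensors
```

In a real fused kernel the add happens inside the same pass that computes dx, so the residual-gradient tensor is read and written once rather than twice.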

Contribution process

```mermaid
flowchart LR
    A[Pre-checks] --> B[PR Tests]
    subgraph Code Review/Approval
        C1[Expert Review] --> C2[Final Review]
    end
    B --> C1
    C2 --> D[Merge]
```

Pre-checks

  • I want this PR in a versioned release and have added the appropriate Milestone (e.g., Core 0.8)
  • I have added relevant unit tests
  • I have added relevant functional tests
  • I have added proper typing to my code (see the Typing guidelines)
  • I have added relevant documentation
  • I have run autoformatter.sh on my PR

Code review

The following process is enforced via the CODEOWNERS file for changes into megatron/core. For changes outside of megatron/core, it is up to the PR author whether or not to tag the Final Reviewer team.

For MRs into `main` branch

Feel free to message or tag @mcore-oncall to help accelerate your merge into main. The less complex your PR is, the faster it will be approved and merged!

(Step 1): Add PR label Expert Review

(Step 2): Collect the expert reviewers reviews

  1. Attach the Expert Review label when your PR is ready for review.
  2. GitHub auto-assigns expert reviewers based on your changes. They will get notified and pick up your PR soon.

⚠️ Only proceed to the next step once all reviewers have approved, merge conflicts are resolved, and CI is passing.
Final Review may be declined if these requirements are not fulfilled.

(Step 3): Final Review

  1. Add Final Review label
  2. GitHub auto-assigns final reviewers based on your changes. They will get notified and pick up your PR soon.

(Optional Step 4): Cherry-pick into release branch

If this PR also needs to be merged into core_r* release branches, after this PR has been merged, select Cherry-pick to open a new PR into the release branch.

For MRs into `dev` branch

The proposed review process for the `dev` branch is under active discussion.

MRs are mergable after one approval by either eharper@nvidia.com or zijiey@nvidia.com.

Merging your PR

Any member of core-adlr and core-nemo will be able to merge your PR.

@CarlosGomes98 CarlosGomes98 requested review from a team as code owners February 12, 2026 11:21

copy-pr-bot Bot commented Feb 12, 2026

This pull request requires additional validation before any workflows can run on NVIDIA's runners.

Pull request vetters can view their responsibilities here.

Contributors can view more details about this message here.

@ko3n1g ko3n1g requested a review from a team February 12, 2026 11:21
@erhoo82 added the dev2main: mbridge, Expert Review, and module: moe labels on Feb 12, 2026
@chtruong814 added the needs-follow-up label on Feb 14, 2026
@Phlip79 added the Final Review label and removed the Expert Review label on Feb 17, 2026

Phlip79 commented Feb 17, 2026

/ok to test 9125dc1


Phlip79 commented Feb 20, 2026

/ok to test 0440941


CarlosGomes98 commented Feb 23, 2026

Refactored to guard the has_residual flag behind a check for TENorm. I chose this approach because a similar pattern exists in the MLP fusion. However, I wonder whether these decisions instead belong at the submodule-picking level, so as not to keep cluttering transformer_layer.
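The guard described above can be sketched as a simple capability check. Everything here is a stand-in for illustration: `TENorm` and `VanillaLayerNorm` are dummy classes, and `supports_fused_dln_add` is a hypothetical helper name, not the PR's actual code:

```python
class TENorm:
    """Stand-in for a Transformer Engine norm wrapper with a fused backward."""

class VanillaLayerNorm:
    """Stand-in for a norm implementation without the fused dLN + add path."""

def supports_fused_dln_add(norm) -> bool:
    # Only hand the residual to the norm (has_residual=True) when the norm
    # implementation actually provides the fused dLN + add backward;
    # otherwise fall back to the unfused fork + separate add.
    return isinstance(norm, TENorm)
```

The alternative the comment raises would move this decision into spec/submodule selection, so the layer itself never branches on the norm type.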

@CarlosGomes98 CarlosGomes98 force-pushed the cgomes/ds_fuse_dLN_add branch 2 times, most recently from f4219a4 to aa59424 Compare February 23, 2026 13:38

ko3n1g commented Feb 23, 2026

/ok to test aa59424

@CarlosGomes98

/ok to test 3fcb878

@ericharper

/claude review

Comment thread tests/unit_tests/fusions/test_rmsnorm_residual_fusion.py
Comment thread megatron/core/transformer/transformer_config.py
Comment thread megatron/core/extensions/transformer_engine.py
Comment thread tests/unit_tests/fusions/test_rmsnorm_residual_fusion.py
ericharper and others added 2 commits March 5, 2026 15:32
Co-authored-by: claude[bot] <209825114+claude[bot]@users.noreply.github.com>
@ericharper

/claude review

Comment thread megatron/core/transformer/transformer_config.py Outdated
Co-authored-by: claude[bot] <209825114+claude[bot]@users.noreply.github.com>
@ericharper

/ok to test bd7395a


Phlip79 commented Mar 9, 2026

/ok to test 3d5501a

@svcnvidia-nemo-ci

🔄 Merge queue validation started!

You can track the progress here: https://github.com/NVIDIA/Megatron-LM/actions/runs/22835292757


Labels

  • Approved (All necessary approvals have been made)
  • complexity: medium
  • dev2main: mbridge (dev to main: this PR is needed in main for mbridge)
  • module: moe
  • Run tests



9 participants