
Add supports_gradient_checkpointing to NemotronHPreTrainedModel #45625

Merged

ArthurZucker merged 2 commits into huggingface:main from sergiopaniego:nemotron-h-supports-gradient-checkpointing on Apr 27, 2026

Conversation

@sergiopaniego
Member

What does this PR do?

Enables gradient checkpointing for NemotronH by setting supports_gradient_checkpointing = True on NemotronHPreTrainedModel. The motivation comes from its usage in TRL.

NemotronHBlock already inherits from GradientCheckpointingLayer, so the infrastructure at the block level is in place. The only missing piece was the class-level flag, which currently defaults to False (inherited from PreTrainedModel). As a result, any call to model.gradient_checkpointing_enable() (including the one issued by Trainer when gradient_checkpointing=True) raises:

ValueError: NemotronHForCausalLM does not support gradient checkpointing.
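For example, the following minimal reproduction (the checkpoint name is illustrative, not taken from this PR) hit that ValueError before this change and succeeds after it:

```python
from transformers import AutoModelForCausalLM

# The checkpoint name below is illustrative; any NemotronH checkpoint
# exercises the same code path.
model = AutoModelForCausalLM.from_pretrained("nvidia/Nemotron-H-8B-Base-8K")

# Before this PR: raises
#   ValueError: NemotronHForCausalLM does not support gradient checkpointing.
# After this PR: enables checkpointing on every GradientCheckpointingLayer.
model.gradient_checkpointing_enable()
```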

This is a simple omission, not a limitation of the architecture. All sibling hybrid Mamba/attention models in the library already enable it:

| Model | GradientCheckpointingLayer | supports_gradient_checkpointing |
| --- | --- | --- |
| Bamba | ✅ | ✅ |
| GraniteMoeHybrid | ✅ | ✅ |
| Zamba2 | ✅ | ✅ |
| NemotronH | ✅ | ❌ (before this PR) |

GraniteMoeHybrid is the closest analogue (MoE + hybrid Mamba/attention, same layout as NemotronH).

The change is made in modular_nemotron_h.py and propagated to modeling_nemotron_h.py via utils/modular_model_converter.py.
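For reference, a minimal sketch of what the modular change amounts to (surrounding class attributes elided; the config import path is an assumption based on the usual transformers layout):

```python
from transformers import PreTrainedModel
from transformers.models.nemotron_h.configuration_nemotron_h import NemotronHConfig

class NemotronHPreTrainedModel(PreTrainedModel):
    config_class = NemotronHConfig
    # The one-line change: opt into the generic gradient-checkpointing
    # machinery that NemotronHBlock (a GradientCheckpointingLayer)
    # already supports at the block level.
    supports_gradient_checkpointing = True
    # ... remaining class attributes unchanged ...
```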

Fixes the failure seen downstream in huggingface/trl#5278, where NemotronH tests required a gradient_checkpointing=False workaround.
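With the flag in place, the standard Trainer path works without that workaround; a sketch assuming `model` and `train_dataset` are defined as in a typical Trainer setup:

```python
from transformers import Trainer, TrainingArguments

# gradient_checkpointing=True makes Trainer call
# model.gradient_checkpointing_enable() during setup, which no longer raises.
args = TrainingArguments(output_dir="out", gradient_checkpointing=True)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```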


Code Agent Policy

The Transformers repo is currently being overwhelmed by a large number of PRs and issue comments written by
code agents. We are currently bottlenecked by our ability to review and respond to them. As a result,
we ask that new users do not submit pure code agent PRs at this time.
You may use code agents in drafting or to help you diagnose issues. We'd also ask autonomous "OpenClaw"-like agents
not to open any PRs or issues for the moment.

PRs that appear to be fully agent-written will probably be closed without review, and we may block users who do this
repeatedly or maliciously.

This is a rapidly-evolving situation that's causing significant shockwaves in the open-source community. As a result,
this policy is likely to be updated regularly in the near future. For more information, please read CONTRIBUTING.md.

  • I confirm that this is not a pure code agent PR.

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? (n/a — class attribute only)
  • Did you write any new necessary tests? (n/a — covered by existing hybrid-model GC tests)

Who can review?

@ArthurZucker @Cyrilvallez

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@github-actions
Contributor

[For maintainers] Suggested jobs to run (before merge)

run-slow: nemotron_h

Collaborator

@ArthurZucker left a comment


TY

@ArthurZucker ArthurZucker added this pull request to the merge queue Apr 27, 2026
Merged via the queue into huggingface:main with commit ded2b74 Apr 27, 2026
21 checks passed
ArthurZucker pushed a commit that referenced this pull request on Apr 28, 2026:

Add supports_gradient_checkpointing to NemotronHPreTrainedModel (#45625)
@sergiopaniego deleted the nemotron-h-supports-gradient-checkpointing branch on April 29, 2026 14:38