fix(dpmodel): fix natoms[0] bug, einsum, and return type in EnergyLoss #5325

wanghan-iapcm wants to merge 1 commit into deepmodeling:master from
Conversation
- Fix natoms[0] -> natoms in generalized force branch (natoms is int)
- Replace xp.einsum with array-API-compatible xp.sum + broadcasting
- Fix return type annotation of Loss.call and EnergyLoss.call from dict[str, Array] to tuple[Array, dict[str, Array]]
- Add TestEnerGF consistency test for generalized force code path
- Add dpmodel-level unit tests for EnergyLoss (basic, aecoeff, generalized force, huber, serialize round-trip)
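The first two fixes can be sketched together as follows. This is an illustrative reconstruction, not the actual deepmd-kit code: the array shapes, the `diff` name, and the loss expression are assumptions made for the example.

```python
import numpy as np

xp = np  # stands in for any array-api-compat namespace

# In the generalized-force branch natoms is a plain int,
# so the old natoms[0] indexing raised a TypeError.
natoms = 8
atom_norm = 1.0 / natoms  # fixed: was 1.0 / natoms[0]

rng = np.random.default_rng(0)
diff = rng.standard_normal((4, natoms * 3))  # hypothetical (nframes, ndof) deviation

# Before (not portable across array-API namespaces):
#   l2 = xp.einsum("ij,ij->", diff, diff)
# After: plain elementwise multiply + sum, which every array-API
# namespace provides.
l2 = xp.sum(diff * diff)

assert np.allclose(l2, np.einsum("ij,ij->", diff, diff))
assert atom_norm == 0.125
```

`einsum` is not part of the array API standard, so rewriting reductions as broadcasting plus `sum` keeps the dpmodel backend usable with any compliant namespace.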
📝 Walkthrough

The PR updates the Loss and EnergyLoss class signatures to return a tuple of (loss, auxiliary losses dictionary) instead of a single dictionary, and fixes array indexing and reshaping operations in generalized-force computations to align with array-api-compat semantics. Comprehensive tests validate the updated behavior across multiple backends.

Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~20 minutes
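The changed return contract described above can be sketched minimally like this. The function body and field names are hypothetical stand-ins; the real Loss/EnergyLoss classes carry many more terms.

```python
import numpy as np

# Illustrative version of the corrected signature: call() now returns
# (scalar loss, dict of auxiliary losses) rather than only the dict.
def call(
    learning_rate: float,
    natoms: int,
    model_dict: dict[str, np.ndarray],
    label_dict: dict[str, np.ndarray],
) -> tuple[np.ndarray, dict[str, np.ndarray]]:
    l2_ener = np.mean((model_dict["energy"] - label_dict["energy"]) ** 2)
    loss = learning_rate * l2_ener
    return loss, {"l2_ener": l2_ener}

loss, more_loss = call(1.0, 8, {"energy": np.ones(3)}, {"energy": np.zeros(3)})
assert isinstance(more_loss, dict)
assert np.isclose(loss, 1.0)
```

Callers that previously received only the dictionary must now unpack two values, which is exactly what the review comments below police.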
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 3
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@source/tests/common/dpmodel/test_loss_ener.py`:
- Around line 89-91: The test currently unpacks (loss, more_loss) from
loss_fn.call but never uses more_loss which triggers RUF059; update the test in
test_loss_ener.py to either assert something lightweight about more_loss (e.g.,
assertIsNotNone(more_loss) or assertGreaterEqual(len(more_loss), 0)) or
explicitly discard it using an underscore (e.g., loss, _ = ...) so the tuple contract
is honored; locate the unpack at the call to loss_fn.call and make the change to
reference or discard the more_loss variable accordingly.
- Around line 143-145: The test currently unpacks loss, more_loss =
loss_fn.call(1.0, natoms, model_dict, label_dict) but never uses more_loss
(RUF059); fix by either explicitly discarding it (e.g., replace with loss, _ =
loss_fn.call(...)) or add a minimal assertion/type check to use it (e.g.,
self.assertIsNotNone(more_loss) or self.assertIsInstance(more_loss, dict)) so
the return shape is validated; modify the call site in the test where
loss_fn.call is invoked to implement one of these options.
In `@source/tests/consistent/loss/test_ener.py`:
- Around line 373-379: Unpack only the used loss from the obj.build call by
discarding the unused TensorFlow auxiliary binding: change the second variable
name from more_loss to _ or _more_loss in the assignment where obj.build(...) is
called (the invocation of obj.build that currently assigns loss, more_loss) so
the unused TF loss is ignored and RUF059 is avoided; run ruff check . and ruff
format . after the change.
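All three prompts above reduce to one of two patterns. `loss_fn_call` here is a toy stand-in for the real `loss_fn.call(...)` / `obj.build(...)` invocations, used only to show the shapes Ruff accepts.

```python
def loss_fn_call() -> tuple[float, dict[str, float]]:
    """Toy stand-in returning (loss, auxiliary-loss dict)."""
    return 0.5, {"l2_ener": 0.5}

# Option 1: discard the auxiliary dict so RUF059 does not fire.
loss, _ = loss_fn_call()
assert loss == 0.5

# Option 2: keep the binding and assert something lightweight about it,
# which also exercises the new tuple return contract.
loss, more_loss = loss_fn_call()
assert isinstance(more_loss, dict)
```

Either form satisfies RUF059; option 2 has the side benefit of regression-testing the (loss, dict) shape introduced by this PR.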
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository UI
Review profile: CHILL
Plan: Pro
Run ID: 14d402a1-8899-4080-b0ca-971e9789866b
📒 Files selected for processing (4)
- deepmd/dpmodel/loss/ener.py
- deepmd/dpmodel/loss/loss.py
- source/tests/common/dpmodel/test_loss_ener.py
- source/tests/consistent/loss/test_ener.py
```python
model_dict, label_dict, natoms = self._make_data()
loss, more_loss = loss_fn.call(1.0, natoms, model_dict, label_dict)
self.assertIsNotNone(loss)
```
Use more_loss here or discard it explicitly.
This unpack is currently unused and Ruff reports RUF059. A lightweight assertion keeps the tuple-contract coverage and satisfies the linter.
♻️ Suggested change

```diff
 model_dict, label_dict, natoms = self._make_data()
 loss, more_loss = loss_fn.call(1.0, natoms, model_dict, label_dict)
 self.assertIsNotNone(loss)
+self.assertIsInstance(more_loss, dict)
```
🧰 Tools
🪛 Ruff (0.15.6)
[warning] 90-90: Unpacked variable more_loss is never used
Prefix it with an underscore or any other dummy variable pattern
(RUF059)
```python
model_dict, label_dict, natoms = self._make_data()
loss, more_loss = loss_fn.call(1.0, natoms, model_dict, label_dict)
self.assertIsNotNone(loss)
```
Use more_loss here or discard it explicitly.
This unpack is currently unused and Ruff reports RUF059. A small type assertion is enough if you want to keep validating the new return shape in this test too.
♻️ Suggested change

```diff
 model_dict, label_dict, natoms = self._make_data()
 loss, more_loss = loss_fn.call(1.0, natoms, model_dict, label_dict)
 self.assertIsNotNone(loss)
+self.assertIsInstance(more_loss, dict)
```
🧰 Tools
🪛 Ruff (0.15.6)
[warning] 144-144: Unpacked variable more_loss is never used
Prefix it with an underscore or any other dummy variable pattern
(RUF059)
```python
loss, more_loss = obj.build(
    self.learning_rate,
    [self.natoms],
    predict,
    label,
    suffix=suffix,
)
```
Discard the unused TF more_loss binding.
build_tf() drops the auxiliary losses immediately, so this unpack just triggers RUF059. Rename it to _ or _more_loss.
♻️ Suggested change

```diff
-loss, more_loss = obj.build(
+loss, _ = obj.build(
     self.learning_rate,
     [self.natoms],
     predict,
     label,
     suffix=suffix,
```

🧰 Tools
🪛 Ruff (0.15.6)
[warning] 373-373: Unpacked variable more_loss is never used
Prefix it with an underscore or any other dummy variable pattern
(RUF059)
Codecov Report

✅ All modified and coverable lines are covered by tests.

Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master    #5325      +/-   ##
==========================================
- Coverage   82.32%   82.31%   -0.01%
==========================================
  Files         768      775       +7
  Lines       77098    77627     +529
  Branches     3659     3675      +16
==========================================
+ Hits        63469    63899     +430
- Misses      12458    12554      +96
- Partials     1171     1174       +3
```

☔ View full report in Codecov by Sentry.
Summary by CodeRabbit

Bug Fixes
- Corrected natoms indexing in the generalized-force branch, replaced xp.einsum with array-API-compatible sum + broadcasting, and fixed the return type annotation of Loss.call and EnergyLoss.call.

Tests
- Added a TestEnerGF consistency test and dpmodel-level unit tests for EnergyLoss.