MRG: Unify _prepare_gain #5984
Conversation
Codecov Report
```diff
@@            Coverage Diff            @@
##           master    #5984    +/-  ##
========================================
  Coverage        ?   88.83%
========================================
  Files           ?      401
  Lines           ?    72880
  Branches        ?    12180
========================================
  Hits            ?    64742
  Misses          ?     5218
  Partials        ?     2920
```
Force-pushed from 2a18786 to fed164b.
```python
@pytest.mark.parametrize('method, lower, upper, kwargs, depth', [
    ('MNE', 35, 40, {}, dict(limit_depth_chs=False)),
    ('MNE', 45, 55, {}, 0.8),
    ('MNE', 65, 70, {}, dict(limit_depth_chs='whiten')),
```
FYI this shows the performance improvement for MNE when using whiten-based depth weighting (at least in this case, with this measure for localization bias)
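For context, a minimal sketch of what depth weighting does (illustrative only, not MNE's actual `compute_depth_prior`; the function name, signature, and normalization here are assumptions): each source gets a weight proportional to an inverse power of its gain-column norm, and a 'whiten'-style variant computes those norms on the whitened gain matrix instead of the raw one.

```python
import numpy as np

def depth_weights(gain, exp=0.8, whitener=None):
    """Illustrative depth weighting: w_i = (||g_i||^2 / max_j ||g_j||^2) ** -exp.

    `whitener` (e.g. an inverse matrix square root of the noise covariance)
    is applied first for 'whiten'-style weighting; this helper is a sketch,
    not MNE's implementation.
    """
    if whitener is not None:
        gain = whitener @ gain
    d = np.sum(gain ** 2, axis=0)    # squared column norm per source
    return (d / d.max()) ** -exp     # deep (weak-gain) sources get larger weights

rng = np.random.default_rng(0)
gain = rng.standard_normal((10, 4))
gain[:, 0] *= 0.1                    # simulate a "deep" source with a weak column
w = depth_weights(gain)
assert w[0] == w.max()               # the weakest column gets the largest weight
```

The point of the compensation is visible here: without it, minimum-norm solutions favor superficial sources whose gain columns have large norms.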
@agramfort okay with you for a backward-compat break to refactor
`compute_depth_prior` to take `forward` instead of `gain`, `gain_info`,
`is_fixed_orient`, `patch_areas`? Given that this function wasn't in the doc
and was buried in `mne.forward.forward.compute_depth_prior`, I think it's
probably fine, especially if we note it well in the doc. Or should we do a
proper deprecation cycle? We could allow `gain` to be an instance of
`Forward`, have the other options default to `None`, and remove them in 0.19.
Let me know which you prefer.

If you want this refactoring/deprecation to appear in this PR, let me know.
Otherwise this can be merged and the `compute_depth_prior` cleanups can be
the next PR.
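The deprecation-cycle option described above could look roughly like this (a sketch under stated assumptions: the parameter names come from the comment, while the duck-typed `Forward` check and the stub computation are hypothetical, not MNE's actual code):

```python
import warnings
import numpy as np

def compute_depth_prior(gain, gain_info=None, is_fixed_orient=None,
                        patch_areas=None, exp=0.8):
    """Hypothetical transition shim: accept a Forward-like object or the
    legacy (gain, gain_info, is_fixed_orient, patch_areas) arguments,
    warning on the latter so they can be removed in 0.19."""
    if isinstance(gain, dict) and 'sol' in gain:  # duck-typed Forward
        gain = gain['sol']['data']
    else:
        warnings.warn('Passing gain/gain_info/is_fixed_orient/patch_areas '
                      'is deprecated and will be removed in 0.19; pass a '
                      'Forward instead.', DeprecationWarning)
    gain = np.asarray(gain, float)
    d = np.sum(gain ** 2, axis=0)       # squared column norms
    return (d / d.max()) ** -exp        # stand-in for the real prior

fwd_like = {'sol': {'data': np.eye(3)}}  # minimal Forward stand-in
w = compute_depth_prior(fwd_like)        # new-style call, no warning
```

Whether a shim like this or a clean break is preferable mostly depends on how much the old signature was used outside the package, which is the question being weighed above.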
If this is still relevant, I'll let you judge. Maybe in a next PR, since this
one is pretty much ready.
Okay @agramfort, I had to change a tolerance for one `gamma_map` test. All
other tests should be passing except
`mne/inverse_sparse/tests/test_mxne_inverse.py::test_mxne_inverse_standard`.
There the `loose_method='sum'` should change to `loose_method='svd'`, but this
will break the sphere test (the orientation only matches to 0.89, not 0.99,
and the dipole fit is not 1.7 mm away but 30 mm :( ). I'm out of my depth
working on the MxNE stuff, so hopefully you find some hacking time soon.
Todo:

- Add `limit_depth_chs='whiten'` to MxNE and MNE
- In `test_mxne_inverse_standard`, set `loose_method='svd'` and fix the
  failing sphere model test

Next PR:

- Refactor `compute_depth_prior` to operate on `forward`