Conversation

@rob-luke rob-luke commented Oct 23, 2019

What does this implement/fix?

This PR adds the ability to convert NIRS optical density data (see #6827) to changes in oxy- and deoxy- haemoglobin concentration.

codecov bot commented Oct 23, 2019

Codecov Report

Merging #6975 into master will increase coverage by <.01%.
The diff coverage is 98.16%.

@@            Coverage Diff             @@
##           master    #6975      +/-   ##
==========================================
+ Coverage    89.7%   89.71%   +<.01%     
==========================================
  Files         430      432       +2     
  Lines       77024    77122      +98     
  Branches    12531    12537       +6     
==========================================
+ Hits        69097    69191      +94     
- Misses       5130     5133       +3     
- Partials     2797     2798       +1

@rob-luke
Member Author

@larsoner I need to include some tabulated data to complete this function. What is the best way to include data in MNE?

The data I will include comes from here: https://omlc.org/spectra/hemoglobin/summary.html

Homer has manually written the data in a script (link) and nirs-toolbox includes a .mat file (link). I was thinking of adding the data to the repo as a CSV. Is there a better data format?


agramfort commented Oct 23, 2019 via email

@larsoner
Member

spectra.mat | 16.79 KB

This appears to be usable with scipy.io.loadmat. I would just use that file (if we have permission) or generate our own using scipy.io.savemat(fname, dict, do_compression=True) (if we don't have permission).

I tried saving the values with mne.externals.h5io.write_hdf5 and they ended up being ~39 kB instead of ~17 kB. A load-save round-trip with loadmat/savemat stayed at ~17 kB.
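For what it's worth, a compressed round-trip with scipy can be sketched like this (the table contents and key names below are made up, not the real spectra.mat contents):

```python
import os
import tempfile

import numpy as np
from scipy.io import loadmat, savemat

# Hypothetical extinction-coefficient table (illustrative values only)
spectra = {
    "nm": np.arange(650.0, 1000.0, 2.0),                  # wavelengths
    "extcoef": np.linspace(0.0, 1.0, 350).reshape(175, 2) # [HbO2, Hb] columns
}

fname = os.path.join(tempfile.mkdtemp(), "spectra.mat")
# Note the keyword is do_compression (MATLAB v5 format)
savemat(fname, spectra, do_compression=True)

loaded = loadmat(fname)
assert np.allclose(loaded["nm"].ravel(), spectra["nm"])
assert np.allclose(loaded["extcoef"], spectra["extcoef"])
```

loadmat returns 2D arrays even for vectors, hence the `ravel()` when comparing.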


agramfort commented Oct 23, 2019 via email

@rob-luke
Member Author

Thank you both. I will generate a compressed .mat file from the original source, as I do not have permission from the other projects.

@rob-luke
Member Author

I am trying to wrap my head around the units in which data should be stored in MNE.

The units listed for hbo and hbr in defaults are currently uM (which is how fNIRS is usually reported). https://github.com/mne-tools/mne-python/blob/master/mne/defaults.py#L17

Does that mean the raw data should be in uM? Or should the data be stored in M, with the scalings used to put the plots into uM? In that case, should the scaling be 1e-6 (not 1e6)?

assert 'fnirs_raw' not in raw
assert 'fnirs_od' not in raw
assert 'hbo' in raw
assert 'hbr' in raw
Member

And eventually it would be great for you to convert the two existing test files to hemoglobin using some other tool, add the result to the mne-testing-data dataset, and assert_allclose here
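Such a check could be sketched with numpy.testing.assert_allclose (the arrays below are placeholders, not real converted data):

```python
import numpy as np
from numpy.testing import assert_allclose

# Placeholders standing in for MNE's conversion and a reference
# conversion from another tool (e.g. loaded from mne-testing-data)
hbo_mne = np.array([1.02e-6, -0.51e-6, 0.33e-6])
hbo_reference = np.array([1.00e-6, -0.50e-6, 0.33e-6])

# Loose relative tolerance, since toolboxes differ in assumed parameters
assert_allclose(hbo_mne, hbo_reference, rtol=0.05)
```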

Member

Sorry for the slow response. Yes, something like this, where hopefully you end up at np.allclose with some suitable tolerance.

@larsoner
Member

The units listed for hbo and hbr in defaults are currently uM (which is how fNIRS is usually reported). https://github.com/mne-tools/mne-python/blob/master/mne/defaults.py#L17

Data should be stored in SI units, so M. And the scaling is basically inverted, i.e. what you'd need to multiply the original values (M) by to get to the ones desired for plotting (uM).

In MEG (magnetometer/mag) data are stored in T, plotted in fT, so the scale is 1e15. Similarly EEG is in V, plotted in uV, so the scale is 1e6.

Make sense?
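That convention can be sketched concretely (the data values below are illustrative):

```python
import numpy as np

# Channels are stored in SI units
hbo_molar = np.array([1.2e-6, -0.8e-6])  # haemoglobin concentration in M
eeg_volts = np.array([25e-6, -40e-6])    # EEG in V

# Plotting scalings multiply the stored SI values into display units
scalings = {"hbo": 1e6, "eeg": 1e6, "mag": 1e15}

hbo_uM = hbo_molar * scalings["hbo"]  # micromolar, as fNIRS is usually reported
eeg_uV = eeg_volts * scalings["eeg"]  # microvolts

assert np.allclose(hbo_uM, [1.2, -0.8])
assert np.allclose(eeg_uV, [25.0, -40.0])
```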

@rob-luke
Member Author

Make sense?

Yes makes perfect sense, thanks for explaining.

@rob-luke
Member Author

When I am back in the lab I will finish that dataset I keep talking about. But in the meantime, here is a quick demo of fNIRS analysis based on this PR: https://github.com/rob-luke/mne-fnirs-demo/blob/master/Tapping.ipynb


agramfort commented Oct 27, 2019 via email

@rob-luke
Member Author

There is some difference in scaling between the MATLAB-converted data and the MNE-converted data. I did some investigation, and the difference mainly comes from how the distance between sensors is calculated: we calculate the distance in three dimensions, whereas MATLAB calculates it from a 2D projection. So I think the difference is acceptable.

Some figures showing the converted data from both MNE and MATLAB can be seen here: https://github.com/rob-luke/mne-fnirs-demo/blob/master/BeerLambert_Validation.ipynb. The last section at the bottom uses the exact data from this PR.
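The 3D-versus-2D distance difference can be illustrated with a toy source/detector pair (the optode coordinates are made up): dropping the z component shortens the computed separation, which in turn changes the Beer-Lambert scaling.

```python
import numpy as np

# Hypothetical optode positions in metres (x, y, z)
source = np.array([0.00, 0.00, 0.04])
detector = np.array([0.03, 0.00, 0.05])

dist_3d = np.linalg.norm(detector - source)        # full 3D separation
dist_2d = np.linalg.norm((detector - source)[:2])  # 2D projection onto the xy-plane

# The projected distance is never longer than the true 3D distance
assert dist_2d <= dist_3d
```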

@larsoner
Member

Any idea why they do the projection? It seems like an unnecessary step, unless they store locations in the projected 2D space.

@rob-luke
Member Author

Any idea why they do the projection? It seems like an unnecessary step, unless they store locations in the projected 2D space.

That's exactly why they do it 😄. The NIRStar software saves a .mat file that contains the 2D and 3D positions separately. We are using the 3D positions; the other software takes the 2D positions. I think the 3D positions were only added in version 15 of the software.

The scaling of the signals varies considerably between toolboxes, and the parameters used for conversion always have to be reported in your manuscript. The main difference is usually the assumed partial pathlength factor (PPF), which is meant to take into account things like skull thickness, the optical properties of the skin, etc. (it is 0.1 in nirs-toolbox and 6 in Homer). All of these vary with age and with the wavelength of the light used (see "General equation for the differential pathlength factor of the frontal human head depending on wavelength and age"). In the future I would like to implement the equation from that paper and have the PPF adjusted based on the age stored in the MNE Info structure, but that is not done in any tool I have seen before.
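The conversion itself is the modified Beer-Lambert law, ΔOD(λ) = (ε_HbO(λ)·Δ[HbO] + ε_HbR(λ)·Δ[HbR]) · d · PPF, solved for the two concentration changes using measurements at two wavelengths. A minimal numpy sketch (the extinction coefficients below are illustrative, not the tabulated values shipped with this PR):

```python
import numpy as np

# Illustrative extinction coefficients [HbO, HbR] at two wavelengths
# (rows: e.g. 760 nm and 850 nm) -- NOT the real tabulated values
ext = np.array([[1.5, 3.8],
                [2.5, 1.8]])

d = 0.03   # source-detector separation (m)
ppf = 6.0  # partial pathlength factor (Homer's default; 0.1 in nirs-toolbox)

# Forward model: optical-density change from known concentration changes
dc_true = np.array([1.0e-6, -0.5e-6])  # [dHbO, dHbR] in M
dod = ext @ dc_true * (d * ppf)        # one value per wavelength

# Inverse: recover the concentration changes from the measured dOD
dc_est = np.linalg.solve(ext * d * ppf, dod)
assert np.allclose(dc_est, dc_true)
```

The round-trip assert shows why the PPF matters: a different assumed PPF rescales the recovered concentrations by the same factor.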

@rob-luke
Member Author

Also, the reason I made the MATLAB comparison against the 15.0 test data is that it has a nice one-to-one mapping of the source-detector pairs, whereas the other test-data montages are more complicated and the order in which MNE stores the data does not match MATLAB one-to-one, so matching them would have required messier reindexing. In hindsight I probably should have made the other montages simpler.

@rob-luke rob-luke changed the title from "WIP: Beer Lambert Law" to "MRG: Beer Lambert Law" Oct 30, 2019
@agramfort

@larsoner merge if you're happy

@larsoner larsoner merged commit a76fb44 into mne-tools:master Oct 30, 2019
@larsoner
Member

Thanks @rob-luke !
