Remove libNNPDF from python #1680
Conversation
Greetings from your nice fit 🤖 !
Check the report carefully, and please buy me a ☕ , or better, a GPU 😉!
The only thing left in this PR is to add some tests to substitute the ones I've removed (because those tests were basically cross-checks with the C++ version). I guess making them into regression tests is the right thing to do (since, at the end of the day, that's what they were). But suggestions welcome.
Special thanks to @comane and @andreab1997, who made this possible with their work on closure tests and the covmat!
Thanks for this. The diff looks nice and obvious, which is good. @comane maybe you would like to poke around this a little bit (e.g. try to see if there is something that obviously crashes). Someone should go over the docs as well; I would guess some changes will be needed, but I haven't checked.
Looks like a good place to start.
RoyStegeman
left a comment
Thanks, the changes look fine. There are still a few mentions of cpp in the docstrings that can be removed.
  def run(self):
-     if sys.version_info < (3, 6):
+     if sys.version_info < (3, 9):
Did we officially drop 3.8 as well?
Yes. We are no longer building conda packages for 3.8.
Ok, I wonder if we should drop it from the other projects as well...
The point is that I'm fully happy dropping 3.7, which goes EOL in June (now that I'd like to support 3.11, I don't feel guilty dropping an almost-EOL version, even if it's the one running on galileo), but I'm not as sure about 3.8.
In particular, I believe @cschwan, for example, is still using it (if I remember correctly). Most likely he doesn't care too much about installing vp, but it makes me think that some still very relevant versions of major distributions are still on it.
Yes, some are. I still have 3.8 in a few places, and if you really want to use it you can just install n3fit/validphys manually and it will work. But since we are no longer supporting it (or testing it at all) in the official conda package, we should warn people.
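The diff above bumps a runtime version guard. As a minimal sketch (the function name and error message here are illustrative, not the actual `setup.py` code), such a guard just compares `sys.version_info` against the new floor and fails early with a clear message:

```python
import sys

# Minimum supported version after dropping 3.8, per the diff above
MIN_PYTHON = (3, 9)

def check_python_version(version_info=None):
    """Fail early with a clear error instead of a cryptic failure later.

    Accepts an explicit version tuple mainly for testing; defaults to the
    running interpreter's version.
    """
    if version_info is None:
        version_info = sys.version_info
    if tuple(version_info[:2]) < MIN_PYTHON:
        raise RuntimeError(
            "This package requires Python >= %d.%d" % MIN_PYTHON
        )
```

Calling it at the top of `run()` (as the diff does with an inline check) means an unsupported interpreter is rejected before any installation work starts.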
  for dataset in group.datasets:
-     ds = dataset.load()
-     Ndata = ds.GetNData()
+     Ndata = dataset.load_commondata().ndata
I know it is irrelevant, but is there a reason not to follow the usual naming conventions?
In any case, this is really beyond the scope of this PR...
alecandido
left a comment
I had a look at all the diffs, and they look like a careful use of grep for a few specific keywords :D
However, to me, it is perfect. I'm also amazed that you have been able to change code in several places while updating very few tests, so the code changed/dropped was either so well tested as to be implementation-independent, or simply untested.
In both cases, there is a good reason to be confident in the modifications :)
Indeed. This also answers your question about why I left some variables with "old convention" names: renaming them would've required extra greps/searches (or looking two lines below). Too much effort for my old tired hands.
comane
left a comment
Seems good to me.
None of the modifications made in validphys/core.py affect the closure tests (I have explicitly checked this).
This PR removes all `libNNPDF` imports from python. The compilation still happens (I'm not planning to remove that in this PR), so one can still do `import NNPDF` if needed. I've tested the examples in the `validphys/examples` folder and they seem to work ok.

The main thing this PR actually does is to remove the `load` functions which in the past loaded the full dataset (fktables, cuts, commondata) just to do things like getting the number of points, because that's what C++ asked for. I've changed these for `load_commondata` functions, which seem to be enough everywhere (the only information requested by the calling function was whatever was contained in the commondata). Since the commondata is now pure python, no C++ is needed anymore.
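The refactor pattern described above can be sketched in miniature. The classes below are simplified stand-ins for the validphys objects (the real `DataSetSpec` and commondata types are much richer); only the `load_commondata().ndata` access pattern mirrors the actual diff, and the dataset names and sizes are made up:

```python
from dataclasses import dataclass

@dataclass
class CommonData:
    """Stand-in for the pure-python commondata object."""
    ndata: int  # number of experimental data points

@dataclass
class DataSetSpec:
    """Stand-in for a validphys dataset spec (hypothetical fields)."""
    name: str
    _ndata: int

    def load_commondata(self):
        # Pure-python load: no FK tables, no cuts, no C++ involved.
        # Before this PR, getting ndata required loading the full dataset.
        return CommonData(ndata=self._ndata)

datasets = [DataSetSpec("DS1", 292), DataSetSpec("DS2", 74)]

# The new pattern from the diff: ask the commondata directly for ndata
ndata_total = sum(ds.load_commondata().ndata for ds in datasets)
```

The point of the change is that counting data points (or reading any commondata-level information) no longer forces the expensive full-dataset load that the old C++ `load()`/`GetNData()` path required.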