Conversation
|
Hi @pp-mo so, there are some epically large tolerances in here, required to get the tests passing. It is worth comparing some results between an env with numpy 1.11 (amongst others) and a travis-like env with numpy 1.10 (amongst others): those are numerically pretty different, but our current tests say those differences are all fine (see liberal use of ...). I don't understand what in my dependency tree causes these calculation differences, but my PR hasn't changed the behaviour at all, merely altered the testing strategy. |
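To illustrate the point about large tolerances masking real numerical differences, here is a minimal sketch. The arrays are hypothetical stand-ins, not the actual outputs from this PR: the idea is just that a tight relative tolerance flags the environment-to-environment discrepancy, while a loose one declares it "all fine".

```python
import numpy as np

# Hypothetical results from two environments (illustrative values only).
result_env_a = np.array([1.000000, 2.500000, 3.141593])  # e.g. numpy 1.10 env
result_env_b = np.array([1.000004, 2.500010, 3.141605])  # e.g. numpy 1.11 env

# A tight tolerance catches the discrepancy...
try:
    np.testing.assert_allclose(result_env_a, result_env_b, rtol=1e-7)
except AssertionError:
    print("arrays differ at rtol=1e-7")

# ...while an epically large tolerance lets the same difference pass.
np.testing.assert_allclose(result_env_a, result_env_b, rtol=1e-3)
print("arrays 'agree' at rtol=1e-3")
```

The danger, as noted above, is that tests tuned this loosely will also accept genuinely wrong answers.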
I've been looking into this and it is definitely a numpy bug, and a tricksy one at that. The output in numpy 1.11 is ..., and in numpy 1.10 ...; both have the first value (which is the correct one). |
I'm chasing this up now. |
Not a bug as such, just a different calculation method in some cases, which gives different results -- though also definitely 'less correct' ones. |
Replace the cmlApproxData pattern of comparing against full arrays stored in the source tree (with massive file sizes) with a pattern of comparing against array statistics instead.
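A minimal sketch of the statistics-based approach described above. The helper names (`array_stats`, `assert_stats_match`) and the chosen statistics are assumptions for illustration; the actual PR may summarise arrays differently. The point is that a test stores a handful of numbers rather than a massive reference array.

```python
import numpy as np


def array_stats(arr):
    """Summarise an array as a few statistics (hypothetical helper)."""
    return {
        "shape": tuple(arr.shape),
        "mean": float(np.mean(arr)),
        "std": float(np.std(arr)),
        "min": float(np.min(arr)),
        "max": float(np.max(arr)),
    }


def assert_stats_match(arr, expected, rtol=1e-6):
    """Compare an array's statistics against stored expected values."""
    stats = array_stats(arr)
    assert stats["shape"] == expected["shape"], stats["shape"]
    for key in ("mean", "std", "min", "max"):
        np.testing.assert_allclose(stats[key], expected[key], rtol=rtol)


# Instead of a large reference file, the test source holds a few numbers.
data = np.arange(100.0)
expected = {
    "shape": (100,),
    "mean": 49.5,
    # population variance of 0..n-1 is (n**2 - 1) / 12
    "std": ((100**2 - 1) / 12) ** 0.5,
    "min": 0.0,
    "max": 99.0,
}
assert_stats_match(data, expected)
print("stats match")
```

One trade-off worth noting: statistics can collide (two different arrays can share a mean and std), so tolerances and the choice of statistics still matter, but the repository no longer carries huge data files.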