Change ThPredictions to python predictions #1430
Conversation
Overall looks fine, but I would like to go over this slowly when I have time. As an aside, we should change how the providers in results.py work at some point: there should be separate actions for getting data and theory predictions (where theory predictions don't need e.g. a covmat), and these should be combined in the …
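A rough sketch of the split this comment suggests, assuming hypothetical provider names (these are illustrative, not the actual validphys API):

```python
import numpy as np

# Illustrative sketch only: data and theory become independent providers,
# so theory predictions no longer require a covariance matrix.
def data_result(central_values, covmat):
    """Experimental data: central values plus their covariance matrix."""
    return {"cv": np.asarray(central_values), "covmat": np.asarray(covmat)}

def theory_result(preds):
    """Theory predictions: a per-member table, no covmat needed."""
    return {"preds": np.asarray(preds)}

def combined_result(data, theory):
    """A separate action would then combine the two independent pieces."""
    return {**data, **theory}
```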
siranipour left a comment:
Looks good. I'm yet to play with it, so will get round to this soon. I'm a bit confused at the need for taking central_value as an explicit argument though
No rush (no infinite tranquility either, but no rush). For the …
Regarding doing the central predictions and the envelope separately: this can easily be implemented, but it needs to be done in #1433 since I can add a flag to …
(force-pushed 4e8c31f to 0bad6c1)
Sorry if I'm being pushy, but I would like to work on this while I still remember what I was doing.
Zaharid left a comment:
I think we need to look into how hessian sets are supposed to work here.
```python
all_preds = []  # initialization shown for completeness; the diff starts below
all_centrals = []
for d in datasets:
    all_preds.append(predictions(d, pdf))
    all_centrals.append(central_predictions(d, pdf))
```
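For context, an illustration (not the validphys implementation) of how the two quantities collected in that loop relate for a Monte Carlo PDF, where the central prediction is conventionally the replica mean; the numbers are made up:

```python
import numpy as np

# Toy replica predictions: 5 data points x 100 MC replicas (made-up data).
rng = np.random.default_rng(0)
member_preds = rng.normal(loc=1.0, scale=0.1, size=(5, 100))

# For a Monte Carlo set the central prediction is the replica mean;
# for a hessian set it would instead be the central member, which is part
# of why hessian sets need a separate look here.
central = member_preds.mean(axis=1)
```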
Just a note (mostly to myself) that we are supposed to be rid of this in some subsequent PR.
All I can do is promise that I will work on this as soon as the dust settles on the rest, so that it doesn't get forgotten.
Maybe we should have hessian regression tests as well...
Mm, by using stats_class the MC PDFs are broken instead. Edit: because libNNPDF uses the "pandas default" for …
(force-pushed 003488d to 48d528b)
In the last commit I've had to update the regression test, but it passes for …
FWIW this is still using the cpp commondata, e.g. for …
I would want to make sure that nothing catastrophic happens with the vp report, but other than that it looks fine (given the various discussions on future directions).
@scarlehoff the problem looks real. The relevant part being:

```
  File "/usr/share/miniconda/envs/nnpdfenv/lib/python3.9/site-packages/validphys/results.py", line 549, in abs_chi2_data
    chi2s = all_chi2(results)
  File "/usr/share/miniconda/envs/nnpdfenv/lib/python3.9/site-packages/validphys/calcutils.py", line 73, in all_chi2
    diffs = th_result._rawdata - data_result.central_value[:,np.newaxis]
IndexError: too many indices for array: array is 0-dimensional, but 1 were indexed
```
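The failure can be reproduced in isolation: indexing a 0-dimensional numpy array with `[:, np.newaxis]` raises exactly this error, which suggests `central_value` arrived as a scalar array rather than a 1-d one. The fix shown is an assumption for illustration, not necessarily what the PR does:

```python
import numpy as np

# Minimal reproduction: a 0-d (scalar) array cannot take a ':' index.
central_value = np.array(2.5)  # 0-dimensional

try:
    central_value[:, np.newaxis]
except IndexError as e:
    print(e)  # too many indices for array: array is 0-dimensional, but 1 were indexed

# One possible fix (hypothetical): promote to at least 1-d before
# adding the broadcast axis.
fixed = np.atleast_1d(central_value)[:, np.newaxis]
```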
Zaharid left a comment:
This seems to fail.
@scarlehoff this error https://github.com/NNPDF/nnpdf/runs/4324437054?check_suite_focus=true seems genuine (and it is not obvious what it is from a quick look).
It is the random NaN problem that I am unable to reproduce. I have a computer with Ubuntu looping on the test, but it never fails... edit: sometimes I miss Travis' debug capabilities :P
Greetings from your nice fit 🤖 !
Check the report carefully, and please buy me a ☕ , or better, a GPU 😉!
Co-authored-by: Zaharid <zk261@cam.ac.uk>
(force-pushed 35d845b to ab2242d)
Rebased. After a tortuous journey, this should all pass and work fine now.
Greetings from your nice fit 🤖 !
Check the report carefully, and please buy me a ☕ , or better, a GPU 😉!
Zaharid left a comment:
I have run this with a few options, and also looked at it many times by now. Seems fine to me.
Closes #1347. In practice (for the user) nothing changes afaik: one can continue using ThPredictionResult.from_convolution, but now it will use the python convolutions module instead. Everything works just the same; only a few regression tests change, and those changes are in the third digit (the annoying part is that the picture in test_plot also changes by a few pixels, which is enough for the test to fail, so I had to redo that as well).
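As an illustration of why third-digit differences break exact regression comparisons while a relative tolerance would absorb them (the values below are made up, not the actual regression data):

```python
import numpy as np

# Made-up stand-ins for stored regression data and new python-convolution
# results that differ only around the third significant digit.
stored = np.array([1.2345, 2.3456])
new = np.array([1.2348, 2.3451])

assert not np.array_equal(new, stored)              # exact comparison fails
np.testing.assert_allclose(new, stored, rtol=1e-3)  # tolerant comparison passes
```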
I added a few checks to convolutions.py, but on second thought I don't know whether they are very useful, or useful at all.
Note: I'm not touching https://github.com/NNPDF/nnpdf/blob/master/validphys2/src/validphys/chi2grids.py since that module gets completely changed in #1424, and these two PRs don't overlap in any other action.