Make the photon a downloadable resource #2393
base: master
Conversation
Hi @scarlehoff, can you please start having a look? I'm pretty sure this needs to be tweaked a bit more, but it's a good starting point. I tried using a local photon set, generated using the new script
Let's use the Milan server for this for now:
scarlehoff left a comment
I've only had a look at the download. For the parallelization I need more time to be able to give meaningful comments.
Did you already test it and see that 1) it is faster, 2) it doesn't make the memory explode, and 3) the results are correct? (God knows whether fiatlux is thread-safe.)
I think the parallelization is ok as well (provided it runs, I didn't test it). I think for now (before testing the download online) just make it so the download is a copy from some folder on your computer to the share folder. If that works, the download from the server is a technicality and will work just the same. I think this will already ease the QED fits quite a bit (e.g., all the positivity checks that are being done now could be done with QED for free thanks to this, for the price of a single computation of the photon!)
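A minimal sketch of the local-copy idea, assuming hypothetical paths (only the photons_qed share location appears elsewhere in this thread; everything else is illustrative, not the real layout):

import shutil
from pathlib import Path

# Hypothetical paths: a locally generated photon set and the share folder
# the fit reads from.
local_photon = Path("~/photons/photon_theoryID_702_fit_mytest").expanduser()
share_dir = Path("~/envs/nnpdf/share/NNPDF/photons_qed").expanduser()

def fake_download(src: Path, dest_dir: Path) -> Path:
    """'Download' by copying a local photon folder into the share directory,
    mimicking what the eventual server download would do."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    target = dest_dir / src.name
    if not target.exists():
        shutil.copytree(src, target)
    return target

print(fake_download(local_photon, share_dir))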
Please have a look at this training report (link). In the new fit, I downloaded the photon previously computed with the script. One thing that is left is whether we want to move
Force-pushed from 701d4b3 to 58d2a61
scarlehoff left a comment
Thanks for this. I've tried the light photons from the tests and it seems to do ok. I'll try with some heavier photon; I'm curious about how heavy the eko will be in that case.
Force-pushed from 1a82179 to 66b6056
Force-pushed from e55a371 to 4b2e29d
scarlehoff left a comment
I'm happy with the changes, just a few comments about the wording.
And a big comment about the docs! Could you please update the docs to include this functionality?
At the very least, explain it (and how to use it, and where in the server the photons are) on the tutorial page https://docs.nnpdf.science/tutorials/run-qed-fit.html
Crucially, it should mention:
- What will happen when you use compute_in_setupfit
- In which situations it makes sense to have it (best effort)
- That the number of cores to use is the one in the runcard (see the sketch below). FWIW, 64 cores uses about 64 GB of RAM, so perhaps this is also worth mentioning.
And any other info that you deem important.
I'm running a test 4.0-like QED fit with this branch. I don't expect to see any problems; it will probably have finished by tomorrow, so we can merge as soon as the docs are added, provided it doesn't crash somehow.
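A minimal sketch of the runcard keys being discussed; ``fiatlux``, ``luxset`` and ``compute_in_setupfit`` appear in this PR, while the name of the cores key ("maxcores") is an assumption and should be taken from the real documentation:

import yaml  # pyyaml

# Illustrative runcard fragment; the cores key name is a guess.
runcard_fragment = """
fiatlux:
  luxset: NNPDF40_nnlo_as_01180     # PDF set used as input for the photon computation
  compute_in_setupfit: true         # compute the photon during setupfit instead of downloading it
maxcores: 64                        # hypothetical key; roughly 1 GB of RAM per core (64 cores ~ 64 GB)
"""

config = yaml.safe_load(runcard_fragment)
print(config["fiatlux"]["compute_in_setupfit"], config["maxcores"])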
Force-pushed from 1e9f314 to fab60cd
I've added the documentation. Please have a look and let me know.
scarlehoff left a comment
You went too specific :'( but ok for the rest
Look, I've submitted a job to create the photon for NNPDF40_nnlo_as_01180_qed. I'll let you know when it is done.
fiatlux:
-   luxset: NNPDF40_nnlo_as_01180
+   luxset: 251127-jcm-lh-qed-001
Suggested change:
-   luxset: 251127-jcm-lh-qed-001
+   luxset: NNPDF40_nnlo_as_01180_qcd
I only added that for testing.
for precomputed photon PDF sets using the pair of ``luxset`` and ``theoryid``
parameters, first locally and then on the NNPDF server (see
:ref:`photon-resources` for details). The photon PDF set will be named
``photon_theoryID_<theoryid>_fit_<luxset>``.
Shouldn't the compute_in_setupfit flag also feature here?
Well, maybe. This part focuses only on the creation of the runcard. The compute_in_setupfit flag is needed if the user wants to compute the photon from scratch. But this is required only if the photon is not already stored in the remote. That's why I opted for the following reasoning order:
Setting up the runcard --> check and collect available resources --> Compute missing resources.
Maybe the wording is a bit misleading at this stage of the tutorial because the user is not yet aware of the possibility of either downloading/storing available resources or computing resources from scratch. Maybe something like the following, which introduces the two possibilities, sounds better:
The code uses both the ``fiatlux`` block and the ``theoryid`` specified in the
runcard to identify the photon PDF set, which will be assigned the name
``photon_theoryID_<theoryid>_fit_<luxset>``. This name will be used to either
look for existing photon PDF sets or to store newly generated ones, as explained
below.
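For concreteness, a small sketch of how that name would be composed (the theoryid and luxset values are the ones appearing in the download log quoted below; the real code may build the string elsewhere):

theoryid = 702
luxset = "251127-jcm-lh-qed-002"

# Convention described above: photon_theoryID_<theoryid>_fit_<luxset>
photon_set_name = f"photon_theoryID_{theoryid}_fit_{luxset}"
print(photon_set_name)  # photon_theoryID_702_fit_251127-jcm-lh-qed-002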
But it is a flag for the dictionary fiatlux.
[INFO]: Downloading https://data.nnpdf.science/photons/photon_theoryID_702_fit_251127-jcm-lh-qed-002.tar.
[==================================================] (100%)
[INFO]: Extracting archive to /user/envs/nnpdf/share/NNPDF/photons_qed
PosixPath('/user/envs/nnpdf/share/NNPDF/photons_qed/photon_theoryID_702_fit_251127-jcm-lh-qed-002')
If you want to be this specific, create a photon for NNPDF40_nnlo_as_01180_qed and theoryid 40000100.
In the context of a tutorial, I find this level of detail quite suitable. If I were an external user, I'd appreciate this one-to-one approach. I agree that if we want to keep this approach, we need existing sets. What do you think?
I'm happy with the level of detail. I am less happy about having to be the one creating the sets for the level of detail to apply 🤷♂️
[WARNING]: Using q2min from runcard
[WARNING]: Using w2min from runcard
Using Keras backend
[INFO]: All requirements processed and checked successfully. Executing actions.
Same comments as before.
Automatic upload to the NNPDF server through ``vp-upload`` is **not**
supported at the moment. The user should manually create a ``tar`` archive
file containing the photon replicas and upload it to the server. Refer to the
:ref:`profile file <nnprofile>` to find the remote URL where photon PDF sets are stored.
A few comments here:
- You should mention explicitly the command to run for the .tar, because it is not the same to do
tar -cf my_photon.tar my_photon_folder
or
cd my_photon_folder
tar -cf my_photon.tar *.npz
or even
cd my_photon_folder
tar -czf my_photon.tgz *.npz
- In the profile you have the URL, but you didn't add a photon_target_dir, so it is impossible to know where in the server the photon lives!
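For illustration, a Python equivalent of the first variant (archiving the whole folder), which seems consistent with the extraction log quoted above that recreates a photon_theoryID_... directory under photons_qed; whether the downloader really expects this layout is an assumption:

import tarfile
from pathlib import Path

# Hypothetical photon folder containing the replica .npz files.
photon_dir = Path("photon_theoryID_702_fit_251127-jcm-lh-qed-002")

# Archive the folder itself (not just the *.npz files inside it), so that
# extraction recreates the photon_theoryID_... directory as in the log above.
with tarfile.open(photon_dir.with_suffix(".tar"), "w") as tar:
    tar.add(photon_dir, arcname=photon_dir.name)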
Btw, why is it not supported? You already have all the infrastructure, so it should be easy.
We probably already talked about this, but you can think about me like an LLM with about 30 words of memory.
I don't remember the reason, but I'm pretty sure that at some point we agreed on leaving this completely manual because it wasn't a priority :/ But in hindsight, I think at this stage we might want to allow uploading photon resources with vp-upload, which would ease the process quite a lot.
I'm pretty sure that at some point we agreed on leaving this completely manual because it wasn't a priority
I agree with my past self, and with you: it isn't a priority, so we can do it at a later stage.
The new, more official and cooler photon is ready on the server:
edit: FWIW, while I think it would be better to have it in (and, not seen in the blue, you remove the possibility of being unlucky and having some replicas that won't converge)

This PR addresses #2214