
Run neutrino filter from XML files. #292

Open
schuetha wants to merge 5 commits into SND-LHC:master from schuetha:run_xml

Conversation

@schuetha

Add scripts that can read the XML files and run the neutrino event selection cut based on the event list in the XML file.
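For orientation, a minimal sketch of the reading step, assuming the /runlist/runs/run/path layout used by the scripts below; the runlist.xml filename is just a placeholder:

#!/bin/bash
# loop over all runs listed in the XML file and print their input paths
xml="runlist.xml"
n_runs=$(xmllint --xpath "count(/runlist/runs/run)" "${xml}")
for (( i = 1; i <= n_runs; i++ )); do
    path=$(xmllint --xpath "string(/runlist/runs/run[$i]/path)" "${xml}")
    echo "Selected run path: ${path}"
done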

# if SNDSW_ROOT is not set, set up the SNDSW environment
if [ -z ${SNDSW_ROOT+x} ]
then
echo "Setting up SNDSW"
export ALIBUILD_WORK_DIR=/afs/cern.ch/user/s/schuetha/work/public/data_work_flow/sw

@siilieva Oct 29, 2025


no hard-coding of specific user dirs please

@schuetha (Author)


@siilieva What I can think of is letting the user pass their ALIBUILD_WORK_DIR into the shell script. Do you have any suggestions?
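One minimal sketch of that option, assuming the script takes the directory as its first argument (the usage message is illustrative):

# use an already-exported ALIBUILD_WORK_DIR if present, otherwise require it as an argument
if [ -z "${ALIBUILD_WORK_DIR+x}" ]; then
    if [ -z "$1" ]; then
        echo "Usage: $0 <ALIBUILD_WORK_DIR>" >&2
        exit 1
    fi
    export ALIBUILD_WORK_DIR="$1"
fi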


The scripts @eduard322 prepared for running MC productions have a separate config.sh file where this sort of user-specific environment is set up. Maybe we could use a similar strategy here? @schuetha, you could commit an example config.sh.example script with your own hard-coded paths and comments instructing what to update.
See @eduard322's script here:
https://github.com/eduard322/snd_dag/blob/main/config.sh
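A minimal sketch of what such a config.sh.example could contain; the paths and the OUTPUT_DIR variable are placeholders for each user to edit:

#!/bin/bash
# config.sh.example: copy to config.sh and adapt the paths to your setup
# alibuild work directory of your SNDSW installation (edit me)
export ALIBUILD_WORK_DIR=/path/to/your/sw
# where the job output should be written (edit me)
export OUTPUT_DIR=/path/to/your/output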


Hi, since you are discussing a unified way of launching HTCondor jobs here, we could prepare a unified repo (or folder) for the most frequently simulated event types, with scripts that can be used almost out of the box. What do you think? I have now updated the above-mentioned repo: it has instructions, an argparser for choosing the simulation parameters, and it can automatically generate a README file in the simulation folder with the parameters that were used for that dataset. It is based on an HTCondor DAG; not the perfect solution, but it now works more or less stably.


I think this is a good idea. I started preparing some scripts for running the event selection code based on your DAG scripts.
I'm doing this in a fork of your repo. Still work in progress: https://github.com/cvilelahep/snd_dag/blob/main/run_submit_data_proc.py


Would you agree it is best to have that unified repo under the SND-LHC organization? A thumbs up is enough.
I'd propose adding the MC coordinators as admins of it, so they can approve PRs and handle it fully.
We only need a general structure to control its growth a bit.
For example, we could start with subdirs for different job cases, e.g. simulation, analysis, electronic detector data management (e.g. track reconstruction, data reprocessing) and emulsion data management (alignment, reconstruction (tracks, vertices), etc.).
We also need to make sure we adopt the info from nusim_automation; I believe it was the basis for the other cases developed by individuals.
We can discuss the setup at the SoftPhys meeting tomorrow. I will add a separate slot.
Perhaps you realized, reading this far, that I am on board with this idea :D


I added significant changes to the pipeline: https://github.com/eduard322/snd_dag. The simulation parameters can now be set via flags or via a YAML file in the simulation_configs folder. I also added a script that automatically creates the GDML file via shipLHC/makeGeoFile.py. The README now describes the whole simulation process and how to launch it. To launch any kind of simulation, the user just needs to specify the sw folder and the output folder. Feedback and suggestions are welcome!


@siilieva left a comment


@schuetha, this PR should go after the one producing the XML, #284.
Also, I will squash the commits, since files were added and removed a few times.
I give my other comments inline.

# extract the i-th run's input path from the XML run list
path=$(xmllint --xpath "string(/runlist/runs/run[$i]/path)" "${xml}")

# choose the geometry file according to the data-taking year
if [[ "$year" == "2022" || "$year" == "2023" ]]; then
geo_file="/eos/experiment/sndlhc/legacy_geofiles/2023/geofile_sndlhc_TI18_V4_2023.root"


Why read the legacy geo files and not the updated ones?


I ran some checks on this. The legacy geo file gives slightly different DS track reconstruction results compared to using the geo files in the physics directories.

In particular, event number 23622206 in run 5159 is selected with the legacy geofile but it fails the cut requiring that the DS track intercepts the z=300 cm plane within 5 cm of the SciFi edges if the physics directory geofile is used for the track reconstruction.

@siilieva, do we expect legacy geofiles to result in different DS reconstruction compared to the ones in the physics dir?

@cvilelahep Nov 1, 2025


As a more general solution to this problem, maybe we could use the geometry file path getter in tools?
https://github.com/SND-LHC/sndsw/blob/master/analysis/tools/sndGeometryGetter.h#L12C21-L12C31



Short answer: yes, if using /eos/experiment/sndlhc/legacy_geofiles/2023/geofile_sndlhc_TI18_V4_2023.root and not /eos/experiment/sndlhc/legacy_geofiles/2023/geofile_sndlhc_TI18_V3_2023.root.
The legacy file geofile_sndlhc_TI18_V4_2023.root was made in Feb 2024 for the 2023 ion run, for which we had issues with low tracking efficiency. The MuFilter spatial alignment was redone for that target run specifically as a measure to solve it (Ivo's report, 13 Feb 2024). In the end, legacy 2023 V4 has only small differences wrt legacy 2023 V3. Small means below the DS resolution of 0.33 cm, but that can be huge for an extrapolation to z=300 cm. We ended up separating the geos for 2022 and 2023 to have this fully sorted.
As reported on slide 15 of the June 2025 CM, the new geofiles, the ones stored in /physics, showed good behaviour when redoing all detector alignments over the 2022-2023 runs.
Also, trying run 5159, partition 23, event 622 206 in the 2dED using the latest 2022 geo (left plot: /eos/experiment/sndlhc/convertedData/physics/2022/geofile_sndlhc_TI18_V4_2022.root) and the legacy geo (right plot: /eos/experiment/sndlhc/legacy_geofiles/2023/geofile_sndlhc_TI18_V4_2023.root), it seems the track passes the 5 cm distance criterion at z=300 cm with very little margin, so it is surely a close miss when using the proper track extrapolation rather than the 2dED fast solution.
[Screenshot: 2dED event display of run 5159, event 622 206, latest 2022 geo (left) vs legacy 2023 geo (right)]

# if SNDSW_ROOT is not set, set up the SNDSW environment
if [ -z ${SNDSW_ROOT+x} ]
then
echo "Setting up SNDSW"
export ALIBUILD_WORK_DIR=/afs/cern.ch/user/s/schuetha/work/public/data_work_flow/sw


no private dirs please

@cvilelahep

If I understand correctly, this script processes all the runs in the XML file, is that correct, @schuetha?

In practice, we'll need to be able to run in parallel on HTCondor, with groups of runs split between different processes. I have some Python scripts to do this (but not using the XML). I can try adding them to this branch, and then we can try adapting them to the XML format.
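For illustration, a minimal sketch of how the splitting could look while keeping the XML as input, assuming the /runlist/runs/run layout quoted above; the chunk size, the job index taken from HTCondor's $(Process), and the run_filter.sh worker are all hypothetical:

#!/bin/bash
# split_runs.sh <runlist.xml> <job_index>: process one chunk of runs per HTCondor job
xml=$1
job=$2           # 0-based job index, e.g. passed from HTCondor's $(Process)
n_per_job=10     # hypothetical chunk size

# total number of runs in the XML run list
n_runs=$(xmllint --xpath "count(/runlist/runs/run)" "${xml}")
first=$(( job * n_per_job + 1 ))
last=$(( first + n_per_job - 1 ))
(( last > n_runs )) && last=${n_runs}

for (( i = first; i <= last; i++ )); do
    path=$(xmllint --xpath "string(/runlist/runs/run[$i]/path)" "${xml}")
    ./run_filter.sh "${path}"   # hypothetical worker running the selection on one run
done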
