
Conversation


@reinago commented Nov 23, 2021

Relates to the chat we had this morning, @FG-TUM.

Description

Added some scripts that allow building ls1 into a Singularity container. Singularity should allow the use of low-level MPI regardless of the containerization. The idea is to improve both testing and deployment for ls1. This needs to be extended to optionally build MegaMol alongside ls1, for the same reasons.

Usage:

  • Install Singularity as described in the manual: https://sylabs.io/guides/3.9/user-guide/quick_start.html
  • In tools/singularity:
    • Build a base system: `sudo singularity build base.sif base-system.def`. Currently, sudo seems to be required because the fakeroot approach produced a cpio issue when installing the filesystem package inside the image.
    • Build ls1 on top of that system: `singularity build --fakeroot ls1-megamol.sif ls1-megamol.def`. Note: this step works in user space. (A rough sketch of such a definition file follows after this list.)
  • This gives you a working /tmp/ls1-mardyn/build/src/MarDyn inside the image tools/singularity/ls1-megamol.sif.
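For orientation, here is a minimal sketch of what a definition file along the lines of ls1-megamol.def could look like when building on top of the prepared base image. The repository URL, CMake flag, and build paths are assumptions for illustration only; the actual definition files in tools/singularity are authoritative.

```
Bootstrap: localimage
From: base.sif

%post
    # Fetch and build ls1-mardyn inside the image (URL and flags assumed).
    git clone https://github.com/ls1mardyn/ls1-mardyn.git /tmp/ls1-mardyn
    mkdir -p /tmp/ls1-mardyn/build
    cd /tmp/ls1-mardyn/build
    cmake -DENABLE_MPI=ON ..
    make -j"$(nproc)"

%runscript
    # Forward all arguments to the executable built above.
    exec /tmp/ls1-mardyn/build/src/MarDyn "$@"
```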

Using the mpirun from the outside host, I was, for example, able to compute the standard example from the outside sources using the executable inside the image:

```
mpirun -np 2 singularity exec ~/ls1-mardyn/tools/singularity/ls1-megamol.sif /tmp/ls1-mardyn/build/src/MarDyn config.xml --steps 10
```
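This corresponds to the hybrid model from the Sylabs MPI documentation (host mpirun combined with an MPI installation inside the container). The same documentation also describes a bind model, where the host MPI installation is mounted into the container at runtime; a rough sketch, with a host MPI prefix that is purely illustrative:

```
# Bind-model sketch: /opt/mpich is an assumed host path and must match
# wherever MPICH is actually installed on the host.
mpirun -np 2 singularity exec \
    --bind /opt/mpich:/opt/mpich \
    --env LD_LIBRARY_PATH=/opt/mpich/lib \
    ~/ls1-mardyn/tools/singularity/ls1-megamol.sif \
    /tmp/ls1-mardyn/build/src/MarDyn config.xml --steps 10
```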

TODOs:

  • Add the OSPRay superbuild.
  • Add MegaMol.
  • The scripts need to be improved to allow mirroring the outside MPI configuration so that pass-through hardware access works properly, if I understood it correctly: https://sylabs.io/guides/3.5/user-guide/mpi.html It is unclear to me how closely the two have to match, though; I had MPICH 3.3.2-2build1 on WSL2/Ubuntu 20.04.3 LTS and MPICH 3.4.1-1.el inside the image (CentOS 8.5.2111).
  • Maybe a multi-stage build is better than two separate definitions? (See the sketch after this list.)
  • Can we solve the cpio issue?
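For reference, a rough sketch of how the two definition files could be collapsed into a single multi-stage build (supported by Singularity since version 3.2). The base image, package names, and paths are assumptions and do not reflect the actual contents of base-system.def or ls1-megamol.def:

```
Bootstrap: docker
From: centos:8
Stage: build

%post
    # Toolchain and MPI for building ls1-mardyn (package names are assumed).
    dnf install -y gcc-c++ cmake make git mpich mpich-devel
    git clone https://github.com/ls1mardyn/ls1-mardyn.git /tmp/ls1-mardyn
    mkdir -p /tmp/ls1-mardyn/build && cd /tmp/ls1-mardyn/build
    cmake -DENABLE_MPI=ON .. && make -j"$(nproc)"

Bootstrap: docker
From: centos:8
Stage: final

%files from build
    /tmp/ls1-mardyn/build/src/MarDyn /usr/local/bin/MarDyn

%post
    # Runtime dependencies only (again assumed).
    dnf install -y mpich

%runscript
    exec /usr/local/bin/MarDyn "$@"
```

Whether the final image still needs the full build tree (e.g., for in-place testing) would determine how aggressively the final stage can be slimmed down.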

Related Pull Requests

  • N/A

Resolved Issues

  • N/A

How Has This Been Tested?

So far, only the basic EOX example has been run, and that looked okay in MegaMol.


@FG-TUM (Member) commented May 24, 2023

@reinago Any intention of picking this up at some point? Otherwise, we can just archive it...
