
Conversation

@ahnaf-tahmid-chowdhury

Update the build configuration to include additional directories and streamline how dependencies are included. Remove outdated checks and ensure the NEKRS_HOME environment variable is used correctly. Introduce new build options for related components such as MOAB, DAGMC, and OpenMC. Together, these changes improve the build process and dependency management.
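As a rough sketch of how the new options might be used (the `BUILD_*` names below are assumptions inferred from this description, not confirmed against the diff; adjust to whatever the PR actually defines):

```
# Hypothetical usage; BUILD_MOAB/BUILD_DAGMC/BUILD_OPENMC are assumed names.
export NEKRS_HOME=$HOME/cardinal/install   # where the NekRS build is installed (assumed path)
export BUILD_MOAB=true                     # opt in to building MOAB
export BUILD_DAGMC=true                    # opt in to building DAGMC
export BUILD_OPENMC=true                   # opt in to building OpenMC
make -j8
```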

@nuclearkevin (Contributor) left a comment


Just a few notes about changed default behaviour. April will need to add you to the neams-th-coe organization for tests to run.

@aprilnovak (Collaborator)

@ahnaf-tahmid-chowdhury, would you please re-push to this PR? This will trigger the test suite.

@ahnaf-tahmid-chowdhury (Author)

Not sure how to start the workflows.

@aprilnovak (Collaborator)

@ahnaf-tahmid-chowdhury are the recent commits trying to get around the test failure? If so, I can take a closer look to see if I can figure it out.

@ahnaf-tahmid-chowdhury (Author)

> @ahnaf-tahmid-chowdhury are the recent commits trying to get around the test failure? If so, I can take a closer look to see if I can figure it out.

Yes, but I have undone this. Still not sure why the test is failing with the same version of NekRS. Maybe tweaking the iteration number will fix it. I have merged the upstream branch; let’s see what the difference is.

@moosebuild (Collaborator) commented Oct 8, 2025

Job Coverage, step Generate coverage on ddd2a8a wanted to post the following:

Coverage

Coverage did not change

Full coverage report

This comment will be updated on new commits.

@aprilnovak (Collaborator)

Sometimes NekRS is mysterious - but it looks like the tests passed now? Is this PR ready for review, or are there more changes coming?

@ahnaf-tahmid-chowdhury (Author)

Yes, it’s ready for review. I’ll be happy to open another PR if I find any new tweaks later.

@aprilnovak (Collaborator)

Thanks @ahnaf-tahmid-chowdhury! Starting to review and test this across some different systems.

@nuclearkevin (Contributor)

Apologies for the wait - I've been slammed over the last few weeks and I've only just started clawing through my backlog. I'm testing this on a few different machines at Argonne to ensure our build instructions still function on their systems, and building with these changes on my personal laptop. Will drop a review once that's done!

@nuclearkevin (Contributor) left a comment


Only one minor comment from me. I've been able to build on Improv @ ANL (all dependencies) and my personal machine (everything other than NekRS) with no issues.

@nuclearkevin (Contributor) left a comment


On a surface level this looks good to me now - thanks for your patience over the last month!

Someone in our research group was running into issues building this branch on WindRiver @ INL HPC. I'll give it a shot over there and report back on whether it has anything to do with this PR (INL has a habit of getting rid of modules without letting people know, so it could be that).

@aprilnovak (Collaborator)

@nuclearkevin any issues on INL HPC? This looks good to me for merge if you can confirm this test was done successfully.

@aprilnovak (Collaborator)

@ahnaf-tahmid-chowdhury actually, would you please add something to the without_conda.md and with_conda.md pages to document the BUILD_... options?

> that follow.
>
> !alert! tip title=Using OpenMC from your Conda Environment
> You can install OpenMC directly into your conda environment. See the "Using OpenMC from your Conda Environment" section for more details.
Collaborator


@ahnaf-tahmid-chowdhury are you using the MOOSE conda environment with OpenMC installed into it?

@ahnaf-tahmid-chowdhury (Author)

After looking into it more deeply, I realized that the conda version of the Cardinal build isn't actually being used yet. It only pulls in HDF5 from conda, and Cardinal seems to build all the MOOSE packages by itself anyway. So there's really no need for a separate MOOSE environment with MOOSE already installed. We can still use OpenMC from the conda environment, as long as we choose a version that's compatible with Cardinal. In short, I haven't found any real benefit to having a MOOSE environment, since the current build script doesn't use any MOOSE deps from it.
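For example (a minimal sketch; which OpenMC release is compatible depends on Cardinal's OpenMC submodule, so the pin you'd choose is left open here):

```
conda install -c conda-forge openmc                    # pick a release compatible with Cardinal
python -c "import openmc; print(openmc.__version__)"  # confirm which version conda resolved
```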

Collaborator


I guess my request would be to clarify the difference between conda environments in the Cardinal documentation. The "with/without" conda is referring to MOOSE's conda environment, and then building OpenMC from source as part of that. But I think you've written the docs also assuming someone is using OpenMC's conda. I'd suggest we leave it as just MOOSE's conda environment, then perhaps have a hyperlink to another page which clarifies that you can also use OpenMC's conda environment on top of that?

The MOOSE conda environment allows us to skip building libMesh, PETSc, and wasp, which take a significant amount of time.
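As a sketch of that workflow (assuming MOOSE's standard conda environment, named `moose` by its installer, and that Cardinal's top-level `make` is the build entry point):

```
# With MOOSE's conda environment active, the build can reuse the prebuilt
# libMesh/PETSc/wasp instead of compiling them from source.
conda activate moose
cd cardinal
./scripts/get-dependencies.sh   # fetch Cardinal's submodules
make -j8
```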

@nuclearkevin (Contributor) left a comment


Everything works as expected on INL HPC (WindRiver/Bitterroot). The 67 Pebble CHT case runs with no issues, and the pincell multiphysics problem also runs fine. Tested with 2 nodes (224 MPI ranks).

```
conda activate moose
conda activate cardinal
conda install -c conda-forge openmc # Install OpenMC from conda-forge
```
Collaborator


Can you please update this documentation so that it does not rely on the OpenMC conda environment? Let's stick with the original MOOSE conda environment, since it includes libMesh & wasp & petsc and simultaneously has all the OpenMC dependencies already.

```
./scripts/get-dependencies.sh
export ENABLE_NEK=false
export HDF5_ROOT=$CONDA_PREFIX
export OPENMC_DIR=$CONDA_PREFIX
```
Collaborator


Suggest moving this to a "tip" box, in case anyone wants to use OpenMC's conda and MOOSE's conda simultaneously.
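A sketch of what that tip box might look like, assuming MooseDocs' multi-line `!alert!` ... `!alert-end!` syntax (the title and wording here are illustrative, not the final docs):

```
!alert! tip title=Using OpenMC from a Conda Environment
If you also want OpenMC from your conda environment, point the build at the conda prefix:

    export HDF5_ROOT=$CONDA_PREFIX
    export OPENMC_DIR=$CONDA_PREFIX
!alert-end!
```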

> !alert tip
>
> !include access.md
>
> To get MOOSE's conda environment, follow the instructions [here](https://mooseframework.inl.gov/getting_started/installation/index.html).
> To get started, you'll need to create a dedicated conda environment for Cardinal. This ensures that all the necessary dependencies are isolated and managed correctly.
Collaborator


I'm not a conda expert, but I don't think this is truly necessary? I'd prefer to avoid any customized instructions beyond just what MOOSE requires (but please educate me if there is an advantage to doing this).

@ahnaf-tahmid-chowdhury (Author)

Sorry for the delay. I’ve been busy. I should be able to start working on this next week.

@ahnaf-tahmid-chowdhury (Author)

I have undone my changes to with_conda.md. I may create a separate PR once I have successfully built Conda MOOSE with Conda OpenMC.
