[DEV] MPI submission example, utils.py adjusted for mpi#50

Open
garcs2 wants to merge 2 commits into IdahoLabResearch:main from garcs2:parallelization

Conversation

@garcs2 garcs2 commented Mar 19, 2026

This should now resolve #36, making MPI parallelization available for depletion. This PR is contingent on this PR, which allows more than one MPI rank to be launched within OpenMC specifically for depletion. An example of MPI usage within an LTMR example is provided in examples/watts_exec_LTMR_mpi, together with an example submission script for users to follow.
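The submission script referenced above might look roughly like the following. This is a minimal sketch, not the script from the PR: the scheduler directives, module setup, rank count, and script name are illustrative assumptions that depend on the target cluster.

```shell
#!/bin/bash
#SBATCH --job-name=ltmr_mpi    # illustrative job name
#SBATCH --nodes=1
#SBATCH --ntasks=4             # number of MPI ranks available to depletion
#SBATCH --time=02:00:00

# Environment setup is cluster-specific; shown here only as a placeholder.
# module load openmpi

# Launch the WATTS template with a single mpirun at the top level. The
# Python template must not call openmc.run(), which would try to start
# its own MPI environment inside this one.
mpirun -np 4 python watts_exec.py
```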

Importantly, because openmc.run() and integrator.integrate() differ fundamentally in how they execute, a decision has to be made about whether MPI is provided for depletion or for standalone eigenvalue calculations, but not both within the same script. For that reason openmc.run() has been commented out: it provides no utility, since depletion is run in every case, including the SD Margin Calc and the ITC. Uncommenting openmc.run() causes the job to crash, because openmc.run() spawns a subprocess that tries to initialize its own MPI environment, which clashes with the parent mpirun process in the submission script.
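The single-MPI-context constraint described above can be sketched as follows. This is an illustrative sketch, not code from the PR: the function names are hypothetical, and the use of Open MPI's `OMPI_COMM_WORLD_SIZE` environment variable to detect a parent mpirun is an assumption (other MPI launchers set different variables).

```python
import os


def depletion_step(under_mpi: bool) -> str:
    """Stand-in for integrator.integrate(); the real call inherits the
    parent mpirun's MPI environment rather than creating a new one."""
    return "depletion (MPI ranks from parent mpirun)" if under_mpi else "depletion (serial)"


def run_template() -> str:
    # Open MPI sets OMPI_COMM_WORLD_SIZE for processes launched by mpirun;
    # its presence indicates we are already inside an MPI environment.
    under_mpi = "OMPI_COMM_WORLD_SIZE" in os.environ

    # Deliberately NOT calling openmc.run() here: it spawns a subprocess
    # that initializes a second MPI environment, which would clash with
    # the parent mpirun and crash the job.
    return depletion_step(under_mpi)
```

The design choice is that exactly one layer of the stack owns MPI initialization: the submission script's mpirun at the top, consumed only by the depletion integrator below.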


Development

Successfully merging this pull request may close these issues.

Add HPC cluster support of OpenMC runs with MPI execution
