Hello all, I'm pretty new to OpenMC and am trying to run a depletion calculation on an HPC cluster with MPI, and I'm running into some issues. OpenMC and MPI are both installed in a Docker image that I launch through Apptainer on the cluster. The problem may actually be with h5py, in which case sorry if this is the wrong place to ask.
I am trying to run a simple scenario in which I use add_transfer_rate to move some material from one cylinder to another. When I run it without MPI (python FILENAME.py), it runs without issue. When I run it with the command given in the documentation (mpirun -n 4 python FILENAME.py), I get a variety of errors (the same happens with both mpirun and mpiexec). With each attempt the errors show up at different places, and sometimes they are different errors entirely. I have checked that h5py is installed with parallel capabilities (attached image), so it shouldn't be that. However, the error messages do look like it is having trouble operating in parallel.
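For reference, this is roughly the check I used to confirm the parallel build (a sketch, assuming mpi4py is also installed in the image):

from mpi4py import MPI
import h5py

# True only if h5py was built against a parallel (MPI-enabled) HDF5
print('h5py MPI support:', h5py.get_config().mpi)
print('MPI ranks visible:', MPI.COMM_WORLD.Get_size())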
Here are two sample error messages, along with my code.
Traceback (most recent call last):
  File "flowtest.py", line 61, in <module>
    integrator.integrate()
  File "/usr/local/lib/python3.8/dist-packages/openmc/deplete/abc.py", line 838, in integrate
    StepResult.save(self.operator, n_list, res_list, [t, t + dt],
  File "/usr/local/lib/python3.8/dist-packages/openmc/deplete/stepresult.py", line 570, in save
    results.export_to_hdf5(path, step_ind)
  File "/usr/local/lib/python3.8/dist-packages/openmc/deplete/stepresult.py", line 269, in export_to_hdf5
    with h5py.File(filename, **kwargs) as handle:
  File "/usr/local/lib/python3.8/dist-packages/h5py/_hl/files.py", line 562, in __init__
    fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
  File "/usr/local/lib/python3.8/dist-packages/h5py/_hl/files.py", line 241, in make_fid
    fid = h5f.create(name, h5f.ACC_TRUNC, fapl=fapl, fcpl=fcpl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5f.pyx", line 122, in h5py.h5f.create
BlockingIOError: [Errno 11] Unable to synchronously create file (unable to lock file, errno = 11, error message = 'Resource temporarily unavailable')
I have tried setting the HDF5_USE_FILE_LOCKING environment variable to FALSE (how I set it is shown below, after the tracebacks), and it simply returned a different error:
Traceback (most recent call last):
  File "flowtest.py", line 61, in <module>
    integrator.integrate()
  File "/usr/local/lib/python3.8/dist-packages/openmc/deplete/abc.py", line 838, in integrate
    StepResult.save(self.operator, n_list, res_list, [t, t + dt],
  File "/usr/local/lib/python3.8/dist-packages/openmc/deplete/stepresult.py", line 570, in save
    results.export_to_hdf5(path, step_ind)
  File "/usr/local/lib/python3.8/dist-packages/openmc/deplete/stepresult.py", line 271, in export_to_hdf5
    res._to_hdf5(handle, step, parallel=False)
  File "/usr/local/lib/python3.8/dist-packages/openmc/deplete/stepresult.py", line 368, in _to_hdf5
    self._write_hdf5_metadata(handle)
  File "/usr/local/lib/python3.8/dist-packages/openmc/deplete/stepresult.py", line 321, in _write_hdf5_metadata
    rxn_group = handle.create_group("reactions")
  File "/usr/local/lib/python3.8/dist-packages/h5py/_hl/group.py", line 64, in create_group
    gid = h5g.create(self.id, name, lcpl=lcpl, gcpl=gcpl)
  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py/h5g.pyx", line 166, in h5py.h5g.create
ValueError: Unable to synchronously create group (bad local heap signature)
Has anyone encountered similar errors?
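For completeness, this is roughly how I set the locking variable, at the very top of the script so that every MPI rank picks it up before h5py opens any file (a sketch; exporting it in the job environment before mpirun should be equivalent):

import os
# disable HDF5 file locking before h5py / OpenMC opens any HDF5 file
os.environ['HDF5_USE_FILE_LOCKING'] = 'FALSE'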
import openmc
import numpy as np
import openmc.deplete
import h5py
def ZEquals(z, boundary_type):
    return openmc.ZPlane(z0=z, boundary_type=boundary_type)
u238 = openmc.Material()
u238_dict = {'U238': 1.0}
u238.add_components(u238_dict)
u238.set_density('g/cm3', 19)
u238.temperature = 873
u238.depletable = True
offgas = openmc.Material()
offgas_dict = {'He': 1.0}
offgas.add_components(offgas_dict)
offgas.set_density('g/cm3', 0.001)
offgas.temperature = 293
offgas.depletable = True
mats = openmc.Materials([u238,offgas])
mats.export_to_xml()
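# Geometry: two separate cylindrical regions (the U-238 fuel and the helium off-gas volume), all outer boundaries vacuum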
setup = openmc.Universe()
cyl1 = -openmc.Cylinder(r=50, x0=-50, boundary_type='vacuum') & +ZEquals(-1000, boundary_type='vacuum') & -ZEquals(1000, boundary_type='vacuum')
cyl2 = -openmc.Cylinder(r=30, x0=50, boundary_type='vacuum') & +ZEquals(0, boundary_type='vacuum') & -ZEquals(5, boundary_type='vacuum')
cell1 = openmc.Cell(fill=u238, region=cyl1)
cell2 = openmc.Cell(fill=offgas, region=cyl2)
setup.add_cells([cell1, cell2])
geometry = openmc.Geometry(root=setup)
geometry.export_to_xml()
settings = openmc.Settings()
settings.run_mode = 'eigenvalue'
settings.temperature = {'method': 'interpolation', 'range': (293.15, 923.15)}  # unsure
settings.batches = 25
settings.inactive = 10
settings.particles = 100000
settings.generations_per_batch = 10
settings.photon_transport = False
settings.export_to_xml()
model = openmc.model.Model(geometry, mats, settings)
# Depletion general settings
depletion_days = [100, 100, 100]
power = 0  # total thermal power [W]
u238.volume = (50**2) * 2000 * np.pi  # cm3, geometric volume of the fuel cylinder (r=50, height 2000)
u238_mass = u238.volume * u238.density  # grams
offgas.volume = (30**2) * np.pi * 5  # cm3, geometric volume of the off-gas cylinder (r=30, height 5)
op = openmc.deplete.CoupledOperator(model, normalization_mode="energy-deposition", chain_file='chain_endfb80_pwr.xml')  # MAY NEED MSR CHAIN
integrator = openmc.deplete.CECMIntegrator(op, depletion_days, timestep_units='d', power=power)
integrator.add_transfer_rate(u238, components=['U'], transfer_rate=5e-3, destination_material=offgas, transfer_rate_units='1/d')  # default transfer rate units is 1/s
integrator.integrate()
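In case it helps narrow things down, this is a minimal standalone test I can run under mpirun to see whether parallel HDF5 writes work at all on this filesystem, independent of OpenMC (a sketch, assuming mpi4py is available and h5py exposes the 'mpio' driver):

# every rank writes its own entry into one shared file
from mpi4py import MPI
import h5py

comm = MPI.COMM_WORLD
with h5py.File('parallel_test.h5', 'w', driver='mpio', comm=comm) as f:
    dset = f.create_dataset('rank_ids', (comm.size,), dtype='i')
    dset[comm.rank] = comm.rank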