Dear Paul,
Further to our discussion here, I am currently working on converting the damage-energy score in [eV/source] to [displacements/sec]. Firstly, I am not very clear on how this tallied score in [eV/source] can be converted to [eV/sec]. As you suggested here, I will need to find the normalization factor f in order to obtain the energy quantity in [eV/sec]. Given that the damage energy tallied from the score is D [eV/source], forming D/f should give the right unit [eV/sec]. To do that, we must calculate the equivalent quantities D' and P. I am eager to hear from you how we can calculate D' from D and the energy deposited per unit time P. The subsequent conversion to DPA should then be straightforward.
Kind regards,
Arun
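For concreteness, the normalization being asked about can be sketched as follows. This is a minimal sketch under stated assumptions: the fusion power (500 MW) and the D-T neutron energy yield are hypothetical placeholder values, and here f is taken to be the source rate in neutrons per second, so the per-source tally is scaled by f to obtain a per-second quantity.

```python
# Sketch of the per-source -> per-second normalization (hypothetical numbers).
# A fixed-source tally is reported per source particle; scaling by the
# source rate f [neutrons/s] converts it to a per-second quantity.

fusion_power_w = 500e6          # assumed fusion power [W] (hypothetical)
energy_per_fusion_ev = 17.6e6   # energy per D-T fusion reaction [eV]
ev_to_j = 1.602176634e-19       # conversion factor [J/eV]

# Neutron source rate f [source particles per second]
f = fusion_power_w / (energy_per_fusion_ev * ev_to_j)

D = 741347.448948148            # damage-energy tally [eV/source], example value
damage_energy_per_s = D * f     # [eV/s]
```

Note the direction of the scaling depends on how f is defined: if f is the source rate [source/s], the tally is multiplied by f; if f is its reciprocal [s/source], the tally is divided by it.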
Not sure if I've understood correctly, but here is a DPA example I've made in the past. It might be helpful.
Dear Patrick,
Thanks for the example. Will look into that.
Kind regards,
Arun
Dear Patrick,
I tried running your code and received the following error. I guess the OpenMC execution throws this error while writing the 'summary.h5' file. The first execution of OpenMC proceeded without any errors. Perhaps you might have some suggestions.
/home/ir-bala2/.local/lib/python3.7/site-packages/IPython/core/display.py:724: UserWarning: Consider using IPython.display.IFrame instead
warnings.warn("Consider using IPython.display.IFrame instead")
--------------------------------------------------------------------------
By default, for Open MPI 4.0 and later, infiniband ports on a device
are not used by default. The intent is to use UCX for these devices.
You can override this policy by setting the btl_openib_allow_ib MCA parameter
to true.
Local host: login-e-12
Local adapter: hfi1_0
Local port: 1
--------------------------------------------------------------------------
--------------------------------------------------------------------------
WARNING: There was an error initializing an OpenFabrics device.
Local host: login-e-12
Local device: hfi1_0
--------------------------------------------------------------------------
| The OpenMC Monte Carlo Code
Copyright | 2011-2022 MIT, UChicago Argonne LLC, and contributors
License | https://docs.openmc.org/en/latest/license.html
Version | 0.14.0-dev
Git SHA1 | e535d17ae092182ca373e9136e4036bb308901c5
Date/Time | 2022-10-07 19:17:15
MPI Processes | 1
OpenMP Threads | 1
Reading settings XML file...
Reading cross sections XML file...
Reading materials XML file...
Reading geometry XML file...
Reading Fe54 from
/home/ir-bala2/rds/rds-ukaea-ap001/arunpb/endfb71_hdf5/Fe54.h5
Reading Fe56 from
/home/ir-bala2/rds/rds-ukaea-ap001/arunpb/endfb71_hdf5/Fe56.h5
Reading Fe57 from
/home/ir-bala2/rds/rds-ukaea-ap001/arunpb/endfb71_hdf5/Fe57.h5
Reading Fe58 from
/home/ir-bala2/rds/rds-ukaea-ap001/arunpb/endfb71_hdf5/Fe58.h5
Minimum neutron data temperature: 294 K
Maximum neutron data temperature: 294 K
Reading tallies XML file...
Preparing distributed cell instances...
Reading plot XML file...
Writing summary.h5 file...
Maximum neutron transport energy: 20000000 eV for Fe58
===============> FIXED SOURCE TRANSPORT SIMULATION <===============
Simulating batch 1
Simulating batch 2
Simulating batch 3
Simulating batch 4
Simulating batch 5
Simulating batch 6
Simulating batch 7
Simulating batch 8
Simulating batch 9
Simulating batch 10
Creating state point statepoint.10.h5...
=======================> TIMING STATISTICS <=======================
Total time for initialization = 7.7453e-01 seconds
Reading cross sections = 4.5877e-01 seconds
Total time in simulation = 2.6028e+01 seconds
Time in transport only = 2.6022e+01 seconds
Time in active batches = 2.6028e+01 seconds
Time accumulating tallies = 6.5967e-05 seconds
Time writing statepoints = 5.3552e-03 seconds
Total time for finalization = 1.3536e-03 seconds
Total time elapsed = 2.6807e+01 seconds
Calculation Rate (active) = 3842.05 particles/second
============================> RESULTS <============================
Leakage Fraction = 0.00001 +/- 0.00001
Damage energy deposited per source neutron = 741347.448948148 eV
Two times the threshold energy of 40eV is needed to displace an atom
Displacements per source neutron = 9266.843111851851
Assuming about 80% remains after 20% recombine to original lattice locations
Displacements per source neutron after recombination = 7413.474489481481
Number of neutrons per second 1.0638913571168374e+21
Number of neutrons per full power year 3.357385789135031e+28
Displacements for all atoms in the volume 2.4889893899100205e+32
Now the number of atoms in the volume must be found to find displacements per atom (DPA)
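As an aside, the arithmetic behind the results block above can be reproduced step by step. This sketch follows the NRT displacement model as described in the output: displacements = damage energy / (2 × E_d), with E_d = 40 eV taken as the threshold displacement energy, and 80% of displacements assumed to survive recombination. The source rate is taken directly from the printed output.

```python
# Reproduce the DPA arithmetic printed above (NRT model).
damage_energy_ev = 741347.448948148      # eV per source neutron (tally result)
threshold_ev = 40.0                      # assumed displacement threshold E_d

displacements_per_source = damage_energy_ev / (2 * threshold_ev)
displacements_after_recomb = 0.8 * displacements_per_source  # 20 % recombine

source_rate = 1.0638913571168374e21      # neutrons/s (from the output above)
seconds_per_fpy = 365.25 * 24 * 3600     # seconds in one full-power year

neutrons_per_fpy = source_rate * seconds_per_fpy
total_displacements = displacements_after_recomb * neutrons_per_fpy

# Dividing total_displacements by the number of atoms in the tallied
# volume then gives DPA per full-power year.
```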
--------------------------------------------------------------------------
By default, for Open MPI 4.0 and later, infiniband ports on a device
are not used by default. The intent is to use UCX for these devices.
You can override this policy by setting the btl_openib_allow_ib MCA parameter
to true.
Local host: login-e-12
Local adapter: hfi1_0
Local port: 1
--------------------------------------------------------------------------
--------------------------------------------------------------------------
WARNING: There was an error initializing an OpenFabrics device.
Local host: login-e-12
Local device: hfi1_0
--------------------------------------------------------------------------
| The OpenMC Monte Carlo Code
Copyright | 2011-2022 MIT, UChicago Argonne LLC, and contributors
License | https://docs.openmc.org/en/latest/license.html
Version | 0.14.0-dev
Git SHA1 | e535d17ae092182ca373e9136e4036bb308901c5
Date/Time | 2022-10-07 19:17:43
MPI Processes | 1
OpenMP Threads | 1
Reading settings XML file...
Reading cross sections XML file...
Reading materials XML file...
Reading geometry XML file...
Reading Fe54 from
/home/ir-bala2/rds/rds-ukaea-ap001/arunpb/endfb71_hdf5/Fe54.h5
Reading Fe56 from
/home/ir-bala2/rds/rds-ukaea-ap001/arunpb/endfb71_hdf5/Fe56.h5
Reading Fe57 from
/home/ir-bala2/rds/rds-ukaea-ap001/arunpb/endfb71_hdf5/Fe57.h5
Reading Fe58 from
/home/ir-bala2/rds/rds-ukaea-ap001/arunpb/endfb71_hdf5/Fe58.h5
Minimum neutron data temperature: 294 K
Maximum neutron data temperature: 294 K
Reading tallies XML file...
Preparing distributed cell instances...
Reading plot XML file...
Writing summary.h5 file...
HDF5-DIAG: Error detected in HDF5 (1.12.0) MPI-process 0:
#000: H5F.c line 705 in H5Fcreate(): unable to create file
major: File accessibility
minor: Unable to open file
#001: H5VLcallback.c line 3393 in H5VL_file_create(): file create failed
major: Virtual Object Layer
minor: Unable to create file
#002: H5VLcallback.c line 3358 in H5VL__file_create(): file create failed
major: Virtual Object Layer
minor: Unable to create file
#003: H5VLnative_file.c line 65 in H5VL__native_file_create(): unable to create file
major: File accessibility
minor: Unable to open file
#004: H5Fint.c line 1622 in H5F_open(): unable to lock the file
major: File accessibility
minor: Unable to open file
#005: H5FD.c line 1675 in H5FD_lock(): driver lock request failed
major: Virtual File Layer
minor: Can't update object
#006: H5FDsec2.c line 959 in H5FD_sec2_lock(): unable to lock file, errno = 11, error message = 'Resource temporarily unavailable'
major: File accessibility
minor: Bad file ID accessed
ERROR: Failed to open HDF5 file with mode 'w': summary.h5
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode -1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
Traceback (most recent call last):
File "dpa.py", line 139, in <module>
openmc.run()
File "/home/ir-bala2/.local/lib/python3.7/site-packages/openmc/executor.py", line 280, in run
_run(args, output, cwd)
File "/home/ir-bala2/.local/lib/python3.7/site-packages/openmc/executor.py", line 118, in _run
raise RuntimeError(error_msg)
RuntimeError: Failed to open HDF5 file with mode 'w': summary.h5 -------------------------------------------------------------------------- MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode -1. NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes. You may or may not see output from other processes, depending on exactly when Open MPI kills them. --------------------------------------------------------------------------
Kind regards,
Arun
Thanks for posting.
It looks like a summary file already exists (perhaps from a previous run); a quick fix would be to delete it before running.
In a notebook, a terminal command using !:
!rm summary.h5
or the older Python method:
import os
os.system('rm summary.h5')
or newer Python:
import subprocess
subprocess.call(['rm', 'summary.h5'])
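A more portable variant, avoiding shelling out to `rm` and compatible with Python 3.7 (which the traceback above shows), might be:

```python
from pathlib import Path

# Remove a stale summary.h5 before re-running OpenMC, if one exists.
# (Path.unlink(missing_ok=True) would be shorter, but needs Python 3.8+.)
summary = Path('summary.h5')
if summary.exists():
    summary.unlink()
```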
@abarun, I hope Jon's suggestion got you farther along. I just wanted to add a little to the discussion.
My guess is that this error is occurring due to an open openmc.StatePoint
object in a Python interpreter on your system. These objects claim the file handle of the HDF5 statepoint file (statepoint.10.h5
for example) and, if a summary.h5
file is present, they may claim that handle as well.
When a new OpenMC run executes, one will see this error when OpenMC fails to write the new summary.h5
file due to the already-open file handle. The best practice to avoid this situation is to ensure that all openmc.StatePoint
objects are closed (using the .close()
method) before executing another OpenMC run. Please see the additional notes on this in this section of the documentation.
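A minimal sketch of that practice: opening the statepoint as a context manager releases the HDF5 file handles automatically on exit. The filename matches the run above; the tally name 'damage-energy' is an assumption for illustration.

```python
import openmc

# Open the statepoint as a context manager so the underlying HDF5
# file handles (statepoint and summary) are released on exit.
with openmc.StatePoint('statepoint.10.h5') as sp:
    tally = sp.get_tally(name='damage-energy')  # hypothetical tally name
    damage_energy = tally.mean.flatten()[0]

# The file handle is closed here, so a subsequent openmc.run()
# can rewrite summary.h5 without a file-locking error.
```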
Dear Jon & Patrick,
Thanks for the suggestions. Both of them worked well. Moving forward, I will need to estimate the rate of swelling in my component model as influenced by the DPA and the irradiation temperature. Look here for a brief introduction to the work I am currently associated with. The heat loads derived from the 'heating-local' score will be mapped to the FE quadrature points as the irradiation temperature. The DPA, as can be seen from the Python code, is a scalar variable. What I would like to hear from you is whether this DPA can be visualised as a discrete variable that depends on the spatial locations of the OpenMC model. If that makes sense, I can then map these values onto the QP locations of the FE model. Let me know your suggestions on the valid usage of DPA that best describes the swelling of my component.
Kind regards,
Arun
@abarun when you tally heating over a spatial region, OpenMC is giving you the integrated quantity, which doesn't include any information on the spatial dependence across that region. If you have the heating tallied in many regions, it would be up to you to come up with a continuous representation that could be evaluated at individual points if you want anything other than a piecewise constant function.
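To make the piecewise-constant idea concrete, here is a small stand-alone sketch with hypothetical 1-D bin edges and tally values: given mesh bin edges and one tally value per bin, the function returns the value of the bin containing a query point.

```python
from bisect import bisect_right

def piecewise_constant(edges, values, x):
    """Evaluate a piecewise-constant tally field: edges has
    len(values) + 1 entries; return the value of the bin containing x."""
    if not edges[0] <= x <= edges[-1]:
        raise ValueError("point outside the tallied region")
    # bisect_right finds the bin index; clamp the upper boundary point
    i = min(bisect_right(edges, x) - 1, len(values) - 1)
    return values[i]

# Hypothetical mesh: 4 bins over 0..4 cm with made-up heating values
edges = [0.0, 1.0, 2.0, 3.0, 4.0]
heating = [10.0, 8.0, 5.0, 2.0]
print(piecewise_constant(edges, heating, 2.5))  # → 5.0
```

Mapping to FE quadrature points would then amount to calling such a lookup (extended to 3-D) at each QP location; anything smoother requires an interpolation scheme of your own choosing.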
Dear Patrick,
When I first used OpenMC on Linux, I met a problem. It reports "RuntimeError: Failed to open HDF5 file with mode 'r'". I hope you can help me. Thank you.
Wangkailong
This error is discussed in a separate thread:
Dear Shimwell,
Thanks for this DPA example. Currently I am trying to adapt this code to work with a DAGMC-based geometry. The inclusion of the following line will in fact affect the code relating to the creation of the tally filter and the volume calculation.
openmc.DAGMCUniverse(filename='dagmc.h5m')
I have not yet figured out how this might work. If there is a related example, that would definitely help.
Thanks and regards,
Arun
There are a few DAGMC-with-OpenMC examples available for you. They tend to do different tallies to DPA, so you will have to transfer your DPA tally skills to this new geometry type.
I tend to perform tallies using a CellFilter in CSG mode but switch to MaterialFilters for DAGMC.
The DAGMC geometry can be a bit of a black box, so it helps to be able to inspect it. We are building up the native OpenMC inspection capabilities, but currently we can't do linked cell and material queries. So I still use this micro package:
Task 11 and 12 here
Magnetic fusion reactor example
Inertial fusion reactor example
Cad based geometry example notebook
Dear Shimwell,
Thanks for the reference. I was running the CAD-based example and ended up with the following error.
ERROR: DAGMC Universes are present but OpenMC was not configuredwith DAGMC
Does this need any special configuration for usage with DAGMC?
Kind regards,
Arun
Hi Abarun
Yes, there are some DAGMC-specific steps to the install.
Installing OpenMC with DAGMC enabled came up quite recently on another thread and was answered by Patrick; here is a link to that thread:
Dear Shimwell,
Thanks for the references. I was trying to run a DAGMC-based model, but ended up with an error. I thought it better to inspect the geometry first to see if it is consistent with the simulation. I installed the Python package 'dagmc-h5m-file-inspector' through the conda route as shown in the instructions. 'pymoab' for some reason could not be installed, as evident from the error ModuleNotFoundError: No module named 'pymoab'
and its absence in the list of packages resulting from conda list
. Probably something is going wrong here. Any help would be appreciated.
Kind regards,
Arun
Thanks for having a go with the DAGMC file inspector and reaching out.
I just checked the repo and I can see Moab (pymoab) is in the meta. I can check a bit more when I get to a computer again.
In the meantime, is there any chance you installed it into a different virtual environment, or didn't activate the environment?
All the best
It looks better after doing the installation inside a conda virtual environment. The following two modules are what we are looking for:
dagmc_h5m_file_inspector 0.5.0 py37_0 fusion-energy
moab 4.9.1 0 conda-forge
The Python interpreter still does not find the module 'pymoab'. I am not sure if I am missing something here.
I have installed the package over here and it installed moab 5.4.1
I have improved the install instructions on the repo a bit, perhaps this is clearer GitHub - fusion-energy/dagmc_h5m_file_inspector: Extracts information from DAGMC h5m files including volumes number, material tags
Thanks Shimwell. This works.
Dear Shimwell,
I am currently using this code to calculate the DPA for a representative DAGMC geometry. I suppose that the damage energy extracted from the 'damage-energy' tally is obtained at discrete spatial locations and can be stored in mesh bins. As a consequence, the DPA quantity derived from this energy can also be calculated in discrete form. I am particularly concerned about the variable representing the total number of atoms (number_of_iron_atoms) and how this can be accommodated under this discrete set-up. You might want to calculate the component volume locally, specific to each defect configuration, before you can calculate the DPA. Any ideas/examples here would be very much appreciated.
Kind regards,
Arun
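One possible approach to the atom-count question, sketched with hypothetical numbers: compute the iron number density from the material density, then scale by the volume of each mesh element, so that the displacement tally in a given bin can be divided by the atom count of that same bin. The density, molar mass (pure natural iron is assumed), and element size below are placeholder values.

```python
# Atoms per mesh element from material density (hypothetical pure iron).
AVOGADRO = 6.02214076e23   # atoms/mol
density_g_cm3 = 7.874      # assumed iron density [g/cm^3]
molar_mass = 55.845        # molar mass of natural iron [g/mol]

atoms_per_cm3 = density_g_cm3 * AVOGADRO / molar_mass

# Regular-mesh element volume (hypothetical 0.5 cm cubic elements)
element_volume_cm3 = 0.5 ** 3
atoms_per_element = atoms_per_cm3 * element_volume_cm3

# DPA in one bin = displacements tallied in that bin / atoms_per_element
```

For elements only partially filled by the component, the element volume would need to be replaced by the material volume within that element (e.g. from a stochastic volume calculation), which is the per-bin analogue of the scalar volume used in the original script.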