Error when writing libMesh Exodus file

Hello,

I am trying to get libMesh working so I can read and write tallies to a .e file. However, I am running into an error when OpenMC tries to write the tallies to the mesh. Below is the output; attached are the DAGMC file, the Exodus file, and the code I'm trying to run.

Preparing distributed cell instances…
Writing summary.h5 file…
Maximum neutron transport energy: 20000000 eV for F19

===============> FIXED SOURCE TRANSPORT SIMULATION <===============

Simulating batch 1
Simulating batch 2
Simulating batch 3
Simulating batch 4
Simulating batch 5
Simulating batch 6
Simulating batch 7
Simulating batch 8
Simulating batch 9
Simulating batch 10
Creating state point statepoint.10.h5…
*** Warning, This code is deprecated, and likely to be removed in future library versions! /usr/local/include/libmesh/node.h, line 265, compiled Oct 18 2024 at 15:35:44 ***
Writing file: tally_3.10.e for unstructured mesh 1
libMesh terminating:
map_find() error: key "255" not found in file …/src/mesh/exodusII_io_helper.C on line 544
Stack frames: 18
0: libMesh::print_trace(std::ostream&)
1: libMesh::MacroFunctions::report_error(char const*, int, char const*, char const*, std::ostream&)
2: std::map<int, std::map<libMesh::ElemType, libMesh::ExodusII_IO_Helper::Conversion, std::less<libMesh::ElemType>, std::allocator<std::pair<libMesh::ElemType const, libMesh::ExodusII_IO_Helper::Conversion> > >, std::less<int>, std::allocator<std::pair<int const, std::map<libMesh::ElemType, libMesh::ExodusII_IO_Helper::Conversion, std::less<libMesh::ElemType>, std::allocator<std::pair<libMesh::ElemType const, libMesh::ExodusII_IO_Helper::Conversion> > > > > >::mapped_type const& libMesh::Utility::map_find<std::map<int, std::map<libMesh::ElemType, libMesh::ExodusII_IO_Helper::Conversion, std::less<libMesh::ElemType>, std::allocator<std::pair<libMesh::ElemType const, libMesh::ExodusII_IO_Helper::Conversion> > >, std::less<int>, std::allocator<std::pair<int const, std::map<libMesh::ElemType, libMesh::ExodusII_IO_Helper::Conversion, std::less<libMesh::ElemType>, std::allocator<std::pair<libMesh::ElemType const, libMesh::ExodusII_IO_Helper::Conversion> > > > > >, int, (int*)0>(std::map<int, std::map<libMesh::ElemType, libMesh::ExodusII_IO_Helper::Conversion, std::less<libMesh::ElemType>, std::allocator<std::pair<libMesh::ElemType const, libMesh::ExodusII_IO_Helper::Conversion> > >, std::less<int>, std::allocator<std::pair<int const, std::map<libMesh::ElemType, libMesh::ExodusII_IO_Helper::Conversion, std::less<libMesh::ElemType>, std::allocator<std::pair<libMesh::ElemType const, libMesh::ExodusII_IO_Helper::Conversion> > > > > > const&, int const&, char const*, int)
3: libMesh::ExodusII_IO_Helper::get_conversion(libMesh::ElemType) const
4: libMesh::ExodusII_IO_Helper::write_elements(libMesh::MeshBase const&, bool)
5: libMesh::ExodusII_IO::write_nodal_data_common(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::vector<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const&, bool)
6: libMesh::ExodusII_IO::write_nodal_data_discontinuous(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<double, std::allocator<double> > const&, std::vector<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const&)
7: libMesh::ExodusII_IO::write_discontinuous_exodusII(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, libMesh::EquationSystems const&, std::set<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >, std::less<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, std::allocator<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > > > const*)
8: openmc::LibMesh::write(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) const
9: openmc::write_unstructured_mesh_results()
10: openmc_statepoint_write
11: openmc::finalize_batch()
12: openmc_next_batch
13: openmc_run
14: openmc(+0x12cd) [0x563e0ba9a2cd]
15: /lib/x86_64-linux-gnu/libc.so.6(+0x2a1ca) [0x7f69d0da01ca]
16: __libc_start_main
17: openmc(+0x1315) [0x563e0ba9a315]
[0] ./include/libmesh/utility.h, line 169, compiled Oct 18 2024 at 14:21:43


MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
Proc: [[53640,0],0]
Errorcode: 1

NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.

Traceback (most recent call last):
File "/home/travis/Documents/TestSphere/Test_Sphere.py", line 132, in <module>
openmc.run()
File "/home/travis/.local/lib/python3.12/site-packages/openmc/executor.py", line 314, in run
_run(args, output, cwd)
File "/home/travis/.local/lib/python3.12/site-packages/openmc/executor.py", line 125, in _run
raise RuntimeError(error_msg)
RuntimeError: OpenMC aborted unexpectedly.


Experiencing the exact same problem here!

It works if you revert libMesh to an earlier version; I believe v1.7.1 or earlier does the trick.

This fixes the problem while writing the file, but now when I open it in ParaView I get a bunch of errors. Here is the output of my simulation and the errors.

Minimum neutron data temperature: 294 K
Maximum neutron data temperature: 294 K
Preparing distributed cell instances…
Writing summary.h5 file…
Maximum neutron transport energy: 20000000 eV for F19

===============> FIXED SOURCE TRANSPORT SIMULATION <===============

Simulating batch 1
Simulating batch 2
Simulating batch 3
Simulating batch 4
Simulating batch 5
Simulating batch 6
Simulating batch 7
Simulating batch 8
Simulating batch 9
Simulating batch 10
Creating state point statepoint.10.h5…
*** Warning, This code is deprecated, and likely to be removed in future library versions! /usr/local/include/libmesh/node.h, line 265, compiled Nov 4 2024 at 11:11:10 ***
Writing file: tally_3.10.e for unstructured mesh 1

=======================> TIMING STATISTICS <=======================

Total time for initialization = 1.3866e+02 seconds
Reading cross sections = 1.4426e+00 seconds
Total time in simulation = 4.0782e+01 seconds
Time in transport only = 3.5464e+00 seconds
Time in active batches = 4.0782e+01 seconds
Time accumulating tallies = 8.4497e-02 seconds
Time writing statepoints = 3.7151e+01 seconds
Total time for finalization = 8.7482e+00 seconds
Total time elapsed = 1.8823e+02 seconds
Calculation Rate (active) = 245.205 particles/second

============================> RESULTS <============================

Leakage Fraction = 0.23365 +/- 0.00453

Upon further investigation, I think the issue is the number of dimensions that the OpenMC output file specifies. Inspecting tally_2.10.e shows num_dim = 255, and in ParaView it is dimensions 3 through 254 that have a bad coordinate index, meaning that only dimensions 1, 2, and 255 are OK. In the original Exodus file, num_dim = 3. Does anyone know how to fix this issue?

Before OpenMC
ncdump -h Test_Sphere.e
netcdf Test_Sphere {
dimensions:
len_name = 256 ;
time_step = UNLIMITED ; // (0 currently)
num_dim = 3 ;
num_nodes = 6827 ;
num_elem = 35882 ;
num_el_blk = 1 ;
num_side_sets = 1 ;
num_el_in_blk1 = 35882 ;
num_nod_per_el1 = 4 ;
num_side_ss1 = 3030 ;
num_df_ss1 = 9090 ;
num_qa_rec = 1 ;
four = 4 ;
len_string = 33 ;
variables:
double time_whole(time_step) ;
int eb_status(num_el_blk) ;
int eb_prop1(num_el_blk) ;
eb_prop1:name = "ID" ;
int ss_status(num_side_sets) ;
int ss_prop1(num_side_sets) ;
ss_prop1:name = "ID" ;
double coordx(num_nodes) ;
double coordy(num_nodes) ;
double coordz(num_nodes) ;
char eb_names(num_el_blk, len_name) ;
eb_names:_FillValue = "" ;
char ss_names(num_side_sets, len_name) ;
ss_names:_FillValue = "" ;
char coor_names(num_dim, len_name) ;
coor_names:_FillValue = "" ;
int connect1(num_el_in_blk1, num_nod_per_el1) ;
connect1:elem_type = "TETRA" ;
int elem_ss1(num_side_ss1) ;
int side_ss1(num_side_ss1) ;
double dist_fact_ss1(num_df_ss1) ;
char qa_records(num_qa_rec, four, len_string) ;
int elem_map(num_elem) ;
int elem_num_map(num_elem) ;
int node_num_map(num_nodes) ;

// global attributes:
:api_version = 8.03f ;
:version = 8.03f ;
:floating_point_word_size = 8 ;
:file_size = 1 ;
:maximum_name_length = 32 ;
:int64_status = 0 ;
:title = "cubit(C:/Users/tarv/Desktop/Test_Sphere.e): 11/04/2024: 18:50:43" ;
}

After OpenMC
ncdump -h tally_2.10.e
netcdf tally_2.10 {
dimensions:
len_string = 33 ;
len_line = 81 ;
four = 4 ;
time_step = UNLIMITED ; // (1 currently)
len_name = 33 ;
num_dim = 255 ;
num_nodes = 143528 ;
num_elem = 35882 ;
num_el_blk = 1 ;
num_side_sets = 1 ;
num_el_in_blk1 = 35882 ;
num_nod_per_el1 = 4 ;
num_side_ss1 = 3030 ;
num_nod_var = 2 ;
variables:
double time_whole(time_step) ;
int eb_status(num_el_blk) ;
int eb_prop1(num_el_blk) ;
eb_prop1:name = "ID" ;
int ss_status(num_side_sets) ;
int ss_prop1(num_side_sets) ;
ss_prop1:name = "ID" ;
double coordx(num_nodes) ;
double coordy(num_nodes) ;
double coordz(num_nodes) ;
char eb_names(num_el_blk, len_name) ;
char ss_names(num_side_sets, len_name) ;
char coor_names(num_dim, len_name) ;
int connect1(num_el_in_blk1, num_nod_per_el1) ;
connect1:elem_type = "TETRA4" ;
int elem_num_map(num_elem) ;
int elem_ss1(num_side_ss1) ;
int side_ss1(num_side_ss1) ;
double vals_nod_var1(time_step, num_nodes) ;
double vals_nod_var2(time_step, num_nodes) ;
char name_nod_var(num_nod_var, len_name) ;

// global attributes:
:api_version = 5.22f ;
:version = 5.22f ;
:floating_point_word_size = 8 ;
:file_size = 1 ;
:int64_status = 0 ;
:title = "tally_2.10.e" ;
:maximum_name_length = 32 ;
}

Hi, did you have any more luck with this error?

I have not had any more luck with this error. What I did instead was compile MOAB with netCDF support and then recompile OpenMC. Once that is done, you can use MOAB instead of libMesh for your unstructured mesh. An error will pop up saying the Exodus file is not HDF5, but it runs anyway. From there, you can follow the user guide on how to collect the tally data from the statepoint file and write a VTK file.