The result is: OSError: Error reading file 'chain_casl.xml': failed to load external entity "chain_casl.xml"
And then I added chain_file.export_to_xml().
The result is: AttributeError: 'str' object has no attribute 'export_to_xml'.
Did I miss some data when downloading, or does my program need to be modified?
@SaitoAsuka If the external entity you are trying to load (in this case, chain_casl.xml) is not in your current working directory, that would cause the error.
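One way to catch this before OpenMC's XML loader fails is to resolve the chain file to an absolute path and check that it exists. A minimal sketch (the file name `chain_casl.xml` is taken from this thread; `resolve_chain` is a hypothetical helper, not part of OpenMC):

```python
import os

def resolve_chain(path):
    """Return an absolute path to the chain file, or raise if it is missing."""
    chain_path = os.path.abspath(path)
    if not os.path.isfile(chain_path):
        raise FileNotFoundError(f"Chain file not found at {chain_path}")
    return chain_path

# Hypothetical usage with the file name from this thread:
# chain = resolve_chain("chain_casl.xml")
```

Passing the absolute path returned here avoids any dependence on which directory the script happens to be run from.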
Thank you very much. I can now run the burnup simulator successfully.
For data analysis, is there a way to get the atom concentrations and keff at each timestep of the burnup?
OpenMC does not have a way to generate MATLAB files. If you want to convert to something that can be read with MATLAB, you can use scipy.io.savemat after pulling the data from the HDF5 results file. The routines in the pincell depletion example, linked above, would be a good starting point.
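For instance, once the keff values have been pulled into NumPy arrays (the numbers below are made up purely for illustration, not real depletion results), scipy.io.savemat writes them into a .mat file that MATLAB can load directly:

```python
import numpy as np
from scipy.io import savemat

# Made-up stand-ins for data pulled from depletion_results.h5:
# one (value, std. dev.) keff pair per depletion step.
time = np.array([0.0, 86400.0, 172800.0])  # seconds
keff = np.array([[1.17, 0.001],
                 [1.15, 0.001],
                 [1.13, 0.001]])

# Write a MATLAB-readable file; in MATLAB: load('depletion.mat')
savemat("depletion.mat", {"time": time, "keff": keff})
```

In MATLAB, `load('depletion.mat')` then exposes `time` and `keff` as ordinary arrays.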
I don't think it's necessary to generate .mat files. For example, OpenMC generates a depletion_results.h5 file after running a burnup simulation, and MATLAB can read data from an h5 file with

data = h5read('data4torch.h5', '/data');

where '/data' is the dataset in the h5 file. What I'm now confused about is which datasets in the h5 file correspond to keff and the atom concentrations.
First inspect the structure of the whole h5 file:

>> h5disp('depletion_results.h5');

and then read a dataset, e.g. `eigenvalues`:

>> h5read('depletion_results.h5', '/eigenvalues');
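The same pattern works in Python with h5py: list the datasets, then read `/eigenvalues`. The snippet below builds a tiny stand-in file with only an `eigenvalues` dataset so the read pattern can be shown end to end; a real depletion_results.h5 contains additional datasets, and the values here are invented:

```python
import h5py
import numpy as np

# Build a minimal stand-in file (invented values; a real results file
# holds more datasets than just /eigenvalues).
with h5py.File("demo_results.h5", "w") as f:
    f.create_dataset("eigenvalues",
                     data=np.array([[1.17, 0.001], [1.15, 0.001]]))

with h5py.File("demo_results.h5", "r") as f:
    # Equivalent of h5disp: walk the file and print every dataset path.
    f.visititems(lambda name, obj: print(name, obj.shape)
                 if isinstance(obj, h5py.Dataset) else None)
    # Equivalent of h5read(..., '/eigenvalues'): load into a NumPy array.
    eig = f["/eigenvalues"][()]
```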