I have code that calculates the coolant temperature rise through the core from the linear power of the rods, which is recovered from kappa-fission tallies. However, the energy-balance check (mass flow rate * specific heat * delta_T) returns 2.546 MW instead of the 4 MW depletion power. This value holds exactly (to more significant figures) even when I change significant parameters while keeping the power at 4 MW, and there is no statistical fluctuation whatsoever even though I'm tallying with relatively few particles. Is OpenMC accounting for some kind of recoverable-energy efficiency loss that would explain the underestimate?
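For reference, the energy-balance check I'm doing looks roughly like this (a sketch with placeholder values for `m_dot`, `cp`, and the temperatures, not my actual model inputs):

```python
# Coolant energy balance: P = m_dot * cp * dT
# All numbers below are illustrative placeholders, not the real model inputs.
m_dot = 12.2         # kg/s, core mass flow rate
cp = 4181.0          # J/(kg K), specific heat of water
T_in = 290.0         # degC, core inlet temperature
T_out = T_in + 50.0  # degC, outlet temperature from the axial tally integration

P_coolant = m_dot * cp * (T_out - T_in)  # W
print(f"Coolant power: {P_coolant / 1e6:.3f} MW")  # prints "Coolant power: 2.550 MW"
```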
Have you checked that your TH solver conserves energy when using dummy values for the power?
You should be able to calculate the total power from tallies before plugging it into your TH solver. That should help clarify things!
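Something like this (a sketch; the tally sum is a hypothetical number, and it assumes a `kappa-fission` tally reported in eV per source particle):

```python
# Normalize a kappa-fission tally (eV / source particle) to watts.
# kappa_sum is a hypothetical placeholder -- substitute your own tally total.
EV_TO_J = 1.602176634e-19  # J per eV

kappa_sum = 1.25e9  # eV recovered per source particle, summed over all bins
P_target = 4.0e6    # W, the intended core power

# Source normalization: source particles per second needed for the target power
src_rate = P_target / (kappa_sum * EV_TO_J)

# Any per-bin tally value times (src_rate * EV_TO_J) is then in watts;
# summing all bins must recover P_target exactly, by construction.
P_check = kappa_sum * EV_TO_J * src_rate
print(f"Total tallied power: {P_check / 1e6:.3f} MW")
```

If that total comes out at 4 MW but the coolant only picks up 2.546 MW, the discrepancy is downstream of the tallies.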
The core power level is conserved when I compute average linear power * core height * number of rods. It is only when I check the energy transferred to the coolant that the power output falls by a fixed fraction. This fraction is exactly the same regardless of the core power level and regardless of the number of axial tally segments used to calculate the temperature change. I ran a higher-power test with an average linear power of ~180 W/cm to match commercial power reactors, and the expected +30 °C temperature rise was observed, but again the energy carried by the outlet flow was exactly 0.636 of the total core power.
If the problem only appears when you transfer energy to the coolant, then the flaw is surely not in OpenMC, right? Or am I misunderstanding something?
I don’t think it is originating from OpenMC directly, no.
Well, this may not be the best place to debug it.
However, notice that 0.636 is almost exactly 2/pi. I suspect you’re missing a factor of pi/2 somewhere. Good luck!
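For what it's worth, 2/pi is exactly the ratio of the average to the peak of a chopped-cosine axial power shape over the full core height, which is a classic place to pick up or drop that factor. A quick numerical check (a sketch, not the original code):

```python
import math

# Average of a cosine axial shape q'(z) = q'_peak * cos(pi * z / H)
# over z in [-H/2, H/2] is (2/pi) * q'_peak ~ 0.6366 * q'_peak.
# Midpoint-rule integration over N equal axial segments:
N = 100_000
H = 1.0  # core height (arbitrary units; the ratio is independent of H)
zs = [(-H / 2) + H * (i + 0.5) / N for i in range(N)]
avg_over_peak = sum(math.cos(math.pi * z / H) for z in zs) / N

print(f"average/peak = {avg_over_peak:.4f}, 2/pi = {2 / math.pi:.4f}")
```

So if the heat-deposition step treats a peak linear power as if it were the axial average (or vice versa), the coolant picks up exactly 2/pi of the true core power.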