Hello,
I’m looking for a way to couple OpenMC with an external solver. The basic idea is to have OpenMC calculate the power [W] so that the external solver can estimate the temperature. A subsequent OpenMC iteration could then be launched with the new temperature, giving a new power, and so on.
From what I’ve understood in the documentation, the best tally for a measure of the local power is heating-local: if I tally it over the entire fuel, I get the score in [eV/source]. I would then need a normalization factor giving me the [source/s]. However, this normalization factor is calculated from the power, which is exactly the quantity I’m looking for.
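For reference, this is a minimal sketch of the tally setup I mean (the `fuel_cell` object here is just a placeholder for the actual fuel cell in my geometry):

```python
import openmc

# Placeholder: in practice this is the fuel cell already defined in the geometry
fuel_cell = openmc.Cell(name='fuel')

# Score the locally deposited heating in the fuel, reported per source particle [eV/source]
heating_tally = openmc.Tally(name='fuel heating')
heating_tally.filters = [openmc.CellFilter(fuel_cell)]
heating_tally.scores = ['heating-local']

openmc.Tallies([heating_tally]).export_to_xml()
```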
Is there any other way to calculate the normalization factor, so that I don’t need to know the power a priori?
Or could the output value “Calculation Rate (active cycles) = x particles/second” help?
Thank you
Lorenzo
Hello! OpenMC probably can’t give you the number you want in watts because, by definition, Monte Carlo only ever gives you the shape of the flux, not its magnitude. So it can give you a power distribution that you then normalize to the total power you set.
You can define power with several different tallies. I’ve had good success simply using the “fission” tally, but you can also use “kappa-fission” or “fission-q-recoverable” depending on how sophisticated an assumption you want. In my experience, I’ve used a known normalization factor of 66945.4 W for a fuel rod, based on the power output of the entire reactor divided by the number of rods in the reactor.
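As a rough sketch of how that normalization works out (assuming a tally named 'fuel heating' scored in [eV/source] as above; the statepoint file name will differ in your run):

```python
import openmc

EV_TO_J = 1.602176634e-19   # J per eV
P_ROD = 66945.4             # known rod power [W]: core power / number of rods

# Read the rod heating (or kappa-fission) tally from the statepoint
with openmc.StatePoint('statepoint.100.h5') as sp:      # file name depends on your run
    rod_heating = sp.get_tally(name='fuel heating').mean.flatten()[0]   # [eV/source]

# Normalization factor: source particles per second implied by the known rod power
source_rate = P_ROD / (rod_heating * EV_TO_J)            # [source/s]

# Any tally scored per source particle can now be put in absolute units, e.g.
# power [W] = score [eV/source] * EV_TO_J * source_rate
print(f'source rate = {source_rate:.4e} particles/s')
```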
For a cool example of changing temperatures and water densities in-memory in OpenMC, check out this notebook: Jupyter Notebook Viewer
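If it helps, the in-memory pattern looks roughly like this sketch (cell ID 1 for fuel and material ID 2 for coolant are placeholders, and the updated temperature and density would come from your external solver):

```python
import openmc.lib

# Rough coupling loop between transport and an external T/H solver
with openmc.lib.run_in_memory():
    for i in range(3):
        openmc.lib.run()                                  # transport solve

        # ... pull the power tally here and hand it to the external solver ...

        openmc.lib.cells[1].set_temperature(900.0)                 # updated fuel T [K]
        openmc.lib.materials[2].set_density(0.70, units='g/cm3')   # updated coolant density

        openmc.lib.reset()                                # clear tallies for the next pass
```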
How do I convert the unit ‘eV/source’ to ‘W/g’?
Hello, as mentioned in my reply marked “solution”, you will need to determine a scaling factor. For instance, this can be the power you expect from a single fuel rod in your geometry (in W or W/g, units of your choice, I believe). Then, from there, you tally the power in the same fuel rod and scale accordingly.
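In other words, something along these lines (the eV-to-joule constant is exact; the `source_rate` and mass are whatever your chosen scaling gives you):

```python
EV_TO_J = 1.602176634e-19   # J per eV

def ev_per_source_to_w_per_g(score_ev_per_src, source_rate, mass_g):
    """Convert a tally score in [eV/source] to specific power in [W/g].

    source_rate : normalization in [source particles/s], fixed e.g. by the
                  known power of a reference fuel rod (as above).
    mass_g      : mass of the tallied region in grams (density * volume).
    """
    power_w = score_ev_per_src * EV_TO_J * source_rate
    return power_w / mass_g
```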