Estimating summary.h5 size and RAM requirements from a given model

Recently, I’ve been building a hefty model of the VTB GCMR for depletion purposes. One of the goals is to try to model the TRISO explicitly with high fidelity (e.g. allowing each fuel compact to deplete separately, which is achieved by cloning the universes for each compact).
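For context, the cloning step looks roughly like the sketch below; the composition, compact radius, and compact count are made-up placeholders rather than the real GCMR values:

```python
import openmc

# Placeholder material and geometry for a single fuel compact (not the real GCMR model).
fuel = openmc.Material(name='UCO kernel (placeholder)')
fuel.add_nuclide('U235', 0.2)
fuel.add_nuclide('U238', 0.8)
fuel.set_density('g/cm3', 10.9)
fuel.depletable = True

compact_cyl = openmc.ZCylinder(r=0.6225)            # placeholder compact radius
compact_cell = openmc.Cell(fill=fuel, region=-compact_cyl)
compact_univ = openmc.Universe(cells=[compact_cell])

# One clone per compact position; Universe.clone() deep-copies the cells and
# materials, so every compact ends up with its own depletable material instance
# (and its own copies of the surfaces, which is where the memory cost comes from).
n_compacts = 6                                      # placeholder count
compact_clones = [compact_univ.clone() for _ in range(n_compacts)]
```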

Doing this is quite memory intensive (too many surfaces :sob:), so we are trying to find some kind of middle ground. During this process, I’ve maxed out the RAM on our university cluster a few times with the fully explicit model. As best I can tell, this happens either at the "Preparing distributed cell instances..." step or at the "Writing summary.h5 file..." step.

I’ve already tried idling 3/4 of the threads on a node to give each remaining thread more RAM, so I think the next step has to be reducing the model size. We have a few ideas for cutting down the file sizes: the fully explicit TRISO model’s geometry.xml is 5 GB and its materials.xml is around 1 MB, and both could probably be cut by roughly a factor of three by exploiting symmetry and using periodic plane BCs (sketched below). But this leads to my main question: given the normal XML inputs, is there any way to estimate the RAM required to load the model before running a simulation, and to estimate the size of summary.h5?
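The symmetry reduction we have in mind would look roughly like the following. The 120° wedge, plane coefficients, and outer radius are placeholders, and the half-space signs would need checking against the real core orientation:

```python
import openmc
from math import sin, cos, radians

# Placeholder 1/3-core wedge bounded by two rotationally periodic planes
# that intersect along the z-axis (both pass through the origin).
p1 = openmc.Plane(a=0.0, b=1.0, c=0.0, d=0.0, boundary_type='periodic')  # the y = 0 plane
theta = radians(120.0)
p2 = openmc.Plane(a=sin(theta), b=-cos(theta), c=0.0, d=0.0,
                  boundary_type='periodic')         # plane rotated 120 deg about z
p1.periodic_surface = p2                            # pair the two periodic planes

outer = openmc.ZCylinder(r=150.0, boundary_type='vacuum')  # placeholder core radius
wedge = openmc.Cell(region=+p1 & +p2 & -outer)      # fill with the 1/3-core universe
```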

I plan to do everything I can to make the XML input smaller, but it would be really great to have estimates of the RAM and disk space needed so that I can communicate that to our cluster admins for my next attempt at running this.
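In the meantime, my fallback is to run a scaled-down version of the model (a handful of compacts instead of all of them), record its peak memory and summary.h5 size, and extrapolate roughly in proportion to the number of cell instances. Something like this, assuming a single-node run where the openmc executable is launched as a child process of the Python script:

```python
import os
import resource
import openmc

# Run a small test model in the current directory and record its footprint.
openmc.run()

# ru_maxrss is reported in kilobytes on Linux (bytes on macOS).
peak_rss_gb = resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss / 1024**2
summary_gb = os.path.getsize('summary.h5') / 1024**3

print(f'Peak RSS of the openmc child process: {peak_rss_gb:.2f} GB')
print(f'summary.h5 size: {summary_gb:.3f} GB')
```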