A computational fluid dynamics simulation of subcooled flow boiling of water at 10.5
${\rm bar}$, with an applied heat flux of
$1\,{\rm MW}\,{\rm m}^{-2}$ and subcooling of 10
${\rm K}$, was performed using an interface tracking method. The simulation replicated the conditions of an experiment conducted at MIT. The objectives are to elucidate heat-transfer mechanisms in moderate-pressure subcooled boiling and to validate the simulation method, with a focus on quantities that are difficult to measure experimentally, such as the distributions of velocity, temperature, bubble number density and heat-flux partitioning. Because the bubbles are small at this moderate pressure, fine computational grids are required. Simulated bubble shapes, wall temperatures and vapour area fractions show good agreement with the experimental results. The simulations reveal that a very thin liquid layer (
${\lt}4\,\unicode{x03BC}{\rm m}$) surrounding the bubbles is highly effective at removing heat from the surface. The local wall heat fluxes beneath medium and large bubbles, excluding the heat flux associated with seed-bubble generation, are approximately 0.9 and 0.4
${\rm MW}\,{\rm m}^{-2}$, respectively; the latter is smaller because of the presence of thicker liquid films (14–70
$\unicode{x03BC}{\rm m}$) that thermally insulate the wall. In the single-phase liquid region, the heat-transfer coefficient reaches
$42\,{\rm kW}\,{\rm m}^{-2}\,{\rm K}^{-1}$ as a result of strong turbulent heat flux in the wall-normal direction, which is approximately eight times larger than in the equivalent single-phase liquid flow.