(Replying to PARENT post)

Increasing temperatures increase steam pressure until a rupture occurs, and the water boils away. The steel reactor vessel and structural elements will then inevitably melt, and the reaction will continue until the molten fuel can spread out enough to lose criticality. This is called a meltdown.
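
For a sense of how steeply that pressure climbs, here's a rough Python sketch using textbook Antoine coefficients for water (my own illustrative helper, an approximation rather than a proper steam table):

    # Saturated steam pressure vs. temperature via the Antoine equation.
    # Coefficients are common textbook values for water in the 99-374 C range.
    A, B, C = 8.14019, 1810.94, 244.485   # P in mmHg, T in deg C

    def saturation_pressure_mpa(temp_c):
        """Approximate saturated steam pressure (MPa) at temp_c degrees C."""
        p_mmhg = 10 ** (A - B / (C + temp_c))
        return p_mmhg * 133.322e-6        # mmHg -> MPa

    for t in (100, 200, 300, 350):
        print(f"{t} C -> {saturation_pressure_mpa(t):.2f} MPa")
    # ~0.10 MPa at 100 C but ~16.5 MPa at 350 C: the pressure rises steeply,
    # which is why a water-cooled core that can't shed heat is first and
    # foremost a pressure-vessel problem.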
πŸ‘€blake1πŸ•‘5yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

This is why many new reactor designs do not use water as a coolant but instead molten salt[0] or inert gas[1], avoiding the risk of a steam explosion. That alone does not make those reactors inherently safe, but it can help keep radioactive material from being violently spread outside the reactor in the event of a catastrophic failure.

[0] https://en.wikipedia.org/wiki/Molten_salt_reactor

[1] https://en.wikipedia.org/wiki/Gas-cooled_reactor

πŸ‘€exrookπŸ•‘5yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

> Increasing temperatures increase steam pressure until a rupture occurs, and the water boils away.

Generally, no. Chernobyl is the biggest example of this happening, but it only happened because the reactor itself was basically exploding anyway.

Steam pressure generally doesn't build up to the point of rupture: relief valves (even simple burst discs) are just too simple and robust for that. The much more common issue is that the overheated fuel's zirconium cladding reacts with the steam and releases hydrogen, which accumulates at the top of the reactor and eventually explodes, and then the water boils off.
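
For scale, here's a back-of-the-envelope sketch of that hydrogen source, the zirconium-steam reaction (Zr + 2 H2O -> ZrO2 + 2 H2, which runs rapidly once the cladding gets to very roughly 1200 C). The cladding mass and oxidized fraction below are made-up illustrative numbers, not figures for any real plant:

    # Rough estimate of hydrogen generated by oxidizing zirconium cladding in steam.
    # Zr + 2 H2O -> ZrO2 + 2 H2 (strongly exothermic once the cladding is hot enough).
    ZR_MOLAR_MASS_G = 91.22     # g/mol
    H2_MOLAR_VOLUME_L = 22.4    # L/mol at 0 C and 1 atm (ideal gas)

    def hydrogen_volume_m3(cladding_kg, fraction_oxidized):
        """Approximate H2 volume (m^3 at STP) from partially oxidized cladding."""
        moles_zr = cladding_kg * 1000 / ZR_MOLAR_MASS_G
        moles_h2 = 2 * moles_zr * fraction_oxidized   # 2 mol H2 per mol Zr
        return moles_h2 * H2_MOLAR_VOLUME_L / 1000    # litres -> m^3

    # Hypothetical: 20 tonnes of cladding with a quarter of it oxidized.
    print(f"{hydrogen_volume_m3(20_000, 0.25):,.0f} m^3 of H2")   # roughly 2,500 m^3

Even a modest oxidized fraction yields thousands of cubic metres of flammable gas, which is why the accumulation is such a problem once it collects under the top of the structure.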

πŸ‘€hwillisπŸ•‘5yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

But isn’t the water that is being cycled from the reactor to the turbine reducing the temperature? Isn’t cooling the reactor throwing energy away?
πŸ‘€chrisseatonπŸ•‘5yπŸ”Ό0πŸ—¨οΈ0