A power plant produces electrical energy at the rate of 1300 MW with an efficiency of 0.25. The excess heat is dumped into a river that carries 1500 cubic meters per second of water... how much does the river's temperature increase? Help me out, because I'm stuck on this problem.
-
First find the amount of dumped heat. If the efficiency is 25%, then 75% of the thermal input is dumped.
That is three times the delivered power, so the plant must dump 3900 MW,
i.e. 3.9 * 10^9 J/s.
The specific heat of water is approx 4.2 J/deg/g, and since water's density is 1 g/cm^3, that is 4.2 J/deg/cm^3
= 4.2 * 10^6 J/deg/m^3.
So the temperature rise is the heat dumped per second divided by the volume of water per second, divided by the volumetric heat capacity:
= 3.9 * 10^9 / (1500 * 4.2 * 10^6)
= 0.62 deg C.
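If you want to check the arithmetic yourself, here is a minimal Python sketch of the same calculation (the variable names are just illustrative, not from the original problem):

```python
# Given quantities from the problem statement.
electrical_power_w = 1300e6          # delivered electrical power, W
efficiency = 0.25                    # plant efficiency
flow_rate_m3_s = 1500.0              # river volume flow, m^3/s
heat_capacity_j_per_degC_m3 = 4.2e6  # volumetric heat capacity of water, J/(deg C * m^3)

# Total thermal input, and the 75% that is rejected to the river.
thermal_power_w = electrical_power_w / efficiency       # 5.2e9 W
dumped_power_w = thermal_power_w - electrical_power_w   # 3.9e9 W, three times the output

# Temperature rise = heat per second / (volume per second * heat capacity per volume).
delta_t = dumped_power_w / (flow_rate_m3_s * heat_capacity_j_per_degC_m3)

print(f"Dumped power: {dumped_power_w:.2e} W")
print(f"Temperature rise: {delta_t:.2f} deg C")  # ~0.62 deg C
```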