Baseload Generators: Relationship between output and cooling water

Hi,
I'm just wondering if anyone is aware of any formulas that could
describe the relationship between the cooling water and the output of a
power plant (baseload such as coal-fired generator)?
I understand that during periods where the cooling water is rationed
(due to drought, etc), power plants will have to reduce their output,
but I'm not sure how much they would have to reduce, etc. There's also
a relationship between the temperature of the cooling water and the
output.
Any ideas would be welcome. Thanks.
Reply to
thampw
Output would be reduced to the point where the available (or allotted) water flow rate can still keep the generator running at the specified frequency. The required water flow is generally a function of the generated power. There may also be reasons to avoid using hydro power at such times, either to keep the generation in reserve or to keep the water in reserve; these are complex and often political decisions. Other natural energy sources such as wind power don't face those kinds of decisions, though clearly they do face variations in how much power is available.
Reply to
phil-news-nospam
If we assume a typical thermal efficiency of, say, 40%, take the MW electric output and divide by 0.40 to get the boiler output. Multiply that by (1 - 0.40) to find out how much heat is rejected to the cooling water (if you've never done this before, you might be surprised; it's a *lot*).
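That arithmetic can be sketched in a few lines; the 40% efficiency is the assumed figure from above, not a property of any particular plant:

```python
# Heat-balance sketch. Assumes a 40% thermal efficiency, as in the example above.
def heat_rejected_mw(electric_mw, efficiency=0.40):
    """Heat rejected to the cooling water, in MW thermal."""
    boiler_mw = electric_mw / efficiency   # total heat from the boiler
    return boiler_mw * (1 - efficiency)    # everything not converted to electricity

# An 800 MWe unit rejects roughly 1200 MW thermal to its cooling water.
print(heat_rejected_mw(800))
```

Note the rejected heat is 1.5x the electric output at 40% efficiency, which is why the water numbers later in this post get so large.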
In many jurisdictions, there are multiple issues when water becomes scarce. First, you have to pump a lot of water through the condenser to maintain vacuum for the turbine. If you cannot maintain vacuum, the turbine will trip automatically. Turbines designed to operate at a certain level of back-pressure don't like operating at a higher back-pressure (lower condenser vacuum / higher exhaust temperature); several things can go wrong, mostly related to moisture levels in the exhaust.
In an 'open-loop' system (using river water or lake water directly) there is a second concern: the 'delta-T'. Many states (if not all) issue a discharge permit so the plant can return all the cooling water it used back to the river/lake. Part of that permit requires that the water being returned not be more than XX degrees warmer than the river/lake. A typical limit is 30 F warmer than supply. They may also impose a maximum, such as "no warmer than 100 F." So if you're pumping water from the lake through the system as fast as you can, and you have a 28 F rise at full power, yet the summertime lake temperature rises to 75 F, you have to reduce power so your delta-T is only 25 F (making the outlet 100 F).
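A rough sketch of that derating logic, assuming the permit limits quoted above (30 F rise, 100 F outlet) and the hypothetical 28 F full-power rise; since heat rejected at a fixed pumping rate scales with delta-T, the allowed power fraction scales the same way:

```python
# Open-loop derating sketch. Permit limits (30 F rise, 100 F max outlet) and the
# 28 F full-power rise are the illustrative values from the example above.
def allowed_delta_t(intake_f, max_delta_t=30.0, max_outlet_f=100.0):
    """Permitted temperature rise given the intake (river/lake) temperature."""
    return min(max_delta_t, max_outlet_f - intake_f)

def allowed_power_fraction(intake_f, full_power_delta_t=28.0):
    """At fixed flow, rejected heat Q = m * c * delta_T, so allowed power
    scales linearly with the permitted delta-T."""
    return min(1.0, allowed_delta_t(intake_f) / full_power_delta_t)

# A 75 F lake allows only a 25 F rise, so roughly 25/28 of full power.
print(allowed_power_fraction(75.0))
```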
In 'closed-loop' cooling, the cooling is achieved by recirculating water between the condenser and the cooling tower. The tower cools the water by evaporation, and that water must be replaced with water from somewhere (river/lake/wells). If levels are low, or you can't get replacement water, you have to reduce power to reduce the evaporation rate from the tower. If the cooling tower 'fill' material is in bad shape, you may lose a lot of water to 'drift' and need even more make-up. And to keep the minerals from building up too high (evaporation leaves all the minerals behind in the water), you need to 'feed-and-bleed' the tower constantly, discharging the more concentrated stuff back to the river/lake/somewhere.
To put things in perspective, an 800MWe plant @40% efficiency rejects about 1.137e6 BTU/second. In a cooling tower, that's evaporating water at about 1034 lbm/second. That's about 7450 gallons every minute *before* considering drift or 'blowdown'. That same plant may actually need between 10,000 and 12,000 gallons every minute to operate at full power.
If it's an 'open loop' plant using a river/lake directly, with a 30F delta-T limit on their discharge permit, that 1.137e6 BTU/second being rejected to the cooling water would require cooling water flow rates of about 275,000 gallons/minute.
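Those figures check out from first principles; a quick sketch, assuming a latent heat of vaporization of about 1100 BTU/lbm, 8.33 lbm per gallon of water, and c_p = 1 BTU/(lbm*F):

```python
# Reality check on the numbers above. Assumptions: latent heat ~1100 BTU/lbm,
# water density 8.33 lbm/gal, specific heat 1 BTU/(lbm*F).
Q = 1.137e6                            # BTU/s rejected (800 MWe at 40% efficiency)

# Closed loop: evaporation in the cooling tower carries the heat away.
evap_lbm_s = Q / 1100.0                # lbm of water evaporated per second
evap_gpm = evap_lbm_s * 60 / 8.33     # roughly 7450 gal/min, before drift/blowdown

# Open loop: sensible heating of river/lake water with a 30 F rise.
flow_lbm_s = Q / (1.0 * 30.0)          # m = Q / (c_p * delta_T)
flow_gpm = flow_lbm_s * 60 / 8.33     # roughly 273,000 gal/min

print(round(evap_gpm), round(flow_gpm))
```

The ~275,000 gpm open-loop figure is nearly forty times the evaporation rate of the closed-loop tower, which is why open-loop plants need to sit on big rivers or lakes.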
As you can see, thermal power plants are 'thirsty' creatures.
daestrom
Reply to
daestrom
Thanks daestrom, that is exactly what I was looking for.
Cheers!
Reply to
thampw
