Please forgive me while I troll for a moment.....
Is it energy-saving to turn the thermostat down when
leaving the house? I mean, the furnace has to run to catch
up when I get home. I have a way of looking at the matter;
I'll explain my point of view after the argument is over.
Imagine for a minute that you have to leave the house for a month.
Would it be energy efficient to turn the thermostat down? Of course, as
less heat will be produced for a whole month, with only a few minutes
needed to catch up.
The same applies to only one day.
Turn it down. A lower temp in the house means a slower loss of heat to
the outside. End of story.
"Catching up" is just getting the house hotter so it can loose heat
Storm> Please forgive me while I troll for a moment.....
You start saving energy as soon as the house temperature stabilizes at the
lower temperature. Except for very short times, when you let the temperature
drop and then immediately ramp it up again, you always save energy by
lowering your house temperature.
Here's what the DOE says about it:
"A common misconception associated with thermostats is that a furnace works
harder than normal to warm the space back to a comfortable temperature after
the thermostat has been set back, resulting in little or no savings. This
misconception has been dispelled by years of research and numerous studies.
The fuel required to reheat a building to a comfortable temperature is
roughly equal to the fuel saved as the building drops to the lower
temperature. You save fuel between the time that the temperature stabilizes
at the lower level and the next time heat is needed. So, the longer your
house remains at the lower temperature, the more energy you save."
It's far more complicated than that. Factors such as insulation / heat
loss, type of heating, multi-stage heating, electric backup heat on heat
pumps, etc. all come into play in determining the away duration and temp
reduction where savings begin, and in some cases (typically high
efficiency homes) it can require a multi-day absence to see any savings.
Correct - whatever the net effect of insulation is, there is a net negative
heat flux from the house to the outside. The flux is proportional to the
temperature difference (the exact equation will depend on the radiation,
convection and conduction components - radiation alone is governed by the
Stefan-Boltzmann equation). The larger the difference, the greater the flux.
Averaged over any period of time, any time spent with the thermostat set
lower will yield a lower internal temperature, hence less heat flux.
Whether that is enough to show up in your bill is another question, but from
an energy-savings point of view, it is incontestable.
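To spell out the equations being gestured at here (a standard lumped
formulation, nothing specific to any one house):

    \dot{Q}_{cond+conv} = U A (T_{in} - T_{out})
    \dot{Q}_{rad} = \epsilon \sigma A (T_{in}^4 - T_{out}^4)
    E_{lost} = \int_0^t \dot{Q}(T_{in}(\tau)) d\tau

Both terms grow monotonically with T_in, so any interval spent at a
lower indoor temperature strictly shrinks the integral. That is the
incontestable part; whether it is noticeable is the billing question.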
True. Yet I still hear this type of "reasoning" all the time. Should be a
simple concept even for the technically challenged, for example, people who
argued here that you can compress air and allow it to expand (while doing no
useful work) with no loss of energy.
The confounding issue, though, is the thermal mass of the house. That's why
the DOE explanation says that the savings occur when the temperature inside
the house has stabilized at the lower temperature.
When you shut off the furnace, the thermal mass of the inside of the house
is what's giving up heat to the outside. That's stored energy that came from
the furnace heat. When you raise the temperature, you have to restore that
heat to the thermal mass. So with the furnace off and the temperature inside
of the house dropping, you're losing stored heat. When you turn the
thermostat back on, you have to restore that lost heat, which will also heat
up the atmosphere inside of the house (which is a very small portion of the
total inside thermal mass).
That's what I read from their description, anyway, and it comports with
things I've read about it from other sources. There is no (theoretical) net
gain or loss when the thermal mass is put through the cycle of cooling down
and heating up. The savings occur when the temperature is reduced and
held at the lower level.
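As a back-of-the-envelope check on that accounting, here is the cycle in
Python with an assumed lumped thermal mass C:

    C = 20e6               # house thermal mass, J/K (assumed)
    setback_depth = 5.0    # degrees the house is allowed to drop, K

    given_up = C * setback_depth   # stored heat shed while cooling down
    restored = C * setback_depth   # heat the furnace must put back later
    print(given_up - restored)     # 0.0: the mass cycle itself is a wash

The cycle through the thermal mass nets to zero, so the real savings are
the reduced leakage through the envelope while the house sits at the
lower temperature.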
This all assumes that a house is decently insulated and that the thermal
mass of the house is substantial. Of course, the thermal differential
between the inside and outside temperatures is always at work, suggesting
that there is less heat loss with each degree of reduction of inside
temperature, as you say. But the DOE's reference to actual testing agrees
with the fact that, as soon as you turn the thermostat down, you begin
losing *stored* heat, and when you turn it back up, 100% of that lost heat
must be restored, regardless of actual thermal losses through the walls and
windows.
But it has to be done *awfully* quickly. That's why there's a minimum
cylinder size for diesel engines -- something like 300 cc. Below 3,000 rpm
or so, the compressed air cools too quickly to ignite the fuel. And heat
transfer gets worse as compression goes up.
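For reference, the ideal (lossless, adiabatic) compression temperature is

    T_2 = T_1 \, r^{\gamma - 1}

With T_1 = 293 K, a compression ratio of r = 20, and \gamma = 1.4 for
air, that gives roughly 293 x 20^0.4, call it 970 K, well above diesel's
autoignition temperature. The catch described above is that a small
cylinder's high surface-to-volume ratio lets the charge dump that heat
into the walls before injection, dragging the real temperature well
below the ideal figure.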
No, it isn't. It was documented on a well monitored high efficiency
model home where the backup heat strips on the high efficiency heat pump
were kicking in in order to provide a reasonable temp recovery time
since the heat pump itself did not have the capacity. The electricity
used during the temp recovery was more than would have been used on temp
maintenance, due to the switch to the lower-efficiency backup (100% vs.
300%+).
But that only tells you that a lower-efficiency temperature-recovery system
is...lower in efficiency. If you have that particular pair of heating
systems, you have one situation. If you have a more-typical single heating
system, you have quite another.
In the case you've described, you aren't dealing just with the
thermodynamics of the situation. You're also adding the complexity of
multiple heat sources that operate under different circumstances.
What you have there *is* a misconception in that it does not account for
multi stage / mixed technology heating systems which are not that
uncommon. A good example is a high efficiency heat pump with backup heat
strips. Depending on the controls, such a heat pump may engage the
backup heaters when it is unable to produce an acceptable rate of temp
rise with just the heat pump, and this switches the effective efficiency
from 300%+ to 100%, making it more costly to bring the temp back up to
normal than it would have been to maintain it at normal. This situation
was documented on a high efficiency model home.
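Illustrative arithmetic for that scenario; the kWh figures here are
invented, and only the roughly 3:1 efficiency ratio comes from the
description above:

    COP_HEAT_PUMP = 3.0    # ~300% effective efficiency while maintaining
    COP_STRIPS = 1.0       # plain resistance heat during fast recovery

    maintain_heat = 30.0   # kWh of heat to hold the setpoint all day (assumed)
    recover_heat = 24.0    # kWh of heat to warm the house back up (assumed)

    print(maintain_heat / COP_HEAT_PUMP)   # 10.0 kWh of electricity to maintain
    print(recover_heat / COP_STRIPS)       # 24.0 kWh of electricity to recover

Even though the recovery needs less raw heat than a day of maintenance
would have, it costs more electricity once the strips carry the load.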
That isn't a "pair of heating systems"; nearly all heat pumps include
backup heat strips for times when the heat pump is not able to produce
enough heat, such as very cold weather or high demand.
That complexity exists everywhere and that was my point - you have to do
the actual analysis of the home in question to get the correct answer -
you can't rely on blanket statements / myths.
An additional complication is occupancy, since for folks who are retired
or work from home, or have a stay-at-home spouse, you lose half or more of
your theoretical savings period, since the occupants are not away during
much of the day.
No, there is quite an element of truth! You have to compare the thermal
mass of the house and the heat loss. If the house had enormous thermal
mass, like lots of stone floors and massive stone fireplaces (some
people build houses intentionally to have very high thermal mass) then
although the furnace could warm the air quickly to make you comfortable,
it would still run for hours to warm up all that mass. If you have a
lot of thermal mass and low heat loss (good insulation) then turning the
thermostat down for a couple hours gives no benefit. If you have low
thermal mass and lots of heat loss, then turning it down for even a
couple hours will give significant benefit.
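That trade-off is just the thermal time constant tau = C/UA of the
house. A quick sketch with assumed numbers:

    import math

    def coast(hours, T_start=20.0, T_out=0.0, C=20e6, UA=250.0):
        """Indoor temperature after coasting with the furnace off."""
        tau = C / UA    # seconds; about 22 hours for these numbers
        return T_out + (T_start - T_out) * math.exp(-hours * 3600.0 / tau)

    print(coast(2))                      # ~18.3 C: massive, tight house
    print(coast(2, C=5e6, UA=500.0))     # ~9.7 C: light, leaky house

Two hours barely moves the first house, so a short setback buys almost
nothing there; the second drops ten degrees and sees a real saving.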
Yes, if the house has only dropped a couple degrees when you get back,
then there's little benefit. If the house cools rapidly to the lower
temperature and stays there for, say, 7 hours before you return, then
you get a benefit. Of course, if your house cools off very quickly,
then you might do best to invest in insulation.
We had a big ice storm a couple years ago, and found we could be
moderately comfortable for about 8 hours before firing up the generator
to bring the furnace online. I think that means our insulation is doing
its job.