I see that the figure of 10% is still being bandied around as the average
household power consumption due to devices left on standby. (Energy saving
This has got to be total b*llocks hasn't it? Was it ever true? I'm sure
I've read about it being debunked somewhere. Anyone know where I might find
a more reliable figure?
Yes, when Which? reported on measurements of the standby power of TVs, they
stated that the typical consumption of a TV on standby was 0.2W. When
switched on, the TV used 100W. So standby consumed 0.2% of full-on, not 10%.
Energy Star, which most devices conform to, mandates 1% maximum on standby.
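For anyone who wants to check the sums, here's a quick back-of-envelope in
Python using the figures quoted above (0.2W standby, 100W full-on); the
15p/kWh unit price is just an illustrative guess, not a figure from this
thread:

    standby_w = 0.2           # Which? standby figure quoted above
    on_w = 100.0              # full-on figure quoted above
    price_per_kwh = 0.15      # GBP - illustrative assumption only

    print(f"Standby as a fraction of full-on: {standby_w / on_w:.1%}")

    annual_kwh = standby_w * 24 * 365 / 1000
    print(f"Standby energy per year: {annual_kwh:.1f} kWh, "
          f"roughly GBP {annual_kwh * price_per_kwh:.2f}")

That comes out at 0.2% of full-on and under 2 kWh a year for the TV itself -
pence rather than pounds.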
No, the highest real world figure that I have seen quoted came to 5%.
Still the watermelons don't need objectivity, they have Gahd on their side.
There was some clown on the radio the other day saying something like "It's
important to lead a green lifestyle but it's the little things that count:
It's alright to be a total petrolhead as long as you remember to switch your
phone charger off at night."
In article ,
"Tim Downie" writes:
I think all my products are well under 1W standby.
The only things I have which aren't are remote switches intended to
save energy by switching products off rather than standby, and these
consume over 1W each, i.e. much more than the standby power they're
saving. (I'm actually using them for something else, or I would have
chucked them out or returned them as unfit for purpose.)
Some of the worst items you are still going to find are set-top boxes,
which have notoriously high standby consumption (often no different from
their full-on power consumption). I don't have one.
If you have a TV or monitor older than about 12 years, it can have a
standby consumption of 8 or so watts, and one older than 8 years could
be a couple of watts. Any newish TV or monitor will be much less than that.
Some wall warts can consume quite a bit, and for older transformer
based ones, this is no less when there's no load. Modern switched
mode ones tend to be very good, particularly mobile phone ones.
I struggle to believe that 10% could be true.
I suppose it depends on what you mean by standby usage... for example, do
you include things like fridges/freezers, the central heating, or
night/security lights or other stuff that you never turn off in the
mix? That might add a fair deal to the 24/7 consumption and yet be
nothing to do with phone chargers, and TVs etc. showing little red lights.
In article ,
Ours drank power - until I installed a switch so it and everything else
(telly, DVD, 2xPC, 2xmonitor, printer, HiFi, 2xlamps with wall-warts,
2xPC speakers, modem, router, wireless bridge) all go off at night.
That was well worth it - the power meter shows 110W on standby. And, of
course, nothing when the switch is off for ten hours overnight.
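I make that about 400 kWh a year, so no wonder it was worth doing. A rough
sketch if anyone wants to check my sums (the unit price is just a guess for
illustration, not what I actually pay):

    standby_w = 110           # measured standby figure above
    hours_off = 10            # switched off overnight
    price_per_kwh = 0.15      # GBP - illustrative assumption only

    kwh_per_year = standby_w * hours_off * 365 / 1000
    print(f"{kwh_per_year:.0f} kWh a year, "
          f"roughly GBP {kwh_per_year * price_per_kwh:.2f}")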
My older SKY digiboxes showed no difference in power consumption between
ON and Standby - so much for the auto standby setting.
However, a far newer SKY digibox was worth putting into standby.
Trouble with switching these things off is that they don't power up to
the same state or channel - especially SKY boxes.
Of course all this energy isn't totally wasted at this time of year - it
saves the central heating working quite as hard!
keep them powered down?
Those sorts of statements aren't meant to stand rigorous scrutiny (i.e.
be considered by a numerate person). They are meant to appeal to the
1 ... 2 .... many crowd on an emotional level.
If people do try arguing with you, using that kind of "data", just respond
by saying "I'm an iconoclast, it doesn't apply to me". They'll probably
think that's some sort of religion and will change the subject very
quickly in case they offend your beliefs :)
On Feb 2, 12:27 am, email@example.com (Andrew Gabriel) wrote:
Iron lump warts consume a lot less on no load than full load, just
like any transformer. Power consumption doesn't drop down to anywhere
near 1% off load, but the total consumption of these old warts in a
household is minuscule.
On 2 Feb,
Computers can be notorious. My son's took 26W on standby, as compared to the
35 watts my server in the garage takes when running. His TV, monitor and radio
only took 2W in total. Before anyone comments, these figures are watts, not VA.
I've now managed to persuade him to switch it all off at the wall when not in use.
On 2 Feb,
The ones here are much less efficient on low load than switched mode ones.
Noticeably warmer and higher readings on wattmeters. It's a pity switched mode
ones usually chuck out so much electromagnetic rubbish instead of heat.
Yes, but only if you integrate over the time it's in use versus on standby.
A phone charger is charging the phone for about two hours a week; the rest
of the week it's doing nothing but wasting energy.
Therefore you need to multiply the standby power by about 50 to get a fair
comparison with the power it draws while actually charging.
TVs are probably used for a few hours a day, so for modern sets the waste is
actually less than for a phone charger.
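Rough sums behind that multiplier, in case anyone wants to play with the
numbers (the two-hours-a-week charging time is the figure above):

    hours_per_week = 24 * 7       # 168
    charging_hours = 2            # figure from the post above
    idle_hours = hours_per_week - charging_hours

    print(f"Idle-to-charging ratio: {idle_hours / charging_hours:.0f}x")
    # Strictly that's ~80x; "about 50" is just a rounder, more
    # conservative figure making the same point.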
Yes, approximately, but if you're trying to argue that what they claim must
be wrong because your whole bill is in fact substantially more than that,
then it's actually *your* logic that's flawed.
There's nothing wrong with their *logic* as such. What's wrong (I suggest)
is their *premise*, i.e. the 85% figure does seem ridiculously high.
Let's just run with the ridiculous 85% figure for the moment (you can
always substitute a more reasonable figure later). Let's say that the
average standby-capable device spends 3 hours a day in use, and 21 hours
on standby. It therefore uses a long term average (3*100% + 21*85%)/24 or
86.9% of full power. If instead you switched it off completely when not
in use, it would use a long term average of 3*100%/24 or 12.5% of full
power. The saving is 74.4%, and if this corresponds to £30 a year, then
your whole annual bill should actually be £40.34.
But remember, this is only in respect of those devices which have a
standby capability. You do also power other stuff!
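For anyone who wants to rerun those sums with a less silly figure, here's
the calculation above as a few lines of Python; swap standby_fraction for
whatever you think is realistic:

    hours_on = 3                  # hours a day in use
    hours_standby = 21            # hours a day on standby
    standby_fraction = 0.85       # the ridiculous 85% figure, for argument's sake
    claimed_saving_gbp = 30.0     # the claimed annual saving

    avg_with_standby = (hours_on * 1.0 + hours_standby * standby_fraction) / 24
    avg_switched_off = hours_on * 1.0 / 24
    saving = avg_with_standby - avg_switched_off

    print(f"Average with standby: {avg_with_standby:.1%}")   # ~86.9%
    print(f"Average switched off: {avg_switched_off:.1%}")   # 12.5%
    print(f"Saving: {saving:.1%}")                            # ~74.4%
    print(f"Implied whole-year bill: GBP {claimed_saving_gbp / saving:.2f}")

Which reproduces the 86.9%, 12.5%, 74.4% and £40.34 figures above.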