OT: Household standby power consumption

I see that the figure of 10% is still being bandied around as the average
household power consumption due to devices left on standby (Energy Saving
Trust).
This has got to be total b*llocks, hasn't it? Was it ever true? I'm sure
I've read about it being debunked somewhere. Anyone know where I might find
a more reliable figure?
Tim
Reply to
Tim Downie
The government reckons about 10%, Wikipedia says Americans use about 5%, but the University of Strathclyde
formatting link
has a figure of 13%.
About 10% is therefore probably a reasonable overall average.
Reply to
Old Codger
them powered down?
So by their logic, if 85% of my bill is £30.00, my whole bill should only be £35.29 ...
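For what it's worth, that proportion sum checks out; a quick sketch (the £30.00 is just the figure quoted above):

```python
# If standby is claimed to make up 85% of the bill, and that 85% slice
# comes to £30.00, the whole bill follows by simple proportion.
standby_share = 0.85
standby_cost = 30.00

whole_bill = standby_cost / standby_share
print(f"Whole bill: £{whole_bill:.2f}")  # → Whole bill: £35.29
```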
Reply to
Andy Burns
Yes, when Which? reported on measurements of standby power of TVs they stated that the typical consumption of a TV on standby was 0.2W. When switched on the TV used 100W. So standby consumed 0.2% of full-on, not 10%.
Energy Star, which most devices conform to, mandates 1% maximum on standby.
No, the highest real world figure that I have seen quoted came to 5%.
Still the watermelons don't need objectivity, they have Gahd on their side.
Reply to
Steve Firth
There was some clown on the radio the other day saying something like "It's important to lead a green lifestyle but it's the little things that count: It's alright to be a total petrolhead as long as you remember to switch your phone charger off at night."
Tim w
Reply to
Tim W
In article , "Tim Downie" writes:
I think all my products are well under 1W standby. The only things I have which aren't are remote switches intended to save energy by switching products off rather than standby, and these consume over 1W each, i.e. much more than the standby power they're saving. (I'm actually using them for something else, or I would have chucked them out or returned them as unfit for purpose.)
Some of the worst items you are still going to find are set-top boxes, which have notoriously high standby consumption (often no different from their full-on power consumption). I don't have one.
If you have a TV or monitor older than about 12 years, it can have a standby consumption of 8 or so watts, and one older than 8 years could take a couple of watts. Any newish TV or monitor will draw much less than 1W.
Some wall warts can consume quite a bit, and for older transformer based ones, this is no less when there's no load. Modern switched mode ones tend to be very good, particularly mobile phone ones.
I struggle to think that 10% would be true.
Reply to
Andrew Gabriel
I suppose it depends on what you mean by standby usage... for example, do you include things like fridges/freezers, the central heating, or night/security lights, or other stuff that you never turn off, in the mix? That might add a fair deal to the 24/7 consumption and yet have nothing to do with phone chargers, TVs etc. showing little red lights.
Reply to
John Rumm
Andrew Gabriel :
Indeed, and there's a very simple test: if it doesn't feel warm, it's not wasting appreciable energy.
Reply to
Mike Barnes
In article , snipped-for-privacy@cucumber.demon.co.uk says...
Ours drank power - until I installed a switch so it and everything else (telly, DVD, 2xPC, 2xmonitor, printer, HiFi, 2xlamps with wall-warts, 2xPC speakers, modem, router, wireless bridge) all go off at night.
That was well worth it - the power meter shows 110W on standby. And, of course, nothing when the switch is off for ten hours overnight.
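A rough sketch of what that overnight switch saves, assuming the 110W reading above holds all night; the 10p/kWh unit rate is purely illustrative, so substitute your own tariff:

```python
# 110 W of standby load, switched off for 10 hours a night, all year.
standby_watts = 110
hours_off_per_day = 10
days_per_year = 365

kwh_saved = standby_watts * hours_off_per_day * days_per_year / 1000
print(f"Energy saved: {kwh_saved:.1f} kWh/year")  # → Energy saved: 401.5 kWh/year

# At an assumed unit rate of 10p/kWh (illustrative only):
pence_per_kwh = 10
print(f"Worth about £{kwh_saved * pence_per_kwh / 100:.2f}/year")
```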
Reply to
Skipweasel
I agree. My older SKY digiboxes showed no difference in power consumption between on and standby - so much for the auto-standby setting. However, a far newer SKY digibox was worth putting into standby. Trouble with switching these things off is that they don't power up to the same state or channel - especially SKY boxes.
Of course, all this energy isn't totally wasted at this time of year - it saves the central heating working quite as hard!
Reply to
robert
keep them powered down?
Those sorts of statements aren't meant to stand up to rigorous scrutiny (i.e. be considered by a numerate person). They are meant to appeal to the 1 ... 2 ... many crowd on an emotional level.
If people do try arguing with you, using that kind of "data", just respond by saying "I'm an iconoclast, it doesn't apply to me". They'll probably think that's some sort of religion and will change the subject very quickly in case they offend your beliefs :)
Reply to
pete
On Feb 2, 12:27 am, snipped-for-privacy@cucumber.demon.co.uk (Andrew Gabriel) wrote:
Iron-lump warts consume a lot less on no load than at full load, just like any transformer. Power consumption doesn't drop to anywhere near 1% off-load, but the total consumption of these old warts in a household is minuscule.
NT
Reply to
Tabby
On 2 Feb,
Computers can be notorious. My son's took 26W on standby, compared to the 35 watts my server in the garage takes when running. His TV, monitor and radio only took 2W in total. Before anyone comments, these figures are watts, not VA.
I've now managed to persuade him to switch it all off at the wall when not in use.
Reply to
<me9
On 2 Feb,
That's exactly how I assess my wall warts. It's not as good a test on larger equipment though. My printer took as much as my hottest wall wart, but wasn't noticeably warm.
Reply to
<me9
On 2 Feb,
The ones here are much less efficient on low load than switched-mode ones. Noticeably warmer, and a higher reading on the wattmeter. It's a pity switched-mode ones usually chuck out so much electromagnetic rubbish instead of heat.
Reply to
<me9
Yes, but only if you integrate over the time it's in use versus on standby.
A phone charger is charging the phone for about two hours a week; the rest of the week it's doing nothing but wasting energy. Therefore you need to multiply your standby power by about 50 to get the actual waste.
TVs are probably used for a few hours a day, so for modern sets the waste is actually less than for a phone charger.
Reply to
dennis
But if the device spends 10 times longer on standby than in use, the energy_used_on_standby/energy_used_when_on figure can easily be 10%. E.g., if you watch 2.4 hours of TV a day...
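That duty-cycle point can be put as a two-line sum; a sketch assuming a set that draws 100W when on and 1W on standby (the 1% ceiling mentioned earlier in the thread), watched 2.4 hours a day:

```python
# Energy per day while on vs. on standby, in watt-hours.
on_wh = 100.0 * 2.4      # 100 W for 2.4 hours in use
standby_wh = 1.0 * 21.6  # 1 W for the other 21.6 hours

ratio = standby_wh / on_wh
print(f"standby/on energy: {ratio:.0%}")  # → standby/on energy: 9%
```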
#Paul
Reply to
news10paul
Yes, approximately. But if you're trying to argue that, because your whole bill is in fact substantially more than that, what they claim must be wrong, then it's actually *your* logic that's flawed.
There's nothing wrong with their *logic* as such. What's wrong (I suggest) is their *premise*, i.e. the 85% figure does seem ridiculously high.
Let's just run with the ridiculous 85% figure for the moment (you can always substitute a more reasonable figure later). Let's say that the average standby-capable device spends 3 hours a day in use, and 21 hours on standby. It therefore uses a long-term average of (3*100% + 21*85%)/24, or 86.9%, of full power. If instead you switched it off completely when not in use, it would use a long-term average of 3*100%/24, or 12.5%, of full power. The saving is 74.4% of full power, and if this corresponds to £30 a year, then running everything flat out 24/7 would cost £40.34 a year, and your actual bill (at 86.9% of that) would be about £35.04.
But remember, this is only in respect of those devices which have a standby capability. You do also power other stuff!
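The arithmetic above can be reproduced step by step; a sketch that runs with the same deliberately silly 85% premise:

```python
# Hours per day in use and on standby, and the (ridiculous) premise
# that standby draws 85% of full power.
on_hours, standby_hours = 3, 21
standby_fraction = 0.85

# Long-term average draw as a fraction of full power.
with_standby = (on_hours * 1.00 + standby_hours * standby_fraction) / 24
switched_off = (on_hours * 1.00) / 24
saving = with_standby - switched_off

print(f"{with_standby:.1%} vs {switched_off:.1%}, saving {saving:.1%}")
# → 86.9% vs 12.5%, saving 74.4%

# If that saving corresponds to £30/year, the run-flat-out-24/7
# baseline works out at:
print(f"£{30 / saving:.2f}")  # → £40.34
```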
Reply to
Ronald Raygun
