Thanks a lot. I get part of it, but it hasn't sunk in to the level of
real understanding yet. I'll read it a couple more times.
Sending an email of this to myself as well as you.
On Fri, 15 Jun 2007 22:16:33 +0000 (UTC), firstname.lastname@example.org (Dave) wrote:
You should probably take those numbers with a grain of salt.
I really need to remeasure, now that I have a Kill-a-Watt
to easily give me real power (at least for plug-in devices).
The wired-in devices are definitely VA, as I used one of those
clamp-around current probes rather than breaking the circuit
(although the 40-watt AC outdoor unit was a resistive heater).
The plug in devices may be VA or Watts.
I will repost here once I have remeasured with real power numbers.
(don't hold your breath -- it may be a while).
I would encourage people to make their own measurements, as devices
vary greatly. I have noticed that newer devices are often much better,
as measured by the "how warm does it get?" method.
Dave: There isn't really enough info in the original posted
question to calculate exactly, but here is an attempt.
Answer: Very little.
Here's an assumption: if it uses 100 watts while charging the laptop,
it probably uses less than 10% of that while not charging. That is less
than 10 watts, and 10 watts is probably a bit of an overestimate anyway.
Any more than that and the charger alone would get slightly warm all the
time! Something like the amount of heat from one of those on-all-the-time
night lights, which IIRC are often 7.5 watts.
10 watts for one hour is one one-hundredth of a kilowatt-hour. Using
my cost of electricity (yours may be more or less) of about 9 cents
(all charges included) per kilowatt hour, it will cost about 0.09
cents for every hour that it is plugged in but not charging anything.
Make that say one tenth of a cent; in other words it will cost around
2 to 2.5 cents per 24 hour day that it is plugged in and not charging.
Hardly worth bothering about?
A 100 watt light bulb left on for the same 24 hours would cost about
20 to 25 cents. Again depending on your electricity cost.
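The arithmetic above can be sketched quickly. This is just the poster's own numbers (10 W assumed idle draw, 9 cents/kWh); plug in your own rate:

```python
# Cost of an idle laptop charger, using the assumptions above:
# ~10 W standby draw and about 9 cents per kilowatt-hour.
IDLE_WATTS = 10.0
CENTS_PER_KWH = 9.0

def idle_cost_cents(watts, hours, cents_per_kwh=CENTS_PER_KWH):
    """Cost in cents of drawing `watts` continuously for `hours`."""
    kwh = watts * hours / 1000.0
    return kwh * cents_per_kwh

per_hour = idle_cost_cents(IDLE_WATTS, 1)    # 0.09 cents per hour
per_day = idle_cost_cents(IDLE_WATTS, 24)    # ~2.2 cents per day
bulb_day = idle_cost_cents(100, 24)          # ~21.6 cents/day for a 100 W bulb
print(f"{per_hour:.2f} c/h, {per_day:.1f} c/day, bulb: {bulb_day:.1f} c/day")
```

The 2 to 2.5 cents/day and 20 to 25 cents/day figures in the text fall straight out of this, give or take the local rate.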
In regard to the cell phone charger.
You do not specify if the 0.2 amps is the input or the output.
Assuming it is the input:
Approx 120 x 0.2 = 24 watts. But again, that would be while charging.
That's probably less than one quarter of what the laptop charger
needs; the cell phone battery is much smaller, isn't it? While not
charging, again it probably costs less than one cent per 24-hour day.
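That input-side figure works the same way. A sketch, assuming the 0.2 A rating really is the input current on a 120 V line (in which case it is apparent power, VA, an upper bound on real watts):

```python
# Apparent input power of the cell phone charger, assuming the
# 0.2 A nameplate rating is the input current at 120 V.
# Volts x amps gives VA, which is >= real watts for a reactive load.
volts = 120.0
amps = 0.2
input_va = volts * amps
print(input_va)  # 24.0 VA while charging; the idle draw is far less
```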
This sort of question from those unfamiliar with electricity
(although the info is usually on the label somewhere, and most of us
did it in science class in school) reminds one of the little old
lady who used to go round plugging up her electric outlets "to stop
the electricity from leaking out"!
One item that doesn't seem to be realized in all this energy-saving/
conservation business is that any lost heat from less efficient
devices helps heat the house. We live in a cool area of North America
where every month of the year requires some heating, in our case
electric heating. We don't need or use air conditioning at all. So the
lost heat from much cheaper (about 25 cents each) non-CFL light bulbs
etc. merely reduces electric heating! Our bathroom, for example, is
heated almost entirely by the six 40 watt bulbs above the vanity. Only
in coldest winter does the 500 watt baseboard electric heater operate!
That holds for anyone connected to the North American power grid. On the other
hand, it might be worth unplugging for off-grid people who generate
their electricity by solar panels or windmill and store it in batteries,
since their per-kwh cost is likely to be many times higher.
There is loss in the field (the transformer's magnetizing current). But
more modern electronic ones do not require anywhere near as much. Recent
cell phone chargers, I think, are electronic transformers. Those I tend
to just leave plugged in, LED and all!!
It's probably hard to tell the modern ones from the older ones. They
all have cases made of plastic, which is the only clue I usually get.
I've taken apart some of the almost cube-shaped plastic ones, and all
they have inside is a metal-core transformer and, rarely and only for
big ones, a fuse wire. The big ones get less hot because they spread
the heat over more area.
Is 2 to 2.5 cents per 24-hour day not worth worrying about? 2.5 cents
a day is about 9 dollars a year, times however many of these one has.
Maybe 10? That's 90 dollars a year, plus perhaps another 90 dollars of
electricity and fuel wasted at the generating plant, plus, for half the
year, heat that people use AC to remove: another 90 or 180 dollars.
Not that I unplug everything. It's easy enough to do so where the
receptacles are handy, but where they are behind the bed, or behind
the bookshelves, not so easy.
I think that air conditioning bit is exaggerated.
If you convert 90 dollars' worth of electricity to heat, and half the
year you have to pump out that heat, that is 45 dollars' worth of heat
to pump out per year.
Divide by the COP - which is (ideally) the EER divided by 3.41 (number
of BTUs in a watt-hour). COP may be somewhere around 3 or 4 in practice;
I would have to check that out better.
If COP is 3, then wall warts consuming $90 worth of electricity annually
in a home where it is air conditioning season half the year will add $15
to the electric bill.
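Don's correction can be written out as a few lines (COP of 3 and a half-year cooling season, as assumed above):

```python
# Extra air-conditioning cost of pumping wall-wart heat back out,
# following the reasoning above. COP = EER / 3.41 (BTUs per watt-hour);
# a COP around 3 is assumed, and only half the year is cooling season.
annual_heat_dollars = 90.0     # electricity turned into indoor heat
cooling_fraction = 0.5         # half the year needs AC
cop = 3.0                      # coefficient of performance (assumed)

heat_to_remove = annual_heat_dollars * cooling_fraction  # $45 of heat
ac_cost = heat_to_remove / cop                           # $15 on the bill
print(f"${ac_cost:.0f} per year")
```

So the AC penalty is roughly a sixth of the original waste, not another 90 to 180 dollars.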
Meanwhile, that 2.5 cents per day sounds a bit high. It appears to me
that a worse older-type wall wart has idling losses around a watt or two,
based on heat output.
This is about 0.7 to 1.5 kWh per month. Even at the Philadelphia residential
rate surcharged for use beyond some threshold during air conditioning
season, maybe 18 cents per kWh (IIRC), that is at most 27 cents per month
during air conditioning season. Without the surcharge, the per-kWh rate
including transmission fees and taxes is about 14 cents, for a maximum
around 21 cents per month.
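Those monthly figures come from the same kilowatt-hour arithmetic. A sketch, using the 1-2 W idle estimate and the 14 and 18 cent rates mentioned:

```python
# Monthly cost of a wall wart idling at 1-2 W, at the Philadelphia-area
# rates quoted above (both the wattage and the rates are the poster's
# estimates, not measurements).
hours_per_month = 730  # roughly 24 * 365 / 12

def monthly_kwh(watts):
    """kWh consumed in a month of continuous draw at `watts`."""
    return watts * hours_per_month / 1000.0

for watts in (1, 2):
    kwh = monthly_kwh(watts)      # 0.73 or 1.46 kWh per month
    for rate in (14, 18):         # cents per kWh
        print(f"{watts} W at {rate} c/kWh: {kwh * rate:.0f} cents/month")
```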
However, I do think this adds up, especially when you have a lot of them.
Consider energy efficiency next time you are shopping for a fridge.
That can make a difference of a couple dollars a month.
If you have some really old fridge made in the 1970s or before that has
not died yet, find out how much electricity it is consuming, then
determine a rate of return from replacing it. There is some chance that
could exceed the long term rate of return of a good mutual fund,
especially considering that electricity costs are likely to increase
roughly with inflation in the next decade or two.
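The rate-of-return idea can be sketched like this. Every number below is a hypothetical placeholder, not a measurement; meter your own fridge first:

```python
# Simple-payback view of replacing an old fridge. All inputs are
# hypothetical examples; measure the real consumption with a plug-in
# power meter before deciding.
old_kwh_per_year = 1800.0   # a 1970s-era fridge might be in this range
new_kwh_per_year = 450.0    # a modern efficient unit (assumed)
cents_per_kwh = 14.0        # local all-in electricity rate (assumed)
new_fridge_cost = 800.0     # purchase price of the replacement (assumed)

annual_savings = (old_kwh_per_year - new_kwh_per_year) * cents_per_kwh / 100.0
annual_return_pct = 100.0 * annual_savings / new_fridge_cost
print(f"saves ${annual_savings:.0f}/yr, a {annual_return_pct:.0f}% simple return")
```

With these made-up inputs the simple annual return is in the twenty-percent range, which is the sort of comparison against a mutual fund the paragraph above has in mind.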
- Don Klipstein ( email@example.com)
HomeOwnersHub.com is a website for homeowners and building and maintenance pros. It is not affiliated with any of the manufacturers or service providers discussed here.
All logos and trade names are the property of their respective owners.