Is there any way to calculate approximately how much electricity my low-voltage outdoor lights are using? I am trying to figure out how they are affecting my electricity bill. Right now I have four cables (with about 4 lights on each) running about 3 hours a day.
Look on the lights and see what the wattage is for each bulb. If that is not listed, find the voltage and current and multiply them together to get the wattage for the bulb. In your case you seem to have 16 bulbs, so multiply the wattage of one by 16. This gives you the total bulb wattage. The transformer will be less than 100% efficient, maybe about 80%, so multiply the total bulb wattage by about 1.2 to get the wattage actually drawn. Let's say the bulbs are slightly over 10 watts each: that's a total of about 200 watts, and if you burn them for 5 hours a day you have used 1,000 watt-hours, or 1 kilowatt-hour. Multiply that by 30 for the number of days in a month: 30 kWh. If the rate is $0.10 per kWh, that works out to about $3.00 per month.
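The arithmetic above can be sketched in a few lines of Python. The numbers are the assumed ones from this answer (16 bulbs at roughly 10 W each, about 80% transformer efficiency, $0.10/kWh, 5 hours a day); substitute the values from your own bulbs and bill.

```python
# Rough monthly-cost estimate for low-voltage landscape lighting.
# All figures below are the example assumptions from the answer above,
# not measured values -- replace them with your own.
bulbs = 16
watts_per_bulb = 10.0      # check the bulb label for the real figure
efficiency = 0.80          # typical transformer efficiency, assumed
hours_per_day = 5
rate_per_kwh = 0.10        # dollars per kWh, from your bill

total_watts = bulbs * watts_per_bulb / efficiency   # draw at the wall, ~200 W
kwh_per_month = total_watts / 1000 * hours_per_day * 30
cost_per_month = kwh_per_month * rate_per_kwh

print(f"{kwh_per_month:.1f} kWh/month, about ${cost_per_month:.2f}/month")
```

Dividing by the efficiency (rather than multiplying by 1.2) is the slightly more exact version of the same correction: 1 / 0.80 = 1.25.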
Look at the label on the transformer box and find the INPUT amps.
I have a 500 W output Malibu timer/transformer and the label says: Input 120V @ 3.5A max.
Wattage is volts times amps, so in my case 120 * 3.5 = 420 watts, or 420/1000 = 0.42 kilowatts. From my electric bill the cost per kWh is $0.063029, so running mine at full load would cost about 2.6 cents an hour (0.42 * 0.063).
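The same label-based estimate as Python, using this poster's numbers (120 V, 3.5 A max input, $0.063029/kWh). Swap in the amps from your own transformer label and the rate from your own bill.

```python
# Cost per hour at full load, estimated from the transformer's input rating.
# Figures are the example values from this answer, not universal constants.
volts = 120.0
amps = 3.5                  # "Input 120V @ 3.5A max" on the label
rate_per_kwh = 0.063029     # dollars per kWh, from the poster's bill

input_kw = volts * amps / 1000          # 0.42 kW drawn at full load
cost_per_hour = input_kw * rate_per_kwh

print(f"{input_kw:.2f} kW input, about {cost_per_hour * 100:.1f} cents/hour")
```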
If you are only half loaded on the output, you will be drawing roughly half of the input rating, in my case about 210 watts. This is not totally accurate, but it is close enough to give you an idea. Remember that the transformer and timer consume power even when the timer has switched the lights off, so you will still be drawing a few watts around the clock.
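Putting the half-load and standby points together: a rough daily-energy sketch. The load fraction, hours, and the few watts of standby draw here are illustrative assumptions, not measurements.

```python
# Daily energy estimate: scaled input draw while on, plus standby draw.
# load_fraction and standby_watts are assumed for illustration.
input_watts_max = 420.0     # full-load input from the label (120 V * 3.5 A)
load_fraction = 0.5         # using half the transformer's rated output
hours_on = 3                # the original poster's 3 hours a day
standby_watts = 4.0         # assumed timer/transformer idle draw

watts_on = input_watts_max * load_fraction           # ~210 W while lit
wh_per_day = watts_on * hours_on + standby_watts * (24 - hours_on)
kwh_per_day = wh_per_day / 1000

print(f"about {kwh_per_day:.2f} kWh per day")
```

Note the standby term is small but not zero: over a month it adds up to a couple of kWh on its own.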