220 V table saws and ground

Proof? This is high school physics.

If you disagree, please be kind enough to show us some support instead of a vague "review a lamp catalog."

Reply to
J. Clarke

The assumption that the resistance will remain constant is a bad one. As has already been pointed out elsewhere in this thread, the resistance of a light bulb varies with the temperature of the filament. A colder filament will have a lower resistance. A lower resistance will result in a higher current and a higher power. The actual power at 120 volts will be somewhere between the 85 watts that you calculated and the 100 watts that it would dissipate at 130 volts.
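As a sketch of where those bounds come from, here is the constant-resistance end of the range worked out in a few lines (the 100 W and 130 V figures are the thread's example; real filament resistance drops when cooler, so actual power lands above this):

```python
# Constant-resistance estimate for a 100 W bulb rated at 130 V.
# A real filament has lower resistance when cooler, so the true power
# at 120 V falls between this figure and the 100 W rating.
RATED_V = 130.0
RATED_W = 100.0

r_hot = RATED_V**2 / RATED_W    # 169 ohms at the rated voltage
p_120 = 120.0**2 / r_hot        # ~85.2 W if resistance stayed fixed

print(f"hot resistance: {r_hot:.0f} ohms")
print(f"power at 120 V (constant R): {p_120:.1f} W")
```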
Reply to
Dan Coby

The standard number given is V^1.6. But the point is that a 100 watt bulb only draws 100 watts at its design voltage; it's not 100 watts at all voltages as Brainiac claims.
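The V^1.6 rule of thumb can be sketched in a couple of lines (the 130 V / 100 W figures are the thread's running example; the exponent is the rough empirical one quoted above, not an exact law):

```python
# Rule-of-thumb scaling: incandescent power goes roughly as V**1.6
# rather than V**2, because filament resistance rises with temperature.
def bulb_power(volts, rated_volts=130.0, rated_watts=100.0, exponent=1.6):
    """Estimate power drawn at `volts` for a bulb rated at `rated_volts`."""
    return rated_watts * (volts / rated_volts) ** exponent

print(f"{bulb_power(120.0):.1f} W")  # ~88 W: between 85 W (fixed R) and 100 W
```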

Reply to
J. Clarke

Please explain *CLEARLY* how increasing the thickness of any uniform substance can *increase* the resistance if everything else remains unchanged.

Hint: imagine a square wire of a fixed length, double its thickness and width, now explain to me the difference between that and four wires of the original thickness in parallel for 1/4 the resistance!
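The thought experiment checks out numerically; a minimal sketch using R = rho*L/A (arbitrary units, since only the ratios matter):

```python
# R = rho * L / A: for fixed length and material, resistance scales
# inversely with cross-sectional area. Doubling both the width and the
# thickness of a square wire quadruples A, so R drops to 1/4 -- exactly
# what four of the original wires in parallel would give.
RHO = 1.0   # arbitrary resistivity; only ratios matter here
L = 1.0     # fixed length

def resistance(side):
    return RHO * L / (side * side)

r_single = resistance(1.0)
r_doubled = resistance(2.0)                     # width and thickness doubled
r_four_parallel = 1.0 / (4 * (1.0 / r_single))  # four originals in parallel

print(r_doubled, r_four_parallel)   # both 0.25
```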

Reply to
IanM

Actually it isn't; it's temperature dependent. The resistance goes up with increasing temperature, so as the voltage increases the dissipated power doesn't rise as fast as a fixed resistance would predict.

Reply to
Stuart

No, that is the US standard.

Reply to
keithw86


Who said anything about a fixed length?

Reply to
keithw86

= 13*13 = 169 ohms hot

(or about 17 ohms cold, for surges).

120*120/169 = 85.2 watts.

You assume that the temperature, and thus the resistance, of the filament is the same at 130V as it is at 120V. This is certainly *not* true. At 120V, the lower filament temperature means not only that the bulb uses less power (though the reduction is less than your calculation predicts), but also that the bulb is less efficient (fewer lumens per watt), costing you money too.

Much longer, yes. Bulb life is a function of something like the 16th power of service voltage. It's still not saving money, unless there is a cost associated with replacement in addition to the bulb cost.
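Taking the rough exponent above at face value (the figure usually quoted for incandescents is closer to 12-13; the 16 is the poster's recollection), the life extension from underrunning a 130 V bulb at 120 V works out to:

```python
# Lamp life scales roughly as an inverse high power of applied voltage.
# The exponent is only approximate: 12-13 is commonly quoted, while the
# post above says "something like the 16th".
def life_multiplier(rated_v, applied_v, exponent=13.0):
    return (rated_v / applied_v) ** exponent

print(f"x{life_multiplier(130, 120):.1f} life at 120 V")           # ~2.8x
print(f"x{life_multiplier(130, 120, exponent=16):.1f} with n=16")  # ~3.6x
```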

Reply to
keithw86

surges).

And the cost of replacement must be huge. 1000 hours of use of a 100 W bulb will consume 100 kWh; I pay something like 11 eurocents per kWh, so the energy is going to cost me something like 11 euros. Higher-voltage filament bulbs would easily cost several times more for the same light output. I am moving to LED lights myself (not for energy efficiency - fluorescents are about the same - but for longer life). Now testing this:

formatting link
malm
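The cost arithmetic a few lines up can be sketched as (figures taken from the post: 100 W, 1000 hours, 0.11 EUR/kWh):

```python
# Operating cost dwarfs bulb cost: 1000 h of a 100 W bulb at 0.11 EUR/kWh.
WATTS = 100.0
HOURS = 1000.0
EUR_PER_KWH = 0.11

kwh = WATTS * HOURS / 1000.0   # 100 kWh of energy over the bulb's life
cost_eur = kwh * EUR_PER_KWH   # ~11 EUR, vs roughly 1 EUR for the bulb

print(f"{kwh:.0f} kWh -> {cost_eur:.2f} EUR")
```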

Reply to
Seismo R. Malm

snipped-for-privacy@gmail.com wrote: ...

There's the same fallacious assumption Lew made: there is a higher energy cost _only_ if one requires the same or more lumens. As I told Lew, for household lighting a 100W bulb is a 100W bulb and one gets the light one gets (at least that's what I do). It's good enough, and the bulbs last.

Sure, it's not much for an ordinary 100W bulb, so the convenience of not having to replace them is a factor, but there's no economic penalty associated w/ gaining that (again, assuming one doesn't go from 75W _to_ 100W per bulb).

--

Reply to
dpb

Thank you. My sub-panel is in a barn 80 feet from the house panel. I ran solid copper (#8?) to the ground stake just outside where the sub is mounted. The connection runs underground (in HDPE pipe) along with a 10Base-T cable and a coax.

On the suggestion (elsewhere): "Why not just take out the short 2 wire cord and throw it away, and simply attach the long 3 wire to the saw? "

I would say one needs to watch out that the longer "extension" cord is of a suitable gauge, as many tools come with a minimum-gauge "pigtail" which is OK if plugged directly into a suitable (amp-wise) outlet, but not if run through one of those 16 gauge extension cords - especially when they are twenty-five feet or more.

If you do re-wire with a longer cord, use at least 10 gauge wire with a ground (IMHO) to get the most power out of your tool. I use 20A-rated cords if "extending" to a table saw and the like. I've noticed severe slowing/loss of power when using lighter cords, and the cord (especially at the plug end) gets nice and toasty.

Reply to
Hoosierpopi

From GE 2006 large lamp catalog

100A 130V @ 130V: 100 watts,  750 hrs, 1680 lumens
100A 130V @ 120V:  89 watts, 1950 hrs, 1275 lumens

GE shows rated watts as 89 at 120V.
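The catalog figures quoted above make the efficiency penalty concrete; dividing them out gives lumens per watt at each voltage:

```python
# Efficacy (lumens per watt) from the GE catalog figures quoted above.
at_130 = {"watts": 100.0, "lumens": 1680.0, "hours": 750}
at_120 = {"watts": 89.0, "lumens": 1275.0, "hours": 1950}

eff_130 = at_130["lumens"] / at_130["watts"]   # 16.8 lm/W at rated voltage
eff_120 = at_120["lumens"] / at_120["watts"]   # ~14.3 lm/W underrun at 120 V

print(f"{eff_130:.1f} lm/W at 130 V, {eff_120:.1f} lm/W at 120 V")
```

So the 130 V lamp run at 120 V trades about 15% of its efficacy (and 24% of its light) for the 2.6x longer rated life.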

Depends whether you're more concerned about light level or lamp life. Since there are so many better options these days, it seems pointless to even use them.

Mike M

Reply to
Mike M

No fallacy at all. Need less light? Use a lower wattage, or fewer bulbs.

That's a big assumption. The fact is that we use light to see.

Reply to
keithw86

That's a different situation. It's not just a sub panel.

Well, yeah. Cords must be rated for the current drawn. 16ga is good for 13A, IIRC. I think the only 16GA extension cords I own are used only for lights. For (hand) saws I use only 12GA, even for just 25'.

12GA is fine. It's no different than the wiring in the house. A foot of 12GA in the wall is the same as a foot of extension. Yes, if the total run is too long, half of it in 10GA will help. I replaced the cord on my Unisaw with 15' of 12-3 SJ. There's probably 50' of 12-2 w/ G going back to the sub-panel from the wall. Changing the 15' from 12ga to 10ga isn't going to change anything. The saw starts with authority now. ;-)
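The "changing 15' from 12ga to 10ga isn't going to change anything" claim can be sanity-checked with a voltage-drop estimate. A rough sketch, assuming typical copper-wire-table resistances (about 1.6 ohms per 1000 ft for 12 AWG, 1.0 for 10 AWG) and a 15 A load; the 50 ft in-wall run and 15 ft cord are the lengths from the post:

```python
# Why swapping a short 12 AWG cord for 10 AWG barely matters: compare
# voltage drop over the whole run (in-wall wiring + cord, round trip).
# Per-foot resistances are approximate wire-table values for copper.
OHMS_PER_FT = {"12awg": 1.6e-3, "10awg": 1.0e-3}

def drop(amps, feet, gauge):
    return amps * 2 * feet * OHMS_PER_FT[gauge]   # x2 for the round trip

amps = 15.0
wall = drop(amps, 50.0, "12awg")       # fixed in-wall run, can't change
cord_12 = drop(amps, 15.0, "12awg")
cord_10 = drop(amps, 15.0, "10awg")

print(f"total with 12 AWG cord: {wall + cord_12:.2f} V")
print(f"total with 10 AWG cord: {wall + cord_10:.2f} V")  # ~0.3 V less
```

Out of roughly 3 V of total drop, the heavier cord recovers only about a quarter of a volt, which supports the post's point.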
Reply to
keithw86

snipped-for-privacy@gmail.com wrote: ...

Well, I can assert that in my case (the only one that actually matters to me :) ) it's not an assumption at all. I see fine using the 130V version of the same wattage-rated bulb as the 120V one, and as long as that is so, it's a win if they last longer...

If you or another finds that isn't the case, you'll/they'll have to handle it however you/they choose but that wouldn't negate my usage patterns nor increase my cost (which was the erroneous claim being made).

--

Reply to
dpb

The resistance varies inversely with the cross-sectional area of the conductor.

AWG 12 wire resistance = 1.619 ohms per 1000 ft. AWG 10 = 1.018 ohms per 1000 ft. AWG 8 = 0.6405 ohms per 1000 ft.

formatting link
heavier wire, less resistance.

So assume that a 100 watt bulb rated at 130V draws

100/130 = 0.769231 amperes of current at its rated voltage. From Ohm's law, one can then derive the resistance of the filament as (R=V/I) 130/0.769231 = 169 ohms.

Now run that same bulb at 120 volts: assuming the resistance stayed the same, the current in the filament (again per Ohm's law) would be (I=V/R) 120/169 = 0.710 amperes.

The power consumed (P=IV) would then be only about 85.2 watts, thus reducing the lumen output of the bulb.

Reply to
Scott Lurndal

Then why don't you use a 60W instead of a 100W, for example?

You're simply fooling yourself.

Reply to
keithw86

...

I hadn't looked up specific numbers; I only used the fact that the power actually drawn is what controls the operating cost, and that bulbs are rated for their power consumption at the stated voltage. Hence, running at our typical line voltage - higher than an ideal 120V but still rarely as high as 130V - causes a 130V bulb's power consumption to be less than its rating, albeit with a loss of lumens, although I really don't think that's terribly noticeable unless the lighting was already marginal.

Anyway, assuming the 1.6 exponent, the reduction factor would be 0.88 instead of 0.85, according to my trusty HP-97. In reality, I've never monitored our line voltage over a period of time (although, come to think of it, I have sufficient test gear that I could; I just never thought of doing so, since it is what it is and will continue to be), but based on the numbers I've noted when I did measure, I'd guess our average is around 124/125V. So the savings would be less than that in practice, but the factor _won't_ be >1.0.

--

Reply to
dpb

No, a 100W bulb is *not* a 100W bulb. Look at the rated output of the bulbs at the given voltage. You generally buy a light bulb for light (lumens), not heat (watts). If you have excess light, use a smaller bulb.

You assume that a 100W 130V bulb puts out *exactly* the light needed and that no less will do. Bad assumption.

Reply to
keithw86

----------------------------------------------
When it comes to an incandescent lamp filament, there is more than just the geometry (length and cross section) of the wire at work in determining the filament resistance.

Coatings on the wire, shape of the winding, the metal alloy are just a few of things that come into play to arrive at the final design.

For a given wattage of lamp, the lamp's total resistance increases as the square of the rated voltage (R = V^2/P).

Lew

Reply to
Lew Hodgett
