# (Another) Wiring Question

• posted on December 8, 2006, 4:39 pm

I need to install baseboard electric heating units in two rooms I'm refurbishing. One will need a 48" 2000 watt unit; the other a 36" 1500 watt unit.
The instructions in each box say that I will need to use a 220v circuit with "amperage according to local code".
Both rooms are next to each other and I'm wondering if instead of fishing two wires, I can go with one 10/2 30amp circuit and have both units branching off the main line. What might be the minimum wiring and amp circuit? What would be safest? What's the most amperage I can get out of a 10/2 line?
Related question: I've got tons of 12/3 wiring with ground lying around unused. Can I turn this into, say, 10/2 by simply clamping the black and red wires together at the panel and at the end point, and then painting the red wire with black marker pen to indicate power? I hate to waste wire with the cost of copper these days.
• posted on December 8, 2006, 4:58 pm
46erjoe wrote:

Nope
• posted on December 8, 2006, 9:29 pm
Basic Electricity 101: current flows in a circuit, and needs to return back to the source (in this case let's call it a breaker box) along a 10 gauge wire as well. Thus, by "clamping" red and black together you get a heavier conductor only in one direction (from the breaker box to the load) but still have a light conductor in the return path. You would need to start with 12/4 and "clamp" 2 pairs together to do what you want.
Smarty

• posted on December 8, 2006, 10:50 pm

Right so far...

But that's a violation of the National Electrical Code.
--
Regards,
Doug Miller (alphageek at milmac dot com)
• posted on December 8, 2006, 5:24 pm

220, or 240? It makes a difference. If the heater is rated 2000 watts at 220V, it will produce almost 2400W at 240V -- which is almost certainly the voltage that you actually have in your house. And never mind the installation instructions. Look at the rating plate on the heater.

You'd have to talk to a local electrical inspector to find out what your local code is.

Under the National Electrical Code, 10/2 is limited to 30A overcurrent protection (breaker or fuse), and continuous loads (such as electric resistance heating) are limited to 80% of the overcurrent rating -- which would be 24A for a 30A breaker. 24A at 240V is 5760 watts; your heaters total 3500, so one 10/2 30A circuit will be just fine.

NO. First off, that's a Code violation: connecting conductors in parallel is not permitted. Second, even if that was allowed, that would take care of only *one* of the two conductors in the circuit anyway. What about the other one? It would still be 12ga. And don't even think about doubling up the white and bare wires -- you could wind up making the case of the heater live.
However, if the heaters are rated 2000 and 1500 W at 240V, you don't need a 30A circuit anyway, and you can use 12ga wire: 20A * 240V * 80% = 3840 W, which is adequate for the heaters you have.
OTOH, if they're rated 2000 and 1500 W at 220V, and you run them on 240V, then you will need a 30A circuit, because the heaters will produce almost 20% more power: 2380 and 1785 watts respectively, for a total of 4165 watts -- too much for a continuous load on a 20A circuit.
This is why I asked above if they're rated at 220V or 240V. It does matter.
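A minimal Python sketch of the arithmetic in this post (the heater wattages and the NEC 80% continuous-load factor are taken from the thread; the function names are just illustrative):

```python
# Continuous-load capacity under the NEC 80% rule, and the effect of
# running 220V-rated resistance heaters on a 240V supply.

def continuous_capacity_watts(breaker_amps, volts):
    """Max continuous load in watts: 80% of the overcurrent rating."""
    return breaker_amps * 0.80 * volts

def rescaled_power(rated_watts, rated_volts, actual_volts):
    """Resistance is fixed, so power scales with the square of voltage."""
    return rated_watts * (actual_volts / rated_volts) ** 2

# 30A circuit at 240V: 24A continuous -> 5760W available.
print(continuous_capacity_watts(30, 240))   # 5760.0

# 20A circuit at 240V: 16A continuous -> 3840W available.
print(continuous_capacity_watts(20, 240))   # 3840.0

# Heaters rated 2000W and 1500W at 220V, but run at 240V:
p1 = rescaled_power(2000, 220, 240)
p2 = rescaled_power(1500, 220, 240)
print(round(p1), round(p2), round(p1 + p2))  # 2380 1785 4165
```

At 240V the total of 4165W exceeds the 3840W continuous limit of a 20A circuit, which is the whole point of asking whether the rating plate says 220V or 240V.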

That would be unnecessary -- red is assumed to be power anyway -- but as noted above, it's a Code violation, and it's not safe.

It's come down quite a bit since June. It's still more than double what it was two years ago, but I saw 250' of 12/2 NM at Home Depot last week for $67... and just a few months ago, it was over $100.
--
Regards,
Doug Miller (alphageek at milmac dot com)
• posted on December 8, 2006, 6:25 pm
46erjoe wrote:

Well, I can't think of a violation if you cut the black or red wire at each end, or capped it, and didn't use it in any part of the circuit. I am not sure about that, however. Of course you would not have 10/2; you would have 12/2.
Best bet is to find someone with extra 10/2 and work out a trade.
--
Joseph Meehan

Dia 's Muire duit
• posted on December 8, 2006, 6:47 pm
On Fri, 08 Dec 2006 18:25:10 GMT, "Joseph Meehan"

Unless you're only serving one device, two 12-AWG wires 240V apart at 20 amps will deliver more watts to the other end than one 10-AWG wire at 120V and 30 amps.
And the 12-AWG will probably even fit on the screw-terminals.
• posted on December 8, 2006, 6:57 pm

So what's your point here? The OP has 240V heaters. The notion of a 120V 30A circuit was never being discussed.
--
Regards,
Doug Miller (alphageek at milmac dot com)
• posted on December 8, 2006, 6:53 pm

Yes, and that might in and of itself be a violation, as I explained in an earlier post -- he might *need* a 30A circuit.
--
Regards,
Doug Miller (alphageek at milmac dot com)
• posted on December 9, 2006, 2:03 am
061208 1139 - 46erjoe posted:

If the heaters are not too far from the panel -- say, no more than 50 feet of wire -- you can use the 12 gauge wire, using just one circuit for both heaters.
2000/220 = 9.1 Amps
1500/220 = 6.8 Amps
9.1 + 6.8 = 15.9 Amps
Derating the 20 gauge wire to 80% would be 16 Amps.
Don't even think about paralleling the wiring. If the voltage drop is severe, then consider running two of the 3-wire cables -- one for each heater.
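The current figures above can be checked with a quick Python sketch (illustrative only, using this post's 220V ratings):

```python
# Heater currents at 220V, summed against the 80% continuous
# rating of a 20A circuit.
heaters_watts = [2000, 1500]
volts = 220

currents = [w / volts for w in heaters_watts]  # I = P / V
total = sum(currents)

print([round(i, 1) for i in currents])  # [9.1, 6.8]
print(round(total, 1))                  # 15.9 amps
print(20 * 0.80)                        # 16.0 amps continuous on a 20A breaker
```

15.9A squeaks in under the 16A continuous limit -- but only if the heaters really draw their rated current, which is why the 220V-vs-240V question matters.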
• posted on December 9, 2006, 3:34 am

Not so fast. If the heaters are rated at 220V as he stated (not 240V) but his service is actually 240V (as is very likely), the currents will be about 9% higher (9.9 and 7.4 amps, respectively) for a total of about 17.4 amps -- over the 16A continuous limit of a 20A circuit, requiring a 30A circuit because...

You mean 12 gauge / 20 amp, of course.
--
Regards,
Doug Miller (alphageek at milmac dot com)
• posted on December 9, 2006, 2:44 pm
061208 2234 - Doug Miller posted:

Ooops! Yes...
Maybe he should check his line voltage to be sure...
• posted on December 11, 2006, 1:14 am
Doug Miller wrote:

Think about what you are saying, Doug. A 2000 watt resistance heater stays a 2000 watt heater no matter what the voltage applied. Same for the 1500 watt heater. The resistance stays the same. Only the amps and volts are variables. The higher the volts, the lower the amps. Indago's calc was correct, as he figured it on the (worst case) lower voltage. One 20 amp circuit is sufficient for both baseboard heaters.
• posted on December 11, 2006, 1:34 am
volts500 wrote:

If it's a pure resistance element, then no, that's not true. The *RESISTANCE* stays constant; watts and amps increase with volts. V=IR and all that. Of course I know we're talking about AC circuits here so that will only be an approximation, but you get the general idea.
nate
--
replace "fly" with "com" to reply.
http://home.comcast.net/~njnagel
• posted on December 11, 2006, 2:13 am

That is *not* correct.
The *resistance* stays the same. Increase the voltage, and you increase the current. Increase the current and voltage, and you increase the power even more.
Suppose you have a 10-ohm resistance heating element, and operate it at 220V:
I = V / R = 220V / 10 ohm = 22 amps
P = VI = 220V * 22A = 4840 watts
Now operate the same element at 240V:
I = V / R = 240V / 10 ohm = 24 amps
P = VI = 240V * 24A = 5760 watts
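A quick Python sketch of this worked example (the 10-ohm element is just the illustration used here, not one of the actual heaters):

```python
# Same fixed-resistance element at two voltages:
# current and power both rise with voltage.
R = 10.0  # ohms, illustrative value from the example above

for volts in (220, 240):
    amps = volts / R           # I = V / R
    watts = volts * amps       # P = V * I
    print(volts, amps, watts)  # 220 22.0 4840.0, then 240 24.0 5760.0
```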

Right. The resistance stays the same. But the power output changes with voltage.

False. The higher the volts, the *higher* the amps.
V = IR
Increase V while holding R constant, and I goes *down*??? Hardly.

Wrong. The worst case would be figuring it at the higher voltage.

Not if they were rated at 220V but operated at 240V.
If they were rated at 240V, yes -- but that's the whole question here, isn't it? The installation instructions cited by the OP apparently referred to installing a 220V circuit, which raises the possibility that the heaters were rated at 220V and not at 240V. Operating them at a 9% higher voltage than that at which they were rated will cause them to draw 9% more current, and deliver nineTEEN percent more power, than rated.
--
Regards,
Doug Miller (alphageek at milmac dot com)
• posted on December 11, 2006, 2:48 am
061210 2113 - Doug Miller posted:

This is correct, and at this point I would connect a temporary line to the panel and place the heaters on a bench and turn on the power and check the voltage and current with a clamp-on ammeter at first surge, and then when the heater heats up, check the current again to get an accurate measure of just what would be demanded of the circuit. I am assuming the baseboard heaters are of the calrod class of heater, and the hotter the element gets, the more the resistance, and the lower the current.
• posted on December 11, 2006, 5:33 am
Doug Miller wrote:

My mistake, Doug, you are absolutely correct.
• posted on December 11, 2006, 3:53 am

What actually causes this to happen? The current through a constant resistance will increase as voltage increases.

--
15 days until the winter solstice celebration

Mark Lloyd
• posted on December 11, 2006, 11:12 pm
volts500 wrote:

P = IE (Power = Current * Voltage), plus Ohm's law: V = IR.
Since the resistance effectively stays the same (not exactly -- the higher voltage will create a little more heat, raising the resistance very slightly, but we can ignore the change for this example), when the voltage goes up, the current must also go up for Ohm's law to stay in balance. Also, by substituting Ohm's law into the power equation, we get P = E squared / R. So a 2000W heater at 220V has a resistance of 220 * 220 / 2000 = 24.2 ohms. ONLY the resistance can be treated as a constant, as it is the only physical quantity that doesn't (effectively) change. So at 240V, the POWER becomes 240 * 240 / 24.2, or approximately 2380W, and the current will be 9.9A, not 9.1. Since US standard line voltages are usually closer to 240 than to 220 in real life, using 240 for the calculations is the safest and most appropriate method. Regardless, as voltage rises in a circuit, if all physical constructs remain the same, the current MUST rise, and so will the power.
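A short Python sketch of the P = E squared / R arithmetic for a 2000W element rated at 220V and run at 240V (illustrative only):

```python
# P = E^2 / R for a fixed-resistance element rated 2000W at 220V.
rated_watts, rated_volts, actual_volts = 2000, 220, 240

R = rated_volts ** 2 / rated_watts   # 220^2 / 2000 = 24.2 ohms
p_actual = actual_volts ** 2 / R     # power at 240V
i_actual = p_actual / actual_volts   # current at 240V

print(round(R, 1), round(p_actual), round(i_actual, 1))  # 24.2 2380 9.9
```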
All of this goes out the window in other countries, power grids, etc. Take with a grain of salt, as this was based entirely on the laws of physics, and we all know that science is completely out of favor in the US.
Husky
• posted on December 9, 2006, 2:38 am
wrote:

You need to know the Full Load Amps. If you have the specs for the heaters it should be listed. If it is less than 16A then you can use one 20A circuit with number 12 wire.
If it is more than 16A, I think you are required to use 2 circuits.

No, but you can use 12/3 without connecting the white to anything (cap it, don't cut it).