Been using a 10-gauge cord to connect my little 2kW gennie in the garden
shed to the transfer switch in the garage.
No problems in maybe 20 hours of total usage, and the numbers look right
per http://tinyurl.com/nvkyq32 , which says the voltage drop will be 2.5%
at 15 amps (consistent with the 30-amp figure below, since the drop
scales with current). My understanding is that the voltage drop should
stay under 5%.
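FWIW, here's the same math as a little Python sketch instead of the web
calculator. The 0.9989 ohms per 1000 feet for 10 AWG copper is my number
from a standard wire table, not something pulled from the linked page:

    R_PER_1000FT = 0.9989  # ohms per 1000 ft, 10 AWG copper (standard table value)

    def voltage_drop(amps, one_way_ft, volts=120.0):
        # Single-phase AC: current travels out and back, so 2x the one-way run.
        drop = 2 * one_way_ft * amps * R_PER_1000FT / 1000.0
        return drop, 100.0 * drop / volts

    drop, pct = voltage_drop(amps=15, one_way_ft=100)
    print("%.2f V drop (%.2f%%)" % (drop, pct))  # ~3.00 V, ~2.50%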
Now I am going to add a second 2kW and run them in parallel (Honda
EU2000i + EU2000i Companion), viz: http://tinyurl.com/pjwqxbp
In outage mode, the house cruises on 800-1200 watts, so 95% of the
second gennie's function will be redundant backup in case there's a
problem with one unit during an outage.
But since I will have two... why not? Then we could fire up both
during mealtimes and accommodate a toaster or a coffee maker... or the
kitchen's big microwave.
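Rough load budget for the mealtime idea (the appliance wattages here are
my guesses, not nameplate numbers; the 3200 W continuous figure for the
pair is two EU2000i units at their 1600 W rated output each):

    BASE_W = 1200             # high end of the house's outage-mode cruise load
    PAIR_CONTINUOUS_W = 3200  # 2 x 1600 W rated (EU2000i continuous spec)
    appliances_w = {"toaster": 1500, "coffee maker": 1000, "big microwave": 1500}  # guesses

    for name, watts in appliances_w.items():
        total = BASE_W + watts
        verdict = "fits" if total <= PAIR_CONTINUOUS_W else "over"
        print("%-13s %4d W total, %4.1f A at 120 V -> %s" % (name, total, total / 120.0, verdict))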
With the two units connected in parallel, the power cord interface
changes from a regular 3-prong plug to a 30-amp L5-30 twist-lock plug.
Running the numbers (copper wire, 10 AWG, 120V, AC single phase, single
set of conductors, 100' distance, 30 amp load current) into
http://tinyurl.com/nvkyq32 , I get a voltage drop of 5.99 volts (4.99
percent).
Less than the 5% limit - and, real-world, the margin is more than that
0.01% sliver, because the two gennies will never be putting out a full
30 amps.
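Plugging the parallel numbers into the same voltage_drop sketch from
above, first at the full 30 A and then at my guessed worst mealtime draw:

    drop, pct = voltage_drop(amps=30, one_way_ft=100)
    print("%.2f V drop (%.2f%%)" % (drop, pct))  # ~5.99 V, ~4.99% - matches the calculator

    # Worst case from the budget above: 1200 W base + 1500 W toaster = 22.5 A.
    drop, pct = voltage_drop(amps=2700 / 120.0, one_way_ft=100)
    print("%.2f V drop (%.2f%%)" % (drop, pct))  # ~4.50 V, ~3.75%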
I'm thinking I can keep using the current 10-gauge cord by making up an
adapter with a female 3-prong receptacle on one end and an L5-30 male
plug on the other end.
That way, one cord does it whether I use one gennie or two; cord
winding/storage remains merely inconvenient instead of becoming
difficult; and I avoid shelling out $250-$300.
Anybody see a flaw in this reasoning?