Been using a 10-gauge cord to connect my little 2kW gennie in the garden shed to the transfer switch in the garage.
No problems in maybe 20 hours of total usage, and the numbers look right per the voltage-drop calculator. My understanding is that the voltage drop should be less than 5% at full load.
Now I am going to add a second 2kW and run them in parallel (Honda EU2000i + EU2000i Companion).
In outage mode the house cruises on 800-1200 watts, so 95% of the second gennie's job will be redundant backup in case there's a problem with one unit.
But since I will have two.... why not? Then we could fire up both during mealtimes and accommodate a toaster or a coffee maker... or the kitchen's big microwave.
With the two units connected in parallel, the power cord interface changes from a regular 3-prong plug to a 30-amp L5-30 twist-lock plug.
Running the numbers (copper wire, 10 AWG, 120 V, AC single phase, single set of conductors, 100' distance, 30 amp load current) through the voltage-drop calculator: the drop comes out just under the 5% limit. And real-world it's not just 0.1% under, because the two gennies will never be putting out a full 30 amps.
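For anyone who wants to sanity-check the math, here's a quick sketch of the drop calculation. The ~1.0 ohm per 1000 ft figure for 10 AWG copper is an approximation (published tables and online calculators assume slightly different resistivity values, which is why they can land a hair under 5% where this rounds right to it):

```python
# Voltage-drop sanity check: 10 AWG copper, 100 ft one-way run at 120 V.
# R_PER_1000FT is an approximate published value; online calculators
# may assume a slightly different resistivity.

R_PER_1000FT = 1.0   # ohms per 1000 ft, 10 AWG copper (approx.)
ONE_WAY_FT = 100     # shed to garage
VOLTS = 120.0

def drop_percent(amps: float) -> float:
    """Voltage drop as a percent of nominal, using round-trip conductor length."""
    resistance = R_PER_1000FT * (2 * ONE_WAY_FT) / 1000  # 0.2 ohms
    return 100 * amps * resistance / VOLTS

print(f"Single EU2000i at full load (16.7 A): {drop_percent(16.7):.1f}%")  # ~2.8%
print(f"Parallel pair at full load (30 A):    {drop_percent(30.0):.1f}%")  # ~5.0%
print(f"Typical house load (10 A):            {drop_percent(10.0):.1f}%")  # ~1.7%
```

At the house's usual 800-1200 watts (roughly 7-10 amps), the drop is well under 2%, so the 5% question only matters at mealtime peaks.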
I'm thinking I can keep using the current 10-gauge cord by making up an adapter with a female 3-prong receptacle on one end and an L5-30 male plug on the other end.
That way, one cord does it whether I use one gennie or two; cord winding/storage remains merely inconvenient instead of becoming difficult; and I avoid shelling out $250-$300.
Anybody see a flaw in this reasoning?