It's all a matter of 'history'. In effect, the 'standard' has changed over time.
The 'nominal value' for residential electricity delivery has climbed over the years.
Circa WW II, it was 110 VAC. By the mid-'50s, 115 VAC. By the mid-'60s, 117 VAC. Somewhere in the '70s, 120 VAC.
When somebody 'casually' refers to _any_ of those numbers, they're talking about 'contemporary residential electrical service', which is actually 120 VAC.
The difference among these voltages is, for _practical_ purposes, *not* significant, particularly among 115, 117, and 120.
As for the difference between "110" and "220" (or 115 and 230, or 120 and 240), those are just the two 'normal' voltages you get from the power company. From the transformer on the pole, they bring in _two_ 'hot' wires and a 'neutral'. From either 'hot' to neutral, you get 110 (actually, today, 120). Going from one hot to the _other_ hot, you get 220 (actually, today, 240).

The higher voltage is used to run 'high power' equipment (e.g. electric stove, electric water heater, electric clothes dryer) more efficiently. If you're going to run medium-big power tools, it is advantageous to run them on the higher voltage. Obviously, this requires that you _have_ that kind of power available where you plan to use the tools.
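Just to put numbers on the two-hots-and-a-neutral arrangement, here's a quick sketch. The two hot legs are 180 degrees out of phase with each other relative to neutral, which is why leg-to-leg you see double the leg-to-neutral voltage. The wattage and wiring-resistance figures at the end are purely illustrative, not from any particular appliance:

```python
import math

# Split-phase service: the pole transformer's center-tapped secondary gives
# two 'hot' legs, 180 degrees out of phase with each other, referenced to
# the grounded center tap (the 'neutral').
V_LEG = 120.0  # RMS volts, each hot leg to neutral (today's nominal)

def instantaneous(v_rms, phase_deg, t, freq=60.0):
    """Instantaneous voltage of a 60 Hz sine wave with the given RMS value."""
    return v_rms * math.sqrt(2) * math.sin(
        2 * math.pi * freq * t + math.radians(phase_deg))

def rms(samples):
    """Root-mean-square of a list of instantaneous samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# Sample one full 60 Hz cycle.
n = 10000
period = 1.0 / 60.0
ts = [i * period / n for i in range(n)]
leg_a = [instantaneous(V_LEG, 0, t) for t in ts]
leg_b = [instantaneous(V_LEG, 180, t) for t in ts]

# Hot-to-neutral is one leg; hot-to-hot is the *difference* of the two legs,
# and since they're 180 degrees apart the difference has twice the amplitude.
print(round(rms(leg_a)))                                   # -> 120
print(round(rms([a - b for a, b in zip(leg_a, leg_b)])))   # -> 240

# Why the higher voltage helps big loads: the same power at double the
# voltage means half the current, so one-quarter the I^2 * R loss in the
# wiring. Figures below are illustrative only.
P = 4800.0   # watts drawn by the load
R = 0.1      # ohms of wiring resistance
loss_120 = (P / 120.0) ** 2 * R   # 40 A squared * R = 160 W lost
loss_240 = (P / 240.0) ** 2 * R   # 20 A squared * R =  40 W lost
print(loss_120 / loss_240)        # -> 4.0
```

Same load, same wire: running it at 240 instead of 120 cuts the heat wasted in the wiring by a factor of four, which is exactly why the big appliances get the higher voltage.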
From the sounds of your set-up, it'd be worthwhile getting an electrician in, at least to estimate the cost of adding some extra circuits, whether of the 120 or 240 variety.