In this case, I'm asking how a watt figure obtained by taking an average RMS current reading over a few minutes with a clamp-on RMS ammeter on a supply phase, and multiplying that by 120 V to get VA, would differ from the "watts in use" being measured by the utility's billing meter over those same few minutes.
The question did not involve the various uncertainties about whether the meter employs "demand-meter" metering, or whether extrapolating a few minutes of average current draw is representative of an entire month's usage.
The answer, as posted by a few sane people who looked at the original question and gave an actual answer, is that yes, VA can be considered equivalent to watts when all loads are resistive and the power factor is unity (i.e., 1).
To the extent that my facility's appliances, devices and equipment present reactive (i.e., inductive) loads, my VA == watts measure *will* be an over-estimate of what the meter is registering during the time-frame of the measurement.
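To make that arithmetic concrete, here's a minimal sketch (Python, with made-up numbers for the clamp reading and the overall power factor, since neither was part of the original question) showing how the VA figure and the metered watts diverge as power factor drops below unity:

```python
# Hypothetical clamp-on reading and power factor -- purely illustrative values.
i_rms = 8.5          # average RMS current from the clamp meter, in amps (assumed)
v_nominal = 120.0    # nominal supply voltage used in the original question
pf = 0.92            # assumed overall power factor of the building's loads

apparent_power_va = i_rms * v_nominal   # what my clamp-meter method yields
true_power_w = apparent_power_va * pf   # what the utility meter actually registers

overestimate_pct = (apparent_power_va - true_power_w) / true_power_w * 100
print(f"VA estimate:   {apparent_power_va:.0f} VA")
print(f"Metered watts: {true_power_w:.0f} W")
print(f"Over-estimate: {overestimate_pct:.1f} %")
```

With those assumed numbers the clamp-meter method over-reads the billed watts by roughly 9%; at unity power factor the two figures would be identical.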
That over-estimation will most likely not exceed 10%, because the worst offenders for low power factor are small fractional-hp electric motors, of which I have two (one ceiling ventilation fan in each bathroom of this building), and I would estimate their use at perhaps 3 hours per day. In addition to those two fans, the building's furnace has a 220 VAC furnace fan motor which I would estimate at 3/4 to 1 hp, with daily usage (for the present time) of 1 hour per day or less.
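As a rough sanity check on that "probably under 10%" claim, here's a back-of-the-envelope sketch (again Python, and again with assumed wattages and power factors, since I haven't measured these motors individually) that weights each load class by its power factor:

```python
# Illustrative load mix -- the wattages and power factors below are all assumptions.
loads = [
    # (name, watts drawn, power factor)
    ("bathroom fans (2x fractional-hp)", 150, 0.60),  # low-PF small motors, assumed
    ("furnace fan (~3/4 to 1 hp)",       700, 0.80),  # assumed
    ("resistive / near-unity loads",    3000, 1.00),  # lighting, heating, etc., assumed
]

true_w = sum(w for _, w, _ in loads)
apparent_va = sum(w / pf for _, w, pf in loads)  # VA each load class draws

print(f"True power:     {true_w:.0f} W")
print(f"Apparent power: {apparent_va:.0f} VA")
print(f"Over-estimate:  {(apparent_va - true_w) / true_w * 100:.1f} %")
```

With those assumed figures the over-estimate comes out around 7%. Strictly speaking, VA from loads with different power factors doesn't add arithmetically (the clamp meter sees the phasor sum of their currents), so this simple sum slightly over-states the total VA, which only strengthens the under-10% bound.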
I've been using it since 1988 (that's about 23 years now).
And I've also seen far too many questions asked only to have the thread hijacked by others who want to take it off on a tangent because they don't know how to answer the original question, but feel a compulsion to post something to prove they're worthy or relevant to the group.