The UK power supply wasn't harmonised in any way. The tolerances it works to were simply increased to bring in the more common European voltages.
Nothing was changed except the tolerances. What you get out of the socket now is exactly what you got before the "harmonisation". By changing the tolerances, all that has happened is that manufacturers now have to produce equipment capable of working anywhere from 220V to 240V. IIRC, the nominal value is now 230V +/- 10% rather than 240V +/- 6%. As you will notice, this covers the 220V used on the continent, 230V in Ireland and 240V here.
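As a quick sanity check of those bands (just the arithmetic on the figures quoted above, not a statement of the standard itself):

```python
# Quick check of the tolerance bands mentioned above (figures as quoted
# in the post, not an authoritative reading of the standard).
def band(nominal, plus_pct, minus_pct):
    """Return the (low, high) voltage band for a nominal value and tolerances."""
    return nominal * (1 - minus_pct / 100), nominal * (1 + plus_pct / 100)

old_low, old_high = band(240, 6, 6)    # old UK spec: 240V +/- 6%
new_low, new_high = band(230, 10, 10)  # harmonised spec: 230V +/- 10%

print(f"Old 240V +/-6%:  {old_low:.1f}V to {old_high:.1f}V")   # 225.6V to 254.4V
print(f"New 230V +/-10%: {new_low:.1f}V to {new_high:.1f}V")   # 207.0V to 253.0V
# The new band takes in 220V, 230V and 240V nominal supplies alike.
```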
To change the true value of Voltage at your socket would require a lot of expensive re-engineering of the generation and supply networks and is obviously not going to be done any time soon!
I think there's a section in the FAQ about this.
As to exactly how a meter works, and whether it continues to charge you correctly when what is coming through isn't exactly 240V, I know that others here are much more capable of answering that one.
Hwyl!
M.
No, they calculate instantaneous power, in terms of RPM of the disk. They self-adjust to any voltage fluctuations.
Nah, they calculate energy usage on the basis of the actual voltage and the actual current.
What about electronic ones (we have a tiny white thing with a flashing led, which flashes faster the more power you draw)?
The price per unit may not change, but the real power consumed by any appliance with an element will change with voltage, so the same item will still cost varying amounts of money to run...
Ok the difference will be tiny :-)
Lee
Power consumed is V x A = W. An electric meter is simply a recording wattmeter. If the voltage were to change, which it does (fluctuations up and down), the meter would still record the power consumed, and it's the power consumed that you pay for.
Steve R
To be absolutely correct, the old mechanical meters measure V x I x cos(phi) (phi being the phase angle between V and I), so they are not sensitive to poor power factor equipment, even though the power company has to supply the full V x I. (You could stick a capacitor across the supply and draw lots of current the meter can't see!)
I believe the newer electronic ones measure V x I regardless of phase, i.e. the full volt-amps, reactive component (kvar) and all.
Hence if you have a new electronic meter with poor power factor equipment installed you now pay for electricity you can't actually use! (which is actually fair).
They do even better than that -- they integrate V x I x power factor, and don't only handle cos(phi) phase shifts, but other causes of low power factor too such as harmonic content.
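A rough numerical sketch of that integration (all values illustrative): sample one 50 Hz cycle, multiply v(t) by i(t), and average. A third-harmonic component in the current raises the RMS current, and hence the apparent power, without contributing any real power, so the power factor comes out below the plain cos(phi) of the fundamental:

```python
import math

# Sketch of what an electronic meter effectively does: integrate v(t)*i(t)
# over a cycle to get true power, which picks up phase shift AND harmonic
# distortion. All figures here are illustrative, not from any real meter.
N = 10000                      # samples per 50 Hz cycle
V_PK = 230 * math.sqrt(2)      # clean sinusoidal 230V RMS supply
phi = math.radians(30)         # fundamental current lags by 30 degrees

v = [V_PK * math.sin(2 * math.pi * n / N) for n in range(N)]
# 10A RMS fundamental plus a 5A RMS third harmonic; the harmonic draws no
# real power from a sinusoidal supply, but it does raise the RMS current.
i = [10 * math.sqrt(2) * math.sin(2 * math.pi * n / N - phi)
     + 5 * math.sqrt(2) * math.sin(3 * 2 * math.pi * n / N)
     for n in range(N)]

true_power = sum(a * b for a, b in zip(v, i)) / N           # watts
v_rms = math.sqrt(sum(a * a for a in v) / N)
i_rms = math.sqrt(sum(a * a for a in i) / N)
apparent = v_rms * i_rms                                    # volt-amps
print(f"True power:     {true_power:.0f} W")    # ~1992 W (230 x 10 x cos 30)
print(f"Apparent power: {apparent:.0f} VA")     # ~2571 VA (230 x 11.18)
print(f"Power factor:   {true_power / apparent:.2f}")  # 0.77, below cos(30) = 0.87
```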
Except you don't actually get or use any power (except cable losses, and you will be paying for the loss in cables after your meter).
Penalising for low power factor is only allowed for industrial customers. In the domestic market, it's handled by regulations which limit the harmonic current waveform domestic appliances are allowed to draw.
Bearing in mind most domestic equipment is fairly inductive, adding capacitors brings the pf closer to unity. A better idea would be to remove existing caps and add chokes - hell, you could push it out to 90 degrees with a bit of effort!
Richard
Oh good, I hope it will be quietly dropped and forgotten about.
P = V^2/R, so with resistive loads +/-10% on the volts is roughly +/-20% on the power. An overall difference of a whopping 40% from worst to best case.
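For concreteness, a quick sketch of that arithmetic (the element resistance is an assumed figure for a nominal 2 kW load at 230V):

```python
# Rough check of the resistive-load arithmetic above: P = V^2 / R, so for a
# fixed element resistance the power scales with the square of the voltage.
R = 26.45  # ohms -- a nominal 2 kW element at 230V (230^2 / 2000), assumed

for volts in (216, 230, 254):
    power = volts ** 2 / R
    print(f"{volts}V -> {power:.0f} W ({power / (230**2 / R) - 1:+.0%} vs 230V)")

# 216V -> 1764 W (-12% vs 230V)
# 230V -> 2000 W (+0% vs 230V)
# 254V -> 2439 W (+22% vs 230V)
# Worst to best case: 254^2 / 216^2 ~ 1.38, i.e. roughly 40% difference.
```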
Since a heating element is as near as dammit 100% efficient, you'll not pay any more for the electricity required to heat to the same temperature - unless heat losses over time come into it.
It doesn't alter the fact that you are still only being charged for what you use. True that heating appliances running at 254 volts will use more energy than those operating at 216 volts, and consequently give out more heat, but the meter will still record the energy consumed. The difference for modern electronic equipment *is* minimal.
Which is sort of what I meant, when I said the difference would be tiny, thermostats and all that :-)
Lee
Mmm. Think light bulbs, though. Or power drills. Or any unregulated appliance that doesn't switch off when it gets hotter...
I believe that LIDL had cheap, 13A power consumption meters in store this week. These devices can be surprisingly accurate. I checked one of a similar type against an all-singing, all-dancing £nK test set some years back and they agreed within 1%. If you really want a good(?) laugh, just go round the house (takes a few days) and measure the standby consumption of, say, TV, answerphone, PC, clocks etc. The results can be surprising. My answerphone/fax was costing £50 a year to run!
Regards
Capitol
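For anyone wanting to turn a standby-wattage reading into pounds, a back-of-envelope sketch. The 7p unit price and the 80W standby figure are assumptions for illustration, not measured values:

```python
# Back-of-envelope standby cost from a plug-in meter reading.
# The unit price and wattage below are assumptions -- substitute your own.
PRICE_PER_KWH = 0.07   # pounds per unit, hypothetical tariff
HOURS_PER_YEAR = 24 * 365

def annual_cost(standby_watts):
    """Yearly cost in pounds of leaving a device drawing this many watts."""
    return standby_watts / 1000 * HOURS_PER_YEAR * PRICE_PER_KWH

# A fax/answerphone idling at ~80W comes out around 49 pounds a year at
# 7p a unit -- the right order of magnitude for the figure quoted above.
print(f"{annual_cost(80):.2f}")
```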
Your assumption is wrong. Standard UK watt-hour meters measure the time-integrated product of current AND voltage, i.e. the energy actually consumed. The meter has a voltage coil and a current coil. AFAICR the left-hand input connects to the right-hand output via the current coil within the meter, the two middle (neutral) connectors loop straight through to each other, but a voltage tapping is made which connects to the current path via the voltage coil. Basically the meter doesn't care what voltage and current are applied to it; it reads the power anyway.
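A toy model of that behaviour: the meter just accumulates volts x amps over time, so it registers whatever energy is actually drawn, at whatever voltage the supply happens to sit at. A sketch, with an assumed element resistance:

```python
# Toy model of the watt-hour meter described above: it integrates the
# product of instantaneous voltage and current, so it bills the energy
# actually used whatever the supply voltage does. Illustrative only.
def registered_kwh(samples, dt_hours):
    """samples: list of (volts, amps) readings taken every dt_hours."""
    return sum(v * i * dt_hours for v, i in samples) / 1000

# A 26.45-ohm element (nominal 2 kW at 230V, assumed) run for an hour
# while the supply voltage drops from 240V to 220V halfway through:
R = 26.45
dt = 1 / 60  # one-minute samples
supply = [240 if minute < 30 else 220 for minute in range(60)]
samples = [(v, v / R) for v in supply]   # Ohm's law: I = V / R
print(f"{registered_kwh(samples, dt):.2f} kWh")  # 2.00 kWh
# More watts drawn (and metered) at 240V, fewer at 220V -- either way the
# reading is the energy the element really consumed.
```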
Fluddy buck mate!
Mmm.. mine costs better than £500 a year and that's not including the electric (with a wife and 2 daughters I think I'm getting off lightly). Richard