This just blows my tiny mind. The latest atomic clock is accurate to within
1 second in 5 billion years. Under 3 seconds for the entire life of the universe since the big bang. Slightly better than the 8 quid digital watch I bought off eBay a few years ago.
Which in turn is probably more accurate than the hundred quid watch that my grandad was given when he retired in the 1950s.
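As a rough cross-check of the headline figure, "1 second in 5 billion years" works out to a fractional uncertainty of a few parts in 10^18. This is a back-of-envelope sketch, not a number taken from the article:

```python
# Back-of-envelope: express "1 second lost in 5 billion years"
# as a fractional frequency uncertainty.
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # ~3.16e7 s
total_seconds = 5e9 * SECONDS_PER_YEAR     # ~1.6e17 s in 5 billion years
fractional_error = 1.0 / total_seconds
print(f"fractional accuracy ~ {fractional_error:.1e}")  # ~6.3e-18
```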
There's an interesting note in the article saying that because the second is defined by a caesium clock, that is the only type of clock that can be said to be completely accurate at the moment. All other clocks' accuracies have to be quoted by reference to a caesium clock.
The thing is, though, what is it accurate to? If the effect we see is the Universe speeding up in its expansion, this could also be interpreted as time itself changing; but as we are inside the universe, we cannot tell which it is. That is, of course, if time has any meaning outside of it. Brian
It's not accurate *to* anything; rather, it's been defined as the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium-133 atom.
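Taken literally, that definition makes one period of the caesium radiation roughly 109 picoseconds; a quick sketch:

```python
# One period of the Cs-133 hyperfine radiation, straight from the SI
# definition of the second (9,192,631,770 periods per second).
CS_PERIODS_PER_SECOND = 9_192_631_770
period_s = 1.0 / CS_PERIODS_PER_SECOND
print(f"one period ~ {period_s:.4e} s")  # ~1.088e-10 s, about 109 ps
```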
It depends on what is meant, in any particular statement, by accuracy. A clock can be used to provide a local representation of absolute time, such as is used to see when a train should leave a station; and it can also be used to measure an interval, such as when boiling an egg.
Since the SI second depends on caesium, no clock can realise the SI second more accurately than the best possible collection of caesium clocks.
But the best modern clocks produce a much more regular rate than the best caesium clocks - they provide a more accurate time scale but with uncertain calibration.
What are these best modern clocks?
No they can't, because the second is defined by the caesium atom at 0 K, and since there's no such thing as 0 K, everything has to have compensation applied if it's not a caesium-133 atom at 0 K. Yes, even caesium at room temperature would be inaccurate when compared with it at 0 K.
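That room-temperature correction can be put in numbers via the blackbody-radiation shift of the Cs-133 transition. The coefficient of roughly -1.7e-14 at 300 K and the T^4 scaling used below are the commonly quoted approximation, not figures from this thread:

```python
# Illustrative sketch of why a room-temperature caesium clock needs a
# correction back to the 0 K definition: the blackbody-radiation shift.
# The -1.7e-14 coefficient at 300 K and the T^4 scaling are the commonly
# quoted approximation.
def bbr_shift(temp_k: float) -> float:
    """Approximate fractional frequency shift of Cs-133 at temp_k kelvin."""
    return -1.7e-14 * (temp_k / 300.0) ** 4

print(bbr_shift(300.0))  # shift at room temperature, ~ -1.7e-14
print(bbr_shift(0.0))    # zero at the defining temperature
```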
Yep, all we've done is make a really accurate abacus that counts an arbitrary number of things that make up what we call a "second", a unit that the universe couldn't give a fig about.
Only if you accept that the purpose of the SI definition of the second is entirely wedded to the physical limitations of Caesium (which has been the most reliable frequency standard for quite a while now).
Once an even more stable and long term reliable frequency standard for clock time comes along then it will supplant the old gold standard.
H-maser (old technology) and the newest Ytterbium and now Strontium atomic clocks all have better precision and stability. The former have been used for VLBI for a very long time now and the latter through using a higher frequency can offer a more precise definition. H-masers are really good over shorter periods but glitch sometimes.
The internationally accepted definition of the second has altered over the years as ever more precise methods for measuring time have been developed. It is highly likely that, if the new clock is as well behaved as they claim, the SI unit of time will be redefined in terms of this higher-frequency, more reproducible and precise reference standard.
The fact that the new standard is more intrinsically stable means that errors in the ensemble of clocks that determine current atomic time based on caesium can, in principle and by the looks of it in practice, be better measured against this new frequency reference standard.
NB we are into territory here where adjusting the feet on the lab bench would move the emitter wrt the Earth's gravitational field and so affect the frequency.
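The bench-feet remark is no exaggeration at this level of precision: in the weak-field approximation, a height change h shifts a clock's rate by about g*h/c^2, i.e. parts in 10^18 per centimetre. A sketch using the standard formula with assumed round values:

```python
# Weak-field gravitational time dilation: raising a clock by h metres in
# Earth's gravity changes its rate by roughly g*h/c^2. The value of g is
# an assumed round figure for illustration.
G_SURFACE = 9.81           # m/s^2, approximate surface gravity
C = 299_792_458.0          # m/s, speed of light

def rate_shift(h_m: float) -> float:
    """Fractional frequency shift for a height change of h_m metres."""
    return G_SURFACE * h_m / C**2

print(f"per centimetre: {rate_shift(0.01):.2e}")  # ~1.1e-18
```

So a clock with 1e-18-level stability can literally resolve a centimetre of height on the lab bench.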
It started out as 1/86400th of a mean solar day. Blame the Babylonians for that: 24x60x60. Unfortunately the true sun moves differently, because Earth's orbit is elliptical and because of the periodic effects of the moon. So if you take photos of the sun at local mean solar noon, the real sun traces a very pretty pattern in the sky called an analemma (which is not helpful for using it as a standard time reference).
formatting link
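For completeness, the Babylonian arithmetic behind that original definition:

```python
# 24 hours x 60 minutes x 60 seconds: the mean solar day behind the
# original definition, making the pre-caesium second 1/86400 of a day.
seconds_per_day = 24 * 60 * 60
print(seconds_per_day)                  # 86400
print(f"{1 / seconds_per_day:.3e}")     # the second as a fraction of a day
```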
Astronomers in practice defined the second in terms of the sidereal day - that is, successive transits of a fixed star were once the definitive standard, until precision temperature-compensated clocks came along.
Greenwich were masters of the transit art which involved observing a star directly and reflected in a pool of mercury under the scope. It is as a result of their expertise that we have the Greenwich Meridian line (now sadly devalued to a rather poor brand of gin).
See:
formatting link
Unfortunately, once clocks reached ovened quartz crystal accuracy and then atomic transitions, it became clear that the Earth's rotation is not uniform and varies with seasonal and various other perturbations.
Ah, but as more nothingness is pumped into the universe to make it appear to be expanding faster, this surely changes the spaces inside atoms as well, and hence the accuracy; but of course, as everything changes, it's only over huge distances that we notice the change. Brian