When Was The Atomic Clock Invented?
In 1945, Columbia University physics professor Isidor Rabi suggested that a clock could be made from a technique he developed in the 1930s called atomic beam magnetic resonance. By 1949, the National Bureau of Standards (NBS, now the National Institute of Standards and Technology, NIST) announced the world’s first atomic clock using the ammonia molecule as the source of vibrations, and by 1952 it announced the first atomic clock using cesium atoms as the vibration source, NBS-1.
In 1955, the National Physical Laboratory in England built the first cesium-beam clock used as a calibration source. Over the next decade, increasingly refined cesium clocks were built. In 1967, the 13th General Conference on Weights and Measures defined the SI second on the basis of vibrations of the cesium atom; from that point on, the world’s timekeeping system no longer had an astronomical basis. NBS-4, the world’s most stable cesium clock of its era, was completed in 1968 and was used into the 1990s as part of the NIST time system.
In 1999, NIST-F1 began operation with an uncertainty of 1.7 parts in 10^15, or accuracy to about one second in 20 million years, making it the most accurate clock ever made (a distinction shared with a similar standard in Paris).