Atomic Clocks, Accuracy, and (Re)Defining the Second

Today, time as we know it is kept via an intricate network of approximately 400 atomic clocks operating worldwide. Recently, however, scientists have developed a timekeeping system that may one day make the atomic clock obsolete.

NIST’s first atomic beam clock, 1949

But before we get to that, let’s first take a look at time itself. Fundamentally speaking, time is measured by tracking the intervals of something happening repeatedly, e.g. the swings of a pendulum in a grandfather clock. The less variation there is within a timekeeping system, the greater the resulting accuracy. A second was once defined as 1/86,400 of the mean solar day, but due to irregularities in the Earth’s rotation it was eventually deemed an imprecise way to measure time. Nevertheless, it was the definition used by astronomers well into the ‘50s. Atomic clocks came about when scientists realized that time could be measured more precisely by tracking the movements of something more consistent and not so influenced by outside forces. That something was the atom.

Because every atom has characteristic oscillation frequencies determined by its own internal structure, atomic clocks are based on observable oscillations at the atomic scale, specifically the radiation associated with the transition between two energy states of an atom. Atomic clocks have existed in one form or another since the late ‘40s, but it wasn’t until we began using cesium-133, an isotope of the element cesium, as our oscillation source that we achieved the accuracy we see in modern atomic clocks (some less precise designs are based on hydrogen and rubidium).

Atomic clocks do not rely on atomic decay, so they’re not radioactive. Instead, they count cycles of the radiation produced as the atom transitions between states, and since 1967 the International System of Units has defined the second as the duration of 9,192,631,770 cycles of the radiation produced by the transition of cesium-133 between two hyperfine energy levels.
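That definition is, in effect, a recipe for counting. As a rough illustrative sketch (the constant is the SI value; the helper function is hypothetical, not from any real clock software), a device that can count cycles of the cesium-133 transition radiation can convert that count directly into elapsed time:

```python
# SI definition: 9,192,631,770 cycles of cesium-133 transition radiation = 1 second
CS133_TRANSITION_HZ = 9_192_631_770

def seconds_from_cycles(cycles: float) -> float:
    """Convert a count of cesium-133 transition cycles into SI seconds."""
    return cycles / CS133_TRANSITION_HZ

# Counting a full day's worth of cycles recovers exactly 86,400 seconds:
cycles_per_day = CS133_TRANSITION_HZ * 86_400
print(seconds_from_cycles(cycles_per_day))  # 86400.0
```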

Louis Essen developed the first accurate cesium atomic clock at the National Physical Laboratory, 1955

In the ‘30s, Columbia University physics professor Isidor Rabi developed a technique called atomic beam magnetic resonance, which allowed the magnetic properties of atoms to be observed and recorded. Building on his work, the National Bureau of Standards (now the National Institute of Standards and Technology) built the world’s first atomic clock in 1949, using ammonia as the vibration source. Three years later the NBS announced the first cesium-based clock, the NBS-1. One of its eventual successors, the NBS-4, was completed in 1968 and remained among the most stable cesium clocks well into the ‘90s. Today, the United States operates the most accurate cesium atomic clock ever built, the NIST-F2, unveiled in 2014.


Current atomic clocks are off by no more than a single second every hundred million years. That raises the question: why would we ever need such accuracy when our daily lives are so inexact? It turns out there are plenty of critical applications that rely on the highly synchronized timekeeping atomic clocks afford, among them satellite navigation, telecommunications, and even international financial markets. With some satellites, for example, synchronization needs to be accurate to within 1/100th of a microsecond to avoid positioning errors; because the signals travel at the speed of light, even a single microsecond of discrepancy translates into an error of hundreds of meters.
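Those numbers are easy to sanity-check. Here is a back-of-the-envelope sketch (the constants are standard values; the function name is ours, purely illustrative) converting "one second per hundred million years" into a fractional error, and a timing error into a distance error for a light-speed ranging signal:

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # ~3.16e7 seconds
SPEED_OF_LIGHT_M_S = 299_792_458           # exact, by definition of the meter

# One second of drift per hundred million years, as a dimensionless fraction:
fractional_error = 1 / (100e6 * SECONDS_PER_YEAR)
print(f"{fractional_error:.1e}")  # ~3.2e-16

def range_error_m(timing_error_s: float) -> float:
    """Distance error implied by a timing error in a light-speed ranging signal."""
    return timing_error_s * SPEED_OF_LIGHT_M_S

print(f"{range_error_m(1e-6):.0f} m")   # a full microsecond: ~300 m
print(f"{range_error_m(1e-8):.1f} m")   # 1/100th of a microsecond: ~3 m
```

A microsecond of clock error thus maps to roughly the length of three football fields, which is why satellite timing budgets are specified in nanoseconds.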

Accuracy aside, atomic clocks wouldn’t be of much use without a way to communicate their time signals. To that end, NIST’s Time and Frequency Division and the U.S. Naval Observatory maintain a system of radio transmitters near Fort Collins, Colorado, designed to broadcast these signals within 2,000 miles of their location. These broadcasts are critical to the operation of multiple navigational systems, and they’re what radio-controlled wristwatches in the United States use to calibrate themselves. Speaking of wristwatches, it’s worth noting that Hawaiian watch brand Bathys created the first recognized atomic wristwatch, the Cesium 133, using a “chip-scale” cesium-based oscillator.


Scientists have since developed optical clocks, which are even more accurate than cesium atomic clocks. These devices track atoms or ions whose reference transitions oscillate at optical frequencies, roughly 100,000 times higher than the microwave frequencies used in cesium clocks, allowing for even greater precision. Theoretically, such a clock would lose only a second every three hundred million years. The downside, however, is that optical clocks are incredibly complex machines, and as a result they suffer periods of downtime during which they cannot be used at all, which has limited their real-world application.

Strontium Optical Lattice Clock at the National Physical Laboratory, United Kingdom

In a recent study published in Optica, however, researchers proposed a solution to this problem. Their idea? A system blending a commercially available atomic clock called a “maser” with a strontium optical lattice clock, creating a redundancy that compensates for the latter’s downtime. The team ran the system for 25 days, and even with two days of downtime they found that the clock would lose only 100 seconds over 14 billion years; to put that into perspective, that’s roughly the known age of the Universe.
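Taking the figures exactly as quoted, the article’s two performance claims can be put on the same footing as dimensionless fractional errors (a rough sketch; the helper function is hypothetical):

```python
SECONDS_PER_YEAR = 365.25 * 24 * 3600

def fractional_error(seconds_lost: float, over_years: float) -> float:
    """Clock drift expressed as a fraction of total elapsed time."""
    return seconds_lost / (over_years * SECONDS_PER_YEAR)

# Cesium standard, as quoted earlier: one second per hundred million years
print(f"{fractional_error(1, 100e6):.1e}")    # ~3.2e-16
# Hybrid maser + strontium lattice timescale: 100 seconds per 14 billion years
print(f"{fractional_error(100, 14e9):.1e}")   # ~2.3e-16
```

On these quoted numbers the two are of the same order; the hybrid system’s contribution, per the study, is that it holds this level of performance as a continuously running timescale even through the optical clock’s downtime.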

The paper also argues that, should this become a viable timekeeping standard in the years to come, a redefinition of the second may be required to reflect the increased accuracy. That said, the technology is still in its early stages, so we won’t see a redefinition in the next couple of years, though likely within our lifetimes.

Ilya is Worn & Wound's Managing Editor and Video Producer. He believes that when it comes to watches, quality, simplicity and functionality are king. This may very well explain his love for German and military-inspired watches. In addition to watches, Ilya brings an encyclopedic knowledge of leather, denim and all things related to menswear.