Time waits for no man. Keeping this in mind, we set about inventing clocks to synchronize ourselves with the cycles of the sun and moon. It was an impressive leap, beginning thousands of years ago, when the Egyptians came up with sundials around 2100 BC. Sundials, or shadow clocks, first used by the Sumerians, worked on the premise of measuring the length of shadows to deduce the time of day. Weather played spoilsport on cloudy days, and when the months changed, shadows would no longer correspond with the markings. The Romans tried to do better by pilfering Cleopatra’s Needles, the obelisks the Egyptians had used as shadow clocks, but had to be content with town criers announcing the passing time. Around 325 BC, the water clock followed the sundial; a water clock was basically a bucket of water with a hole in the bottom, recording the slipping away of time, though not in hours. Various contraptions and models followed, ultimately resulting in clocks.
The word clock has its genesis in the French word “cloche,” meaning bell. The first clocks used weights to move gears, which in turn moved the hands. The one problem was that someone had to keep resetting the weights, until the falling weight was checked by an early escapement: an oscillating horizontal bar attached to a vertical spindle with protrusions that engaged the gears. Soon, springs replaced weights, shrinking clocks so that they could be carried, kept on a mantelpiece, or hung as wall clocks. Mechanical clocks and watches gave way to electronic timepieces driven by quartz crystals, later to be surpassed by atomic clocks.
Accuracy is the hallmark of atomic clocks, which have proved more reliable and uniform than time deduced from the rotation of the earth. Atomic clocks operate by measuring the resonant frequency of a given atom, e.g., cesium, hydrogen, or mercury, achieving an exactness of better than a billionth of a second per day. It is this accuracy that has made atomic clocks the dependable reference for domestic, scientific, and public timekeeping.