# It is claimed that two cesium clocks may differ by only about 0.02 s in 100 years

Question:

It is claimed that two cesium clocks, if allowed to run for 100 years, free from any disturbance, may differ by only about $0.02 \mathrm{~s}$. What does this imply for the accuracy of the standard cesium clock in measuring a time-interval of 1 s?

Solution:

Maximum difference between the two cesium clocks = 0.02 s

Time over which this difference accumulates = 100 years

$=100 \times 365 \times 24 \times 60 \times 60 \mathrm{~s}=3.15 \times 10^{9} \mathrm{~s}$

In $3.15 \times 10^{9} \mathrm{~s}$, the cesium clock shows a time difference of $0.02 \mathrm{~s}$.

In $1 \mathrm{~s}$, the clock will show a time difference of $\frac{0.02}{3.15 \times 10^{9}} \mathrm{~s} \approx 6.3 \times 10^{-12} \mathrm{~s}$.

Hence, the accuracy of a standard cesium clock in measuring a time interval of $1 \mathrm{~s}$ is 1 part in $\frac{3.15 \times 10^{9}}{0.02}=1.575 \times 10^{11} \approx 1.6 \times 10^{11}$, i.e., an error of roughly $10^{-11} \mathrm{~s}$ per second.
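As a quick sanity check on the arithmetic above, the steps of the solution can be reproduced in a few lines (the year is taken as 365 days, ignoring leap years, just as in the calculation above):

```python
# Verify the clock-accuracy arithmetic from the problem statement.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60     # 365-day year, as used in the solution

total_seconds = 100 * SECONDS_PER_YEAR    # seconds in 100 years, ~3.15e9 s
drift = 0.02                              # maximum clock difference, in seconds

error_per_second = drift / total_seconds  # time error accumulated each second
accuracy = total_seconds / drift          # clock is accurate to 1 part in this

print(f"100 years        = {total_seconds:.3e} s")
print(f"error per second = {error_per_second:.2e} s")
print(f"accuracy         = 1 part in {accuracy:.2e}")
```

Running this confirms the error is about $6.3 \times 10^{-12}$ s per second, i.e., an accuracy of roughly 1 part in $10^{11}$.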