PHYSICS: Atomic-Scale Length Standard

Science  11 Aug 2000:
Vol. 289, Issue 5481, pp. 833a
DOI: 10.1126/science.289.5481.833a

The unit of length has come a long way since the cubit (the distance from fingertip to elbow) was used in the ancient world. Naturally, this was not very useful for accurate and reproducible measurement, and subsequent standardization of length resulted in the meter. Demands for increased accuracy have since redefined the meter as the distance traveled by light in 1/299,792,458 of a second, realized in practice with stabilized helium-neon lasers; but for atomic-scale distances, even this measure is too coarse. At present, the most accurate atomic-scale length standard is based on the lattice constant of ultrapure crystalline silicon held under strict temperature and pressure conditions, a requirement that makes reproducing results across different laboratories difficult.

Shvyd'ko et al. have refined a technique that is relatively insensitive to ambient temperature and pressure. By calibrating the γ-rays emitted by excited iron-57 nuclei against nearly exact Bragg backscattering from a reference silicon crystal, they determined the γ-ray wavelength to be 0.086025474 nanometers with an accuracy of better than 0.19 parts per million. — ISO
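
As a rough consistency check (not taken from the paper itself, and assuming the well-known 14.4 keV energy of the iron-57 Mössbauer transition), the reported wavelength follows from the photon energy-wavelength relation, while near-exact backscattering ties that wavelength directly to the interplanar spacing of the reference reflection:

\[
\lambda = \frac{hc}{E} \approx \frac{1239.84\ \text{eV nm}}{14412.5\ \text{eV}} \approx 0.08603\ \text{nm},
\qquad
\lambda = 2 d_{hkl}\sin\theta_{B} \;\approx\; 2 d_{hkl} \quad (\theta_{B} \to 90^{\circ}).
\]

Because the Bragg angle is driven to nearly 90 degrees, the measured wavelength is essentially twice the interplanar spacing of the silicon reflection, which is what anchors the γ-ray wavelength to the silicon lattice constant.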

Phys. Rev. Lett. 85, 495 (2000).
