Applied Physics

Infrared Detectors Lose Their Cool

Science 26 Jan 2001:
Vol. 291, Issue 5504, pp. 555-557
DOI: 10.1126/science.291.5504.555f

Infrared (IR) detectors have numerous applications, ranging from thermal imaging to surveillance. Current detectors are generally based on a reverse-biased junction in a narrow band gap semiconductor, which separates and detects the charge photogenerated by an absorbed IR photon. For high sensitivity, these materials must be kept at low temperatures, and thus refrigeration hardware is needed.

New materials that do not require cooling are being developed for IR detection. Ferroelectric thin films exhibit a dielectric constant that varies with temperature; under an applied electric field, a change in temperature therefore produces a net charge transfer proportional to that change. Fuflyigin et al. prepared high-quality, free-standing ferroelectric thin films (lead scandium tantalum oxide) by a sol-gel technique and obtained a sensitivity of 20 to 45 nanocoulombs per square centimeter per kelvin for films 200 to 300 nanometers thick. Moreover, the relatively low preparation temperatures (800° to 900°C) make these films strong candidates for sensitive, lower-cost thermal detectors. — ISO
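To put the reported sensitivity in perspective, the back-of-the-envelope sketch below (in Python) estimates the charge a single detector pixel would deliver. The 50-micrometer pixel size and 10-millikelvin temperature swing are illustrative assumptions, not values from the paper; only the 20 to 45 nC cm⁻² K⁻¹ sensitivity range comes from the work described above.

```python
# Illustrative estimate of per-pixel charge signal, Q = S * A * dT,
# for a ferroelectric thermal-detector pixel. Sensitivity range is
# from the summary; pixel area and temperature swing are assumed.

S_LOW, S_HIGH = 20e-9, 45e-9   # sensitivity, C cm^-2 K^-1 (reported range)
PIXEL_AREA = (50e-4) ** 2      # assumed 50-um-square pixel, in cm^2
DELTA_T = 0.01                 # assumed 10 mK temperature change at the pixel

for s in (S_LOW, S_HIGH):
    q = s * PIXEL_AREA * DELTA_T   # collected charge, in coulombs
    print(f"sensitivity {s * 1e9:.0f} nC/cm^2/K -> {q * 1e15:.1f} fC per pixel")
```

Under these assumptions the signal is on the order of 5 to 11 femtocoulombs per pixel, a level readily handled by standard readout electronics without cryogenic cooling.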

Appl. Phys. Lett. 78, 365 (2001).
