Resolving problems in the infrared
To resolve an object, the wavelength of light must be considerably smaller than the object itself. We can’t resolve a baseball with radio waves. On the other hand, ultraviolet radiation ionizes biological tissue and causes cancer; it might have greater resolving power, but it resolves too much.
The visible spectrum covers wavelengths from about 390 to 750 nm. The atomic transitions most common in our environment emit light in this range, and it doesn’t ionize the tissues from which we’re made. The near-infrared extends up to about 2500 nm and is generated by molecular vibrations and rotations, which is to say: heat.
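The link between heat and infrared light can be made quantitative with Wien’s displacement law, which says the peak wavelength of an object’s thermal radiation is inversely proportional to its temperature. A quick sketch (the function name is mine):

```python
# Wien's displacement law: peak emission wavelength of a thermal
# (blackbody) radiator is inversely proportional to its temperature.
WIEN_B = 2.898e-3  # Wien's displacement constant, meter-kelvins

def peak_wavelength_nm(temp_kelvin):
    """Peak emission wavelength (nm) for a blackbody at temp_kelvin."""
    return WIEN_B / temp_kelvin * 1e9

# A human body at ~310 K peaks deep in the infrared (~9300 nm),
# while the Sun's ~5800 K surface peaks in the visible (~500 nm).
print(round(peak_wavelength_nm(310)))
print(round(peak_wavelength_nm(5800)))
```

Room-temperature objects peak even further into the infrared than the 2500 nm near-infrared band, which is why dedicated thermal cameras see what our eyes cannot.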
Both thermal and optical imaging operate on the same principle. Electromagnetic radiation is focused on an array of pixels. Each pixel is a CCD (charge-coupled device). Light hits the CCD, liberates electrons through the photo-electric effect (for which Einstein won the 1921 Nobel Prize in Physics), and the CCD amplifies the signal to light up that pixel with the appropriate color. The difference between infrared and optical imaging is in the medium that the light hits.
The photo-electric effect is pretty simple. A photon knocks an electron off of a cathode, and the electron is accelerated through some voltage to a detector. The energy of the electron at the detector increases linearly with the frequency of the incident light. There’s a catch, though; the photon energy has to be larger than the energy with which the electron is bound to the medium of the cathode. Einstein called that energy the work function because it’s the minimum amount of work that the photon has to do to pry the electron loose. The big difference between optical and thermal imaging is that energy. Infrared photons carry considerably less energy than optical photons, so thermal imaging CCDs must have cathodes with lightly bound electrons.
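Einstein’s relation is just photon energy minus work function. A minimal sketch, using cesium’s work function of roughly 2.1 eV as an example (the function name is mine):

```python
# Photoelectric effect: the ejected electron's kinetic energy is the
# photon energy minus the work function (Einstein's relation).
H = 4.1357e-15  # Planck's constant, eV*s
C = 2.998e8     # speed of light, m/s

def electron_energy_ev(wavelength_nm, work_function_ev):
    """Kinetic energy (eV) of a photoelectron, or None if the photon
    can't overcome the work function."""
    photon_ev = H * C / (wavelength_nm * 1e-9)
    excess = photon_ev - work_function_ev
    return excess if excess > 0 else None

# Green light (550 nm, ~2.25 eV) barely frees an electron from cesium
# (work function ~2.1 eV), leaving it ~0.15 eV of kinetic energy:
print(electron_energy_ev(550, 2.1))
# An infrared photon at 2500 nm carries only ~0.5 eV -- not enough:
print(electron_energy_ev(2500, 2.1))  # None
```

This is why infrared detectors need such lightly bound electrons: a 2500 nm photon carries less than a quarter of the energy of a green photon.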
One big difference between seeing with visible light and “seeing” with infrared light is the source. Step outside into the light of day and scattered sunlight reflects from objects into your eyes – we essentially see things through “light echoes.” The objects in an infrared image, on the other hand, generate their own radiation. In a thermal image, we actually see the very radiation created by whatever we’re looking at.
Objects cooling after a hot day or warm-blooded organisms wandering around in the dark radiate in the infrared, so it’s good for seeing at night – standard military and outdoors applications. It’s also used for diagnosing problems in engines both mechanical and biological.
Gemma Porter-Rawlings (http://veterinary-thermal-imaging.com/about-us/our-thermographers) is a veterinary thermographer in England who performs IR scans of horses. An infrared image is essentially a map of skin temperature. A natural consequence of injury is excessive blood flow to provide nutrients and energy for healing, so thermal images light up extra bright near injuries.
At Semicon West a couple of months ago Jason McGinnis showed me some thermal imaging equipment, but what really caught my eye was a one-pixel infrared detector, the Fluke 62 Mini Infrared Thermometer, which measures the temperature of distant objects. With a ten-to-one distance-to-spot ratio, it measures the average temperature over a spot whose diameter is one tenth of the distance. For example, if I wanted to measure my kid’s temperature, but don’t want to get close enough to be infected, I could stand ten inches away, point the single pixel at the kid’s forehead, and it will report the temperature averaged over a circle of forehead about an inch across.
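The distance-to-spot arithmetic is simple enough to sketch (the function name is mine; the 10:1 ratio is the Fluke 62’s spec):

```python
# Distance-to-spot ratio: a 10:1 ratio means the measured spot's
# diameter is one tenth of the distance to the target.
def spot_diameter(distance, ratio=10.0):
    """Diameter of the measured spot at a given distance (same units)."""
    return distance / ratio

# At 10 inches, the thermometer averages over a 1-inch-diameter spot;
# back up to 10 feet and the spot grows to a foot across.
print(spot_diameter(10))  # 1.0
```

The practical upshot: the farther away you stand, the bigger the patch whose temperature gets averaged, so a small hot spot can wash out at a distance.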
An aside on resolution: Resolution is the ability to distinguish two nearby point sources. The great 19th century physicist, Lord Rayleigh (the same guy who provided the physics of why the sky is blue), defined the angular limit of resolution: 1.22 λ / D, where D is the diameter of the receiver’s aperture (e.g., the diameter of your pupil), λ is the wavelength of the radiation, and 1.22 comes from the first zero of a Bessel function.
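Plugging numbers into Rayleigh’s formula shows why wavelength matters so much for resolution. A quick sketch for the human eye, assuming green light and a roughly 4 mm pupil:

```python
# Rayleigh criterion: smallest resolvable angle for a circular aperture.
def rayleigh_limit_rad(wavelength_m, aperture_m):
    """Angular resolution limit (radians): 1.22 * lambda / D."""
    return 1.22 * wavelength_m / aperture_m

# Human eye: ~550 nm green light through a ~4 mm pupil.
theta = rayleigh_limit_rad(550e-9, 4e-3)

# Smallest separation resolvable at 10 meters (small-angle approximation):
separation_mm = theta * 10 * 1000
print(round(separation_mm, 2))  # about 1.7 mm
```

Swap in a 2500 nm infrared wavelength and the limit gets roughly 4.5 times worse, which is the article’s opening point: longer wavelengths resolve less.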
[Image: Fluke 62 Max+ IR thermometer uses dual lasers]