CCD and CMOS sensors become more finely tuned
Many of the characteristics that engineers look for in machine-vision cameras are determined by the image sensor at the heart of that camera. High-quality images, high resolution, high frame rates, greater sensitivity to light, and the ability to capture moving images without distortion are all dependent on CCD or CMOS sensor chips.
Typically, the main goal in machine vision is not to acquire the perfect picture but to extract the right information for the application in which the imaging system is used, said Danny Scheffer, director of business development for custom imaging sensor products for Cypress Semiconductor, a maker of CMOS sensors. Examples of that information are the presence or absence of components on a board, the location and orientation of a component in an assembly, the type of component, and the information stored in labels. “The application sets the characteristics of the sensor to be used, including variables such as speed requirement, device size, light source, and whether the light environment is controlled,” he said. “All of these in combination determine [what you need to look for in] pixel size, resolution, frame rate, and shutter type.”
The 1.3-Mpixel LUPA 1300 CMOS sensor has 12 high-speed outputs, windowing capability, and a fully synchronous snapshot shutter that makes it possible to capture images of moving objects without distortion and to read one image while the next one is being acquired. Courtesy of Cypress Semiconductor.
A quick comparison of CCD and CMOS sensors would find that CCD sensors, which can offer resolutions as high as 22 Mpixels, provide higher image quality at a correspondingly higher price. In contrast, the lower-cost CMOS sensors, with their lower resolutions of 1 to 4 Mpixels, capture images more quickly, which is desirable on a high-speed production line.
In applications where the highest image quality is needed, such as wafer and mask inspection, customers are willing to pay the higher price of CCD sensors. CCD technology produces higher-quality images because it was developed specifically for imaging, said Michael DeLuca, marketing manager for Eastman Kodak’s Image Sensor Solutions group, a maker of CCD sensors. “In order to make CMOS capable of imaging, it had to be adapted to make it sensitive to light,” he said.
While the camera’s frame rate is important in maintaining productivity as objects move down an assembly line, so is the time it takes to process an image. “Cameras used in these applications tend to include only limited post-processing circuitry, since you want to analyze the image as quickly as possible, rather than spend time to fix the image first,” DeLuca said. “As a result, the CCD sensors we sell to this market tend to be of very high quality. This is one reason that CCD technology continues to have a good presence in these markets: It can provide very high-quality images directly from the sensor.”
The electronic shutter that is inherent in progressive-scan, interline-transfer CCDs is also important in inspection applications because it eliminates the need for a moving part—a mechanical shutter—by providing clean image capture of moving objects, said DeLuca. “This is a real advantage of CCD image sensors, as CMOS devices are susceptible to artifacts such as image skew when using a rolling shutter.”
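The skew artifact DeLuca describes can be seen in a toy model: with a rolling shutter, each row is exposed at a slightly later time, so a moving object lands in a different column in each row, while a global (snapshot) shutter samples every row at the same instant. Everything below is an illustrative sketch, not any vendor's readout model.

```python
# Toy model of rolling- vs. global-shutter capture of a moving vertical bar.
# Time is measured in row-readout intervals; numbers are made up for illustration.

def bar_column(t, x0=10, speed=2):
    """Horizontal position of a bar moving right at `speed` pixels per row interval."""
    return x0 + speed * t

def capture(rows, shutter="global"):
    """Return the bar's column as seen in each row of one frame."""
    if shutter == "global":
        return [bar_column(0) for _ in range(rows)]   # all rows sampled at once
    # rolling shutter: row r is exposed r row-intervals later than row 0
    return [bar_column(r) for r in range(rows)]

global_img = capture(8, "global")    # straight bar: same column in every row
rolling_img = capture(8, "rolling")  # skewed bar: column drifts row by row
print(global_img)
print(rolling_img)
```

The global-shutter frame shows the bar upright; the rolling-shutter frame shows it sheared, which is exactly the distortion an inspection algorithm would have to correct for.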
Nevertheless, CMOS chips have made great strides in electronics and semiconductor inspection because they can deliver higher imaging speeds at a lower cost in applications such as laser profiling on flip-chip and BGA (ball-grid array) packages, said David Cochrane, director of product management and marketing for Dalsa, which makes both types of sensors. Cochrane pointed out, though, that although CMOS delivers a lower cost of implementation than CCD, there are fewer CMOS models to choose from because they are newer in the industry.
Joost Seijnaeve, Cypress Semiconductor’s marketing director for standard imaging products, concurred with Cochrane’s assessment of the market. “The data quality of CMOS has improved,” he said, “and combined with its integration ability, CMOS sensor technology is taking market share away from CCD technology.” Seijnaeve explained that because CCD sensors can’t remove the charge as fast as is needed in high-speed imaging, phenomena like ghosting can occur. In addition, CCD sensors also have much higher power-dissipation levels than CMOS devices.
Cliff Drowley, VP of Cypress Semiconductor’s imaging business unit, explained that the historical advantage of CMOS has been the fact that all of the timing and analog signal chain circuitry can be integrated onto the same chip as the sensor array, simplifying the bill of materials and minimizing camera size. “CMOS frame rates are also faster, since you can do pipelined shutter schemes,” he said. Because of the nature of clocking in CCDs, it’s extremely difficult to use them for implementing high-speed cameras, and CCDs are not as good as CMOS sensors at high-speed capture or at moving data off the device at high speed.
In order to get high frame rates in CMOS sensors, many parallel outputs are required, from 64 to 128 or more, just to get the data off-chip. “The pixel types for high-speed global, or snapshot, shutters are very specialized,” he said. For very high-speed applications of several thousand frames per second, sensors also incorporate a pipelined global shutter. This consists of on-pixel storage circuitry, so that one frame can be captured while the previous frame is read out.
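The need for dozens of parallel outputs follows from simple arithmetic: total pixel rate is resolution times frame rate, and dividing by what one output channel can carry gives the channel count. The 50-Mpixel/s per-channel figure below is an illustrative assumption, not a Cypress specification.

```python
# Back-of-envelope sizing of parallel outputs on a high-speed CMOS sensor.
# All figures are illustrative assumptions.

pixels = 1.3e6            # 1.3-Mpixel sensor
fps = 1000                # "several thousand frames per second" class application
per_channel_rate = 50e6   # assumed pixels/s one output channel can sustain

total_rate = pixels * fps                        # pixels/s to move off-chip
channels = -(-total_rate // per_channel_rate)    # ceiling division

print(f"{total_rate / 1e9:.2f} Gpixel/s requires {int(channels)} parallel outputs")
```

At these assumed rates the sensor already needs 26 outputs; pushing toward several thousand frames per second is what drives counts of 64, 128, or more.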
“We also aim at optimizing the sensor’s modulation transfer function, which is a measure of the sharpness of the image detected on the sensor, for machine vision and particularly for high-speed designs,” said Drowley. “To expand the dynamic range of CMOS sensors so they can adapt to different light levels, we can program a nonlinear response curve on the sensor.”
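One common way to program a nonlinear response of the kind Drowley mentions is a "knee" curve: linear up to a breakpoint, then compressed above it, so bright scenes still fit in the output range. The sketch below is a generic illustration with made-up breakpoints, not Cypress's actual response programming.

```python
# Hypothetical piecewise-linear "knee" response for extending dynamic range.
# knee, slope_above, and full_scale are illustrative values.

def knee_response(light, knee=0.5, slope_above=0.25):
    """Map incident light (full scale = 1.0) to sensor output with highlight compression."""
    if light <= knee:
        return light                              # linear region
    return knee + (light - knee) * slope_above    # compressed highlight region

print(knee_response(0.4))  # 0.4: within the linear region
print(knee_response(2.0))  # 0.875: 2x full-scale light still fits below 1.0
```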
Drowley said that CMOS is also closing the gap in another area where CCD sensors have had a major advantage: noise level. Until the late 1990s, CCD had the advantage over CMOS in the way the charge was read out. CCD sensors could eliminate a lot of temporal noise by using correlated double sampling, he said. “In 1998, the first commercial CMOS imagers with true correlated double sampling for noise appeared. In many applications today, noise performance is equivalent in CMOS and CCD technology.”
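The idea behind correlated double sampling can be shown in miniature: each readout takes two samples, the pixel's reset level and its signal level, and reports the difference, so any noise common to both samples cancels. Real CDS happens in analog circuitry and only removes the correlated component (reset noise and fixed offsets); this toy model ignores the uncorrelated noise that remains.

```python
# Toy model of correlated double sampling (CDS). Values are illustrative.
import random

random.seed(0)

def read_pixel_cds(true_signal):
    # Reset (kTC) noise plus fixed offset: identical in both samples of one read.
    offset = random.gauss(0.0, 5.0)
    reset_sample = offset                   # sample 1: just after pixel reset
    signal_sample = offset + true_signal    # sample 2: after exposure
    return signal_sample - reset_sample     # the correlated part cancels

print(read_pixel_cds(100.0))  # ~100.0: the noisy offset has dropped out
```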
Trends in image sensors
Customers want electronic shutters and the speed of CMOS, but they also want high image quality, lower read noise, and higher sensitivity for the more demanding applications, said Dalsa’s Cochrane. Some customers are demanding larger pixels to increase a sensor’s sensitivity, because as inspection speed increases, light, and therefore the signal, becomes more difficult to capture, he said. “The recent trend to smaller pixel sizes in consumer imaging does not fit well in the industrial segment, since many of these devices have higher pixel defect rates and are only suited to the more basic inspection tasks. Electronics and semiconductor inspection demands the highest image quality, and some applications require perfect imagers.”
Although big pixels can store more charge, increase the dynamic range, and lower the noise floor, small pixels may be needed to get enough of them in a particular optical format, said DeLuca. “Many of the qualities needed in an image sensor can’t be optimized in the same directions,” he said. “For example, high resolution requires high pixel counts, but it then takes additional time to read out those extra pixels, lowering frame rate.” DeLuca said that in Kodak’s newest family of interline CCD image sensors, pixel area size has been reduced by about 50% from the previous generation, while performance has been maintained and even improved in some areas, such as frame rate and image smear.
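DeLuca's resolution/frame-rate tradeoff is easy to put in numbers: for a fixed readout rate, the achievable frame rate falls in inverse proportion to pixel count. The 40-Mpixel/s readout rate below is an illustrative assumption, not a Kodak figure.

```python
# Frame rate vs. resolution at a fixed off-chip readout rate (assumed figure).

readout_rate = 40e6  # pixels/s the sensor can move off-chip

def max_fps(pixels):
    """Upper bound on frame rate when readout is the bottleneck."""
    return readout_rate / pixels

print(max_fps(1.3e6))  # ~30.8 fps at 1.3 Mpixels
print(max_fps(4.0e6))  # 10.0 fps at 4 Mpixels with the same readout circuitry
```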
Today, the demand is for machine-vision sensors with 90 to 100 fps at about 1.3 Mpixels, and in the near future, demand will be for 100 to 200 fps and 2 Mpixels, said Seijnaeve of Cypress. “The next generation of machine-vision sensors will require frame rates above 100 fps, resolutions of 3 Mpixels or higher, standard optical formats of 1/2 in. to 2/3 in., low-light sensitivity, high dynamic range, global shutters, and windowing.”
Sensor improvements that Cypress is working on now include enhancing image quality, increasing optical dynamic range, and increasing functionality and flexibility, Scheffer said. The company is also integrating post-processing functions, such as ADCs and gain amplifiers, timing generators, and everything else that’s needed for easy integration of the sensor into the end system.
“The next step will be more configurability, or programming, on the sensor,” said Scheffer. “In the future, we expect to integrate color construction and pixel-correction algorithms. CMOS image sensor features are moving from image improvement toward the integration of more image-processing and post-processing features.”
Resolutions greater than 10 Mpixels are the latest trend in CCD and CMOS area devices, said Cochrane. “Linescan CCDs still rule and have reached 12k and even 16k pixels in custom models,” he said. But in the near future, standard image capture technology will exceed 1 Gpixel/s, which is faster than the data transfer rates of current frame grabbers and PC interfaces. “As sensors become this fast, you won’t be able to send data to the PC at the rate it’s being acquired,” Cochrane explained.
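The bottleneck Cochrane points to is straightforward to size: 1 Gpixel/s of 8-bit pixels is 1 GB/s of raw data, which is more than period camera-to-PC links can carry. The link figures below are rough, illustrative numbers, not exact specifications.

```python
# Comparing an assumed 1 Gpixel/s sensor against rough link-bandwidth figures.

sensor_Bps = 1e9  # 1 Gpixel/s at an assumed 1 byte (8 bits) per pixel

links_Bps = {
    "Camera Link (full configuration)": 680e6,  # rough figure
    "Gigabit Ethernet": 125e6,                  # rough payload ceiling
}

for name, link_rate in links_Bps.items():
    status = "saturated" if sensor_Bps > link_rate else "ok"
    print(f"{name}: {status}")
```

Both links come up short, which is why such a sensor cannot stream full frames to the PC at acquisition rate.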
Because in machine vision there’s usually a region of interest and certain specific information users need to get from the image, Cypress sees a trend toward sensors that incorporate image-processing algorithms that let you extract only the data you want, said Drowley. “This processing capability entails integrating a lot more digital circuitry on the sensor itself, which CCD sensors can’t do, but which CMOS technology does well. Right now, this is a relatively small piece of the machine-vision space, but it looks like something that will grow.”
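The region-of-interest readout Drowley describes amounts to transferring only a window of the pixel array, cutting data volume in proportion to the window's area. A nested list stands in for the pixel array in this minimal sketch; the function name and frame size are illustrative.

```python
# Windowing/ROI readout in miniature: only the rows and columns of interest
# leave the "sensor," reducing the data to transfer proportionally.

frame = [[(r, c) for c in range(16)] for r in range(12)]  # 12x16 toy frame

def window(img, top, left, height, width):
    """Read out only a rectangular region of interest."""
    return [row[left:left + width] for row in img[top:top + height]]

roi = window(frame, top=4, left=6, height=3, width=5)
full_pixels = 12 * 16
roi_pixels = len(roi) * len(roi[0])
print(roi_pixels, "of", full_pixels, "pixels transferred")  # 15 of 192
```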