
Automobile sensors may usher in self-driving cars

May 26, 2011

Google last year demonstrated the results of its research-and-development efforts to create an autonomous vehicle. The small fleet of specially equipped cars—six Toyota Priuses and one Audi TT—has logged more than 140,000 miles of daytime and nighttime driving in California, including traversing San Francisco’s famously crooked Lombard Street and the Los Angeles freeways (Figure 1). In all cases, an engineer was in the driver’s seat, monitoring each car’s performance and ready to take over if necessary.

A robocar of the future would be intelligent enough that its driver could read, play, or work rather than piloting the car. The benefits would include improved safety, time freed for other tasks or recreation, and more effective use of the traffic infrastructure through more efficient traffic flow and better fuel efficiency.

Motor-vehicle accidents are the leading cause of death of 13- to 29-year-olds in the United States. According to Sebastian Thrun, an engineer at Google and the director of the Stanford Artificial Intelligence Laboratory, which created the Google robocar, almost all of these accidents are the result of human error rather than machine error, and he believes that machines can prevent some of these accidents.

Figure 1

“We could change the capacity of highways by a factor of two or three if we didn’t rely on human precision for staying in the lane and [instead] depended on robotic precision,” says Thrun. “[We could] thereby drive a little bit closer together in a little bit narrower lanes and do away with all traffic jams on highways.”

Increasing highway capacity by a factor of two or three with no added infrastructure costs and freeing an hour or two a day for productive or relaxing pursuits seem like worthy goals, but how close is the auto industry to achieving a practical self-driving car? Google is not in the car-production business and has no business plan for monetizing its research (Reference 1). In Google’s approach, autonomous vehicles will not require a government mandate to become reality. The Google fleet uses LIDAR (light-detection-and-ranging) technology, such as Velodyne’s HDL-64D (high-definition-LIDAR) laser-sensor system, which uses 64 spinning lasers and gathers 1.3 million points/sec to create a virtual model of its surroundings. One reason to use LIDAR rather than radar is that the laser’s higher-energy, shorter-wavelength light reflects better from nonmetallic surfaces, such as humans and wooden power poles. Google combines the LIDAR system with vision cameras and algorithmic vision-processing systems to construct and react to a 3-D view of the world through which the car is driving (Reference 2).
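As an illustration of how such a spinning-laser sensor builds its model, the minimal sketch below converts a single laser return, reported as a range along a known azimuth and elevation, into a Cartesian point; accumulating these returns produces the point cloud the vision software works from. The angle conventions and sample numbers are illustrative only, not Velodyne's actual data format.

```python
import math

def lidar_return_to_xyz(range_m, azimuth_deg, elevation_deg):
    """Convert one laser return (range plus beam angles) to a Cartesian point."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.sin(az)  # right of the sensor
    y = range_m * math.cos(el) * math.cos(az)  # ahead of the sensor
    z = range_m * math.sin(el)                 # above the sensor
    return (x, y, z)

# Roughly 1.3 million such points per second, fused with camera data,
# become the 3-D model of the car's surroundings.
point_cloud = [lidar_return_to_xyz(12.4, 37.5, -8.0),
               lidar_return_to_xyz(11.9, 37.7, -8.0)]
```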

The sensor hardware in the vehicles enables the cars to see everything around them and make decisions about every aspect of driving, according to Thrun. Although a fully autonomous vehicle is not yet close, the necessary technology, including the sensor platform of radar, ultrasonic sensors, and cameras, is available in today’s intelligent vehicle. It remains only to standardize the car’s hardware platform and develop the software. Cars are approaching the point that smartphone platforms had reached just before the introduction of the Apple iPhone and the Motorola Droid.

As sensors decrease in price and increase in integration, they will become ubiquitous in all cars. Once users accept them as normal parts of a car, automotive-OEM companies can integrate more intelligence into them until they achieve the goal of an autonomous car. Today’s intelligent automobile can perform many driver-assistance tasks, such as avoiding accidents and reducing their severity. To perform these tasks, the vehicles have passive safety systems, such as air bags and seat belts; active safety systems, such as electronic stability control, adaptive suspension, and yaw and roll control; and driver-assistance systems, including adaptive cruise control, blind-spot detection, lane-departure warning, drowsy-driver alert, and parking assistance. These systems require many of the same sensors that the autonomous car requires: ultrasonic sensors, radar, LIDAR systems, and vision-imaging cameras.

Cars now use ultrasonic sensors to provide proximity detection for low-speed events, such as parallel parking and low-speed collision avoidance. Ultrasonic detection works only at low speeds because acoustic waves travel slowly relative to a moving car; once the car is moving faster than a person can walk, the echoes return too late to be useful, and the ultrasonic sensor is effectively blind.
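The arithmetic behind ultrasonic ranging is simple time of flight, as in this minimal sketch. The 343-m/sec figure assumes air at roughly 20°C, and the echo time is a made-up example.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 degrees C

def echo_to_distance_m(echo_time_s):
    """The pulse travels to the obstacle and back, so the one-way
    distance is half the round-trip path."""
    return SPEED_OF_SOUND_M_PER_S * echo_time_s / 2.0

# A 6-msec echo puts the obstacle about 1m away -- fine for parallel
# parking, but far too slow a sensing loop at highway speeds.
print(echo_to_distance_m(0.006))  # ~1.03 m
```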

Although ultrasonic-sensor technology is more mature and less expensive than radar, car designers who care about the aesthetics of the car’s appearance are reluctant to have too many sensor apertures visible on the car’s exterior. As a more powerful and more flexible technology, radar should begin to replace ultrasonic sensors in future designs (Figure 2).

Figure 2

Radar works in any type of weather and has short-, medium-, and long-range capabilities. For example, adaptive cruise control works at long range, looking 200m in front of the car, tracking the vehicle ahead, and accelerating or braking to maintain a set distance. Radar also provides blind-spot detection and lane-departure warning. Early versions of these systems only audibly warned the driver of an impending problem, but some implementations now take control of the car to avoid the problem. For example, the 2011 Infiniti M56 has an optional blind-spot-warning/intervention system that relies on radar scans of the left and right rear quadrants of the car. If the radar system detects a car in the driver’s blind spot, a warning light comes on. If the driver then activates the turn signal, an audible warning sounds. If the driver persists and starts to move into the other lane, the car gently applies the brakes on the opposite side, moving the car back into the center of its lane (Reference 3).
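The escalation the Infiniti system follows (warning light, then audible alert, then gentle one-sided braking) amounts to a few lines of decision logic. The sketch below is a hypothetical restatement of that sequence, not Infiniti's implementation.

```python
def blind_spot_intervention(car_in_blind_spot, turn_signal_on, drifting_into_lane):
    """Escalating response: warn first, intervene only if the driver
    keeps moving toward the occupied lane."""
    actions = []
    if car_in_blind_spot:
        actions.append("illuminate blind-spot warning light")
        if turn_signal_on:
            actions.append("sound audible warning")
            if drifting_into_lane:
                # Light braking on the opposite-side wheels yaws the car
                # back toward the center of its own lane.
                actions.append("apply gentle braking on opposite-side wheels")
    return actions

print(blind_spot_intervention(True, True, True))
```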

Most automotive radar systems currently are not highly integrated; they take up significant space and are costly. Analog Devices’ recently introduced AD8283 integrated automotive-radar-receiver analog front end represents the increasing integration that decreases the size and cost of automotive radar (Figure 3). It will sell for about 50% less than a discrete automotive analog-front-end design and fits into a 10×10-mm package. “The market is moving toward putting radar into nonluxury vehicles—cars for the rest of us,” says Sam Weinstein, product manager for the Precision Linear Group at Analog Devices. The sample price for the six-channel AD8283 is $12.44 in 1000-unit quantities.

Figure 3

IR (infrared) LEDs and photosensors find use in automotive applications, such as rain sensing and wiper activation on the BMW 7 series and the Ford Edge. Sophisticated IR cameras enable safety applications, such as drowsy-driver sensing, an option on the Mercedes E550 sedan. Drowsy-driver sensing uses an IR camera to watch the driver’s eyelids to tell whether they are blinking rapidly, indicating that the driver is alert, or blinking slowly or even closing. If the system detects drowsiness, the car emits an audible warning or vibrates the driver’s seat.
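At its core, a drowsy-driver monitor classifies eyelid-closure durations. The sketch below is a deliberately crude version with invented thresholds, not any carmaker's algorithm.

```python
def classify_driver(blink_durations_s, long_closure_s=0.4, drowsy_fraction=0.5):
    """Rapid blinks (short closures) suggest alertness; long or frequent
    closures suggest drowsiness. Thresholds are illustrative only."""
    if not blink_durations_s:
        return "no data"
    long_closures = sum(1 for d in blink_durations_s if d > long_closure_s)
    if long_closures / len(blink_durations_s) >= drowsy_fraction:
        return "drowsy: sound warning or vibrate seat"
    return "alert"

print(classify_driver([0.15, 0.12, 0.60, 0.70, 0.90]))  # drowsy
```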

Out-of-position sensing similarly uses IR cameras. Today’s passenger seats must have pressure sensors to determine the weight of the passenger, and the air bags deploy at different speeds depending on that weight. A pressure sensor does not know, however, whether the passenger is leaning on the dashboard, reclining in the seat, or shifted to the left or the right, and the closer the passenger is to the deploying air bag, the greater the impact. The camera monitors the passenger’s position so that, in a crash, the air bag deploys appropriately for the passenger’s size and position.
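Combining the seat's weight reading with the camera's position estimate yields a deployment decision along these lines; the weight and distance thresholds here are invented for illustration and do not come from any production air-bag controller.

```python
def airbag_mode(passenger_weight_kg, distance_to_dash_m):
    """Pick a deployment mode from occupant weight and camera-estimated
    distance to the dashboard. All thresholds are hypothetical."""
    if passenger_weight_kg < 20:        # small child or empty seat
        return "suppress deployment"
    if distance_to_dash_m < 0.15:       # occupant leaning on the dash
        return "low-force deployment"
    if passenger_weight_kg < 50:
        return "reduced-force deployment"
    return "full-force deployment"

print(airbag_mode(75, 0.40))  # full-force deployment
```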

These cameras use IR LEDs rather than LEDs in the visible spectrum because they must be able to work at night, and it would be distracting to illuminate the driver or the passenger with visible light for the camera to sense. The human eye detects light at wavelengths as long as approximately 700 nm, whereas the IR cameras operate at wavelengths of 850 to 900 nm.

IR imaging also has a place outside the car for crash avoidance, and these applications require IR illumination. According to Sevugan Nagappan, marketing manager of the infrared business unit at Osram Opto Semiconductors, IR cameras can help in collision avoidance by seeing beyond what the high beams illuminate. “IR-LED illumination allows you to see when you can’t have your high beams on to see past your headlamps, for example, allowing the system to see beyond the headlights to see and avoid a deer entering the road,” he says.

Figure 4

IR LEDs’ primary use has so far been in remote controls, and these inexpensive LEDs use 10 mW or less of power. Automotive applications, in contrast, require output power of greater than 1W to illuminate the subject. In addition, the IR LED must be small enough to fit next to the IR camera and be inconspicuous. Nagappan estimates that the camera needs to measure less than 10 mm², and the illuminating IR LED requires 5 mm². He says that manufacturers can now make LEDs in small packages that provide 3.5W, and these devices are enabling new applications. Osram’s 3.5W SFH 4236 IR LED has an integral lens with a narrow beam angle to focus the IR light, increase the beam intensity, and concentrate the beam into the eye box to watch the driver’s eyes.
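The point of the integral lens is to concentrate a fixed optical output into a smaller solid angle. The estimate below uses the standard solid-angle formula for a cone; the 3.5W figure comes from the article, but both beam half-angles are assumed for illustration.

```python
import math

def radiant_intensity_w_per_sr(optical_power_w, half_angle_deg):
    """Average intensity of an emitter whose output fills a cone:
    power divided by the cone's solid angle, omega = 2*pi*(1 - cos(theta))."""
    omega_sr = 2.0 * math.pi * (1.0 - math.cos(math.radians(half_angle_deg)))
    return optical_power_w / omega_sr

# Narrowing an assumed 60-degree half-angle beam to 10 degrees concentrates
# the same 3.5W of output by more than a factor of 30.
print(radiant_intensity_w_per_sr(3.5, 60))  # ~1.1 W/sr
print(radiant_intensity_w_per_sr(3.5, 10))  # ~37 W/sr
```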

Innovation is also driving down the cost of the cameras. The Fraunhofer Institute expects to bring to market a camera as small as a grain of salt and costing only a few euros. The resolution currently is 250×250 pixels. These cameras could replace side-view mirrors, reducing airflow drag (Figure 4).

You can reach Technical Editor Margery Conner at 1-805-461-8242 and margery.conner@ubm.com.



References
  1. Markoff, John, “Smarter Than You Think: Google Cars Drive Themselves, in Traffic,” The New York Times, Oct 9, 2010.
  2. Schwarz, Brent, “LIDAR: Mapping the world in 3D,” Nature Photonics, July 2010, pg 429.
  3. “Blind Spot Warning (BSW) System/Blind Spot Intervention (BSI) System,” YouTube.

For More Information
    
Analog Devices
Apple Computer
BMW                 
Ford       
Fraunhofer Institute
Google
Infiniti
Mercedes-Benz 
Motorola 
Osram Opto Semiconductors
Stanford Artificial Intelligence Laboratory
Toyota
Velodyne
  


Editor's note: The original version of this article contained an error, which has been corrected in Figure 3 above and in the associated PDF file. "AFE: Analog front end" was changed to "AAF: Anti-aliasing filter" on May 26, 2011.
