Self-driving car pushes sensor technology
The recent trials of the Google Self-driving Car, and the associated passage of regulations allowing such vehicles on the roads in Nevada, have created a new market for high-density sensor subsystems in the automotive sector. These vehicles rely on a growing number of sensors to perform the tasks of a human driver.
The Google test cars use a centrally mounted LIDAR system with front- and rear-mounted vision systems. These are combined with the RADAR and ultrasonic-sensor systems already fitted to contemporary vehicles to determine the car's full spatial position. (Figure 1)
Unfortunately, as is visible in the figure, these systems do not follow the aesthetic guidelines that have ruled the automobile industry since its inception. As a result, the sensors will need to be re-engineered into both smaller form factors and distributable modules. In a distributed system where real-time response is required, the network structure for managing the data exchange will be key.
Most of today's sensor systems have parallel data output modules. These systems generally have short data paths and are processed directly by a local MCU or CPU. For the self-driving vehicle, this distributed processing must also be centrally managed to accommodate decisions and conditions that involve more than one system. As a result, sensor data is shifting toward serial interfaces carried over the existing in-car wiring and networking protocols.
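To illustrate the parallel-to-serial shift, the sketch below packs one sensor reading into a compact serial frame sized to fit a classic CAN payload (8 bytes). The field layout, scaling, and function names are illustrative assumptions for this article, not any vendor's actual format.

```python
import struct

def pack_sensor_frame(sensor_id, distance_cm, speed_mms, status):
    """Pack one reading into a 6-byte big-endian frame (fits a CAN payload).

    Assumed layout: 1-byte sensor ID, 2-byte unsigned distance in cm,
    2-byte signed relative speed in mm/s, 1-byte status flags.
    """
    return struct.pack(">BHhB", sensor_id, distance_cm, speed_mms, status)

def unpack_sensor_frame(frame):
    """Recover the reading on the receiving (central) side."""
    sensor_id, distance_cm, speed_mms, status = struct.unpack(">BHhB", frame)
    return {"id": sensor_id, "distance_cm": distance_cm,
            "speed_mms": speed_mms, "status": status}
```

A frame built this way is 6 bytes, versus the dozens of parallel pins a raw sensor interface might otherwise consume.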
The challenges for these systems include congestion on the current wiring networks and the system noise generated both by the SERDES used for pin reduction and by multiple sensors sharing a single die or package. Vertical integration and stacked die have been successfully employed in MEMS devices to move from 3-axis to 9-axis positioning devices in a single package, and the new imaging electronics may have to follow the same path.
These MEMS devices, however, are single-location devices. Each driver-assistance system that uses them has a separate sensor in its own location. This approach works well for localized closed-loop control. For full autonomous control of the vehicle, these sensors, as well as the distributed imaging data, have to be centralized in a single "driving control" computer system.
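The centralization step described above can be sketched as an aggregator that keeps the latest reading from each distributed sensor and exposes only readings fresh enough for a real-time decision. The class name, timestamps, and 100 ms staleness threshold are assumptions chosen for this example.

```python
import time

STALE_AFTER_S = 0.1  # assumed: discard readings older than 100 ms

class SensorHub:
    """Sketch of a central 'driving control' aggregator for distributed sensors."""

    def __init__(self):
        self._latest = {}  # sensor name -> (timestamp, reading)

    def update(self, name, reading, now=None):
        # Each distributed sensor pushes its latest closed-loop output here.
        self._latest[name] = (now if now is not None else time.monotonic(),
                              reading)

    def snapshot(self, now=None):
        # Return only readings fresh enough to base a driving decision on.
        now = now if now is not None else time.monotonic()
        return {name: reading
                for name, (ts, reading) in self._latest.items()
                if now - ts <= STALE_AFTER_S}
```

Dropping stale readings, rather than blocking on slow sensors, is one plausible way to keep the central loop running at a fixed real-time rate.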
This high-speed imaging data is both larger in volume than the infotainment A/V in the car and necessarily of higher priority, yet it may share the same network. Unfortunately, these driver-assistance and control functions are "assumed to be in place" by consumers and are generally not the criteria for purchasing the vehicle. The in-cabin comfort systems therefore carry equal or higher priority for the buyer. This is driving the challenge of sensor integration, software testing and validation, and a cost model in which these components become a "standard feature" rather than an "option" for the buyer.
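Sharing one network between safety-critical imaging traffic and infotainment implies some form of priority arbitration. The sketch below shows one simple scheme, a strict priority queue with FIFO ordering within each level; the priority values and class name are assumptions for illustration, not a reference to any in-car networking standard.

```python
import heapq
import itertools

# Assumed priority levels: lower number transmits first.
PRIO_CONTROL = 0        # driving-control commands
PRIO_IMAGING = 1        # sensor/imaging data
PRIO_INFOTAINMENT = 2   # cabin A/V streams

class BusScheduler:
    """Sketch of strict-priority arbitration on a shared in-car network."""

    def __init__(self):
        self._queue = []
        self._seq = itertools.count()  # FIFO tie-break within one priority

    def submit(self, priority, frame):
        heapq.heappush(self._queue, (priority, next(self._seq), frame))

    def next_frame(self):
        # The highest-priority pending frame is transmitted first;
        # infotainment only goes out when no critical traffic is waiting.
        return heapq.heappop(self._queue)[2] if self._queue else None
```

Under this scheme, a burst of infotainment traffic can never delay a queued imaging or control frame, which is the property the shared network would need.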