How Automated Vehicles See What's Around Them
Driver-assistance features such as adaptive cruise control, automated emergency braking, lane centering, and blind-spot monitoring—which are the building blocks of hands-free driving systems—require a vehicle to be able to "see" its surroundings. This artificial vision is accomplished using a variety of sensor types, and the implementation varies among automakers. In this primer, we explain how each category of sensors works and what it's used for. You may never look at your car looking at you the same way again.
Radar
Radio detection and ranging, better known as radar, uses electromagnetic waves to locate objects and determine their speed. A radar sensor is both a transmitter and a receiver. The time it takes a wave to reach an object and bounce back gives the object's distance, while the Doppler shift in the wave's frequency during its journey gives the object's speed. Unlike the rotating radar transmitter/receivers used in air traffic control, for instance, automotive radar units are fixed. They're usually found in a vehicle's front bumper, serving adaptive cruise control and automated emergency braking, or behind a rear quarter-panel, where they detect other vehicles for blind-spot monitoring. (They can also fake out less sophisticated radar detectors.) Radar can even be tuned for short- or long-range detection.
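To make the arithmetic concrete, here's a toy sketch in Python, not anything resembling production radar firmware (which typically relies on techniques such as frequency modulation). It shows the two calculations described above: halve the round trip for distance, and scale the Doppler shift for speed. The 77-GHz figure reflects a common automotive radar band; the timing numbers are invented.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def radar_distance(round_trip_seconds: float) -> float:
    """Distance to a target: the wave travels out and back, so halve it."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

def radar_speed(transmit_hz: float, doppler_shift_hz: float) -> float:
    """Relative speed from the two-way Doppler shift (positive = closing)."""
    return doppler_shift_hz * SPEED_OF_LIGHT / (2 * transmit_hz)

# A return after roughly 0.67 microseconds puts the car ahead ~100 m away;
# a 10.26-kHz shift on a 77-GHz carrier means it's closing at ~20 m/s.
print(radar_distance(667e-9))     # ≈ 100 meters
print(radar_speed(77e9, 10_264))  # ≈ 20 m/s
```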
Sonar
Sonic navigation and ranging technology, also called ultrasound or ultrasonic sensing, has been used in parking sensors for decades. Sonar sensors appear on cars as little circles in the bumpers or fenders. This is another transmitter/receiver device: it emits sound waves that bounce back when they hit something, and the waves' time of flight, together with the speed of sound, gives the object's distance. The pinging systems in all of those submarine movies are active sonar; passive sonar just listens. Automotive applications are active, sending out their own pulses.
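The arithmetic is the same as radar's, just with a far slower wave. A minimal sketch, assuming sound travels at about 343 m/s (dry air near room temperature; real sensors compensate for conditions):

```python
SPEED_OF_SOUND = 343.0  # meters per second in ~20°C air

def sonar_distance(round_trip_seconds: float) -> float:
    """Distance to an obstacle: the pulse travels out and back, so halve it."""
    return SPEED_OF_SOUND * round_trip_seconds / 2

# An echo arriving 5.8 milliseconds after the ping means the obstacle
# is about one meter from the bumper.
print(sonar_distance(0.0058))  # ≈ 0.99 meters
```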
Sonar's strength is short-range detection. By itself, a single sonar sensor can tell you only that something is in its field of view and how far away it is; it reveals nothing about the object's shape or speed. Placing a few of them along a vehicle's front or rear end gives more detail about what you're about to drive or back into, hopefully before it's too late. Side-facing sensors can detect open spaces for automated parking systems.
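How do range-only sensors add up to a position? If two sensors a known distance apart both see the same obstacle, intersecting their range circles pins down where it sits, as in this simplified sketch (the spacing and readings are invented):

```python
import math

def locate(spacing_m: float, range_left_m: float, range_right_m: float):
    """Estimate an obstacle's position from two range readings.

    The left sensor sits at (0, 0) and the right at (spacing_m, 0); x runs
    across the bumper and y points out in front of it. Returns None if the
    two range circles don't intersect (inconsistent readings).
    """
    d, r1, r2 = spacing_m, range_left_m, range_right_m
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y_squared = r1**2 - x**2
    if y_squared < 0:
        return None
    return x, math.sqrt(y_squared)  # take the solution in front of the bumper

# An obstacle 1 m ahead, centered between sensors mounted 0.4 m apart:
r = math.hypot(0.2, 1.0)
print(locate(0.4, r, r))  # ≈ (0.2, 1.0)
```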
Ultrasonic sensors have found their way into vehicle interiors recently. Gesture-control systems—the gimmicky features that let you adjust the infotainment system's volume, for instance, by twirling your finger in the air—use ultrasonics to sense inputs. We wouldn't classify this as a driver-assistance feature, though.
Cameras
Video feeds are used for all sorts of driver-assistance features, including automated emergency braking, lane-keeping assist, and adaptive cruise control. A single camera provides a 2D image that a computer can use to detect things like lane lines and traffic signs as well as pedestrians or cyclists. Cameras can be set up to see nearby or far away.
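To give a flavor of how a computer pulls lane lines out of a flat image, here's a simplified sketch using the open-source OpenCV library: edge detection followed by a Hough transform, a classic recipe. Modern production systems lean on trained neural networks instead, and the file names here are just stand-ins.

```python
import cv2
import numpy as np

frame = cv2.imread("dashcam.jpg")  # stand-in for a camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)  # keep strong intensity gradients

# Search only the lower half of the frame, where the road usually is.
mask = np.zeros_like(edges)
mask[edges.shape[0] // 2:, :] = 255
edges = cv2.bitwise_and(edges, mask)

# Probabilistic Hough transform: fit straight segments to the edge pixels.
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
if lines is not None:
    for x1, y1, x2, y2 in lines.reshape(-1, 4):
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
cv2.imwrite("lanes.jpg", frame)
```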
Clever arrangements of multiple cameras provide depth perception, much as a pair of human eyes does. Subaru's aptly named EyeSight system uses a pair of cameras in such an arrangement as an inexpensive alternative to radar.
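The underlying geometry is straightforward: a nearby object shifts horizontally between the left and right images more than a distant one does, and that shift (the "disparity") converts directly to depth. The figures below are invented for illustration, not EyeSight's actual specifications:

```python
def stereo_depth(focal_length_px: float, baseline_m: float,
                 disparity_px: float) -> float:
    """Depth from stereo: depth = focal length × camera spacing / disparity."""
    return focal_length_px * baseline_m / disparity_px

# Cameras 0.35 m apart with a 1400-pixel focal length see an object
# shifted 14 pixels between the two images:
print(stereo_depth(1400, 0.35, 14))  # 35.0 meters away
```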
Cameras are also employed inside the vehicle for driver monitoring. These systems are generally paired with infrared emitters so the cameras can "see" at night and determine whether the driver is paying attention to the road. A similar setup with an infrared-assisted camera pointed outward can provide night vision on the exterior of the vehicle, helping spot people and animals in the dark. Direct sunlight, meanwhile, can cause issues for vision-based sensors.
One thing to keep in mind about cameras: They're often packaged behind the windshield or rear glass of a vehicle, so any replacement of that particular pane can become a minor pain. Be sure whoever replaces the glass can also perform the necessary calibration.
Lidar
Light detection and ranging, radar and sonar's light-emitting cousin, is one of the newer additions to the automotive sensing arsenal. Lidar sends out pulses of light, typically from eye-safe lasers, that map the surroundings in the form of a point cloud. The time it takes for a pulse to return, along with the speed of light, provides the distance component, while the change in distance from one scan to the next can be used to determine an object's speed. The result is a sort of topographical map of what's around the vehicle.
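Each return becomes one point in that cloud with a little trigonometry: the round-trip time gives range, and the unit's known firing angles give direction. A toy sketch, with invented numbers:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_point(round_trip_s: float, azimuth_rad: float, elevation_rad: float):
    """Convert one pulse's timing and firing angles into an (x, y, z) point."""
    r = SPEED_OF_LIGHT * round_trip_s / 2  # out and back, so halve
    return (r * math.cos(elevation_rad) * math.cos(azimuth_rad),
            r * math.cos(elevation_rad) * math.sin(azimuth_rad),
            r * math.sin(elevation_rad))

# A pulse returning after ~200 nanoseconds, fired dead ahead and level,
# becomes a point about 30 meters in front of the sensor.
print(lidar_point(200e-9, 0.0, 0.0))  # ≈ (30.0, 0.0, 0.0)
```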
Expensive spinning lidar units can be found atop and around Level 4 autonomous vehicles, such as those from Cruise and Waymo. More affordable static sensors, which have a smaller field of view, have started showing up on some series-production cars, including offerings from Audi, Lexus, and Mercedes.
In addition to being used on-vehicle, lidar sensors help make the HD maps that Level 2 and Level 3 automated systems use to more precisely determine their location relative to fixed references, which in turn tells them where lanes and other road attributes are. Relying on this previously mapped data spares the high cost of building a lidar sensor into the vehicle itself.
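In miniature, map-based localization works something like the sketch below: measure distances to a few landmarks whose coordinates the map already stores, then solve for the one position consistent with all of them. Real systems match entire point clouds; the landmark coordinates and range readings here are invented.

```python
import numpy as np

landmarks = np.array([[10.0, 0.0], [0.0, 15.0], [-8.0, -5.0]])  # from the map
ranges = np.array([np.sqrt(85), np.sqrt(170), np.sqrt(130)])    # measured

# Differencing the squared-range equations cancels the quadratic terms,
# leaving a linear system in the vehicle's unknown (x, y) position.
A = 2 * (landmarks[1:] - landmarks[0])
b = ((landmarks[1:] ** 2).sum(axis=1) - (landmarks[0] ** 2).sum()
     - ranges[1:] ** 2 + ranges[0] ** 2)
position, *_ = np.linalg.lstsq(A, b, rcond=None)
print(position)  # ≈ [1. 2.]: the vehicle sits at x=1 m, y=2 m on the map
```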
Sensor Fusion
You can think of an automated vehicle's subsystems like your favorite baseball team or a group of specialists: each has its strengths and weaknesses.
Sensor fusion is the practice of having multiple kinds of sensors (called "modalities" in the biz) work together to overcome their individual shortcomings. Fusion also provides some redundancy: if one sensor misses something, or is acting up, another can pick up the slack.
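One simple flavor of fusion: average two sensors' estimates of the same quantity, weighting each by how much it can be trusted. Production systems typically use Kalman filters and full object tracking, but the instinct, sketched below with invented numbers, is the same: lean on whichever modality is surer.

```python
def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Inverse-variance weighted average, plus the variance of the result."""
    w_a, w_b = 1 / var_a, 1 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1 / (w_a + w_b)

# Radar pegs the car ahead at 50.0 m (radar measures range precisely);
# the camera guesses 47.0 m with much more uncertainty. The fused estimate
# hews close to radar and is more certain than either source alone.
print(fuse(50.0, 0.25, 47.0, 4.0))  # ≈ (49.82, 0.24)
```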
As automakers develop complex automated systems with more functionality, sensor fusion is becoming increasingly common. The notable exception is Tesla, which has moved to a camera-only solution for its Autopilot and Full Self-Driving features. Expect other automakers to keep adding more, and more varied, sensors as development continues.