Why Self-Driving Cars Need Superhuman Senses – WIRED
More than any other benefit, self-driving vehicles promise to save lives. Cutting out the human error that causes 90 percent of crashes could start to save some of the 35,000 lives lost on American roads every year. Manufacturers are convinced that people will happily use at least partially autonomous cars once they’re proven to be safer than human drivers, but that’s a pretty low bar. The ultimate goal is to eliminate crashes altogether, and to do that, cars will need to perfectly perceive and understand the world around them—they’ll need superhuman senses.
Pretty much every AV now in testing uses some combination of cameras, radar, and lidar laser systems. Now an Israeli startup wants to add a new tool to the mix: heat-detecting infrared cameras that can pick out pedestrians from hundreds of feet away.
A fully driverless car, after all, will need to see the world in a wide variety of lighting and weather conditions. “Existing sensors and cameras available today can’t meet this need on their own,” said AdaSky CEO Avi Katz in a statement. So this morning, his company announced its plan to offer automakers what it calls Viper, a long-distance infrared camera and accompanying computer vision system.
Today’s sensors offer a detailed, 360-degree view of the world, but each has its weak points. Cameras don’t work well at night or in dazzling sunlight. Lidar has trouble with rain, fog, and dust, because the laser bounces off particles in the atmosphere. Radar can be confused by small but highly reflective metal objects, like a soda can in the street.
Even systems that combine data from all three sensors can struggle with images of humans on billboards, or in adverts on other vehicles, as recently shown by Cognata, which simulates training environments for driverless car brains. That’s where AdaSky thinks its sensor can pitch in: if a human-shaped object is giving off heat, it’s probably a real person, not a picture.
“When we have heat radiating from something, and you figure out that it’s a person or an animal, then that tells you there’s the potential for unpredictable behavior,” says Jeff Miller, who studies autonomous vehicles at USC. Instead of just knowing there’s an object on the right-hand side of the road, a car that perceived it was a deer would proceed more cautiously.
(The system might even help cars figure out those kid-shaped, terribly dressed bollards that recently showed up in the UK.)
Perception isn’t just about sight, or even the outside world. Waymo, née Google’s self-driving project, recently announced it now uses upgraded microphones to listen for police sirens, for example. Cadillac stuck an infrared camera on the steering wheel to monitor the driver’s state of awareness when its cars are in semi-autonomous mode.
This process of finding the right mix of sensors will likely never stop evolving, as new technologies become available and car companies puzzle over factors like cost, availability, and durability. Because until the day that cars are 100 percent safe, passengers will have to be convinced that the vehicle they’re climbing into is at least better at handling any situation than a human driver. And when it comes to making cars superhuman, the better they see, the better they drive.