Self-driving cars may one day navigate safely in bad weather with the help of a new radar system that sees with LiDAR-like clarity.
To navigate, autonomous vehicles often rely on LiDAR, which works by bouncing laser beams off surrounding objects. Although LiDAR can draw high-resolution 3D pictures on a clear day, it cannot see through fog, dust, rain and snow.
In contrast, radar can see in all weather, and is significantly cheaper than LiDAR. Automotive radar sensors currently help drivers avoid collisions and detect pedestrians and cyclists.
However, when radar sensors emit radio waves, only a small fraction of these signals ever get reflected back to the sensor. As such, pedestrians, vehicles and other surrounding objects only appear as sparse collections of points, leading to relatively poor image quality.
Now scientists have sharpened radar's vision using existing radar technology. The key is using multiple radar sensors to increase the number of signals reflected back. They found the optimal arrangement for autonomous vehicles was two radar sensors placed 1.5 meters apart on the hood of a car. With two sensors at different vantage points, the system can detect objects within their overlapping fields of view.
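The geometry of that overlap can be sketched in a few lines of code. In this illustrative model, two forward-facing sensors sit 1.5 meters apart (the spacing reported in the article) and a point counts as jointly visible if it falls inside both sensors' cones. The 120-degree field of view is an assumption for illustration; the article does not specify the sensors' actual coverage angle.

```python
# Illustrative sketch: which points fall in the overlapping field of view
# of two forward-facing radar sensors mounted 1.5 m apart on a car hood.
# The 120-degree field of view is an assumed value, not from the article.

import math

SENSOR_SPACING = 1.5          # meters between the two sensors (per article)
HALF_FOV = math.radians(60)   # assumed 120-degree total field of view

# Sensors sit at x = ±spacing/2, both facing forward along +y.
SENSORS = [(-SENSOR_SPACING / 2, 0.0), (SENSOR_SPACING / 2, 0.0)]

def in_fov(sensor, point):
    """True if `point` lies within this sensor's forward-facing cone."""
    dx, dy = point[0] - sensor[0], point[1] - sensor[1]
    if dy <= 0:                                  # behind the sensor
        return False
    return abs(math.atan2(dx, dy)) <= HALF_FOV   # angle off boresight

def seen_by_both(point):
    """True if the point lies in the two sensors' overlapping coverage."""
    return all(in_fov(s, point) for s in SENSORS)

print(seen_by_both((0.0, 10.0)))   # straight ahead: visible to both
print(seen_by_both((20.0, 1.0)))   # far off to the side: outside the overlap
```

Points inside the overlap return echoes to both sensors, which is what makes the cross-checking described below possible.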
“We’re not using one high-beam light, but multiple low-beam lights to better light up objects,” said researcher Dinesh Bharadia at the University of California, San Diego. “That eliminates a whole lot of blindness.”
The new system overcomes another common problem with radar: noise. Radar images commonly contain random points that do not belong to any object, often due to echo signals — radio waves that get reflected off the ground, walls or other objects before sensors receive them. Since more radar sensors mean more noise, the researchers developed new algorithms that compare information from the two sensors to produce a new image free of noise.
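One simple way such cross-checking can work is to keep only the points that both sensors agree on. The sketch below is a minimal illustration of that idea, not the researchers' actual algorithm: points from one sensor are retained only if the other sensor reports a point nearby, so isolated multipath "ghosts" seen by a single sensor are discarded. The coordinates and the 0.5-meter matching tolerance are assumed values.

```python
# Minimal sketch of cross-sensor noise rejection (not the published method):
# keep a point from sensor A only if sensor B reports a point nearby.
# All coordinates and the tolerance are illustrative assumptions.

import math

def cross_validate(points_a, points_b, tol=0.5):
    """Keep points from sensor A that have a match in sensor B.

    points_a, points_b: lists of (x, y) positions in a shared
    vehicle-centric frame, in meters. tol: match radius in meters.
    """
    kept = []
    for pa in points_a:
        # A real detection should produce echoes in both sensors' views;
        # an isolated point seen by only one sensor is treated as noise.
        if any(math.dist(pa, pb) <= tol for pb in points_b):
            kept.append(pa)
    return kept

# A small cluster (e.g. a car) seen by both sensors, plus one multipath
# ghost point that only sensor A reports.
sensor_a = [(5.0, 2.0), (5.2, 2.1), (5.1, 1.9), (12.0, -8.0)]  # last = ghost
sensor_b = [(5.05, 2.05), (5.15, 2.0), (5.0, 1.95)]

print(cross_validate(sensor_a, sensor_b))  # the ghost at (12.0, -8.0) is gone
```

The real system has to solve a harder version of this problem (moving objects, registration between sensor frames, dense returns), but the underlying principle of requiring agreement between vantage points is the same.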
During test drives on clear days and nights, the new radar system performed as well as a LiDAR sensor at determining the size and shape of cars moving in traffic. In a test simulating foggy weather, the system accurately predicted the 3D geometry of another vehicle hidden from view using a fog machine, while the LiDAR sensor essentially failed the test.
“When we started this work, we didn’t expect radars to work that well,” Bharadia said.
The researchers are now working with Toyota to combine their new radar system with cameras, an approach they suggest could one day replace expensive LiDAR systems. “Cameras and radars are sensors that are already becoming increasingly widely deployed on cars, and are also low-cost and well-engineered,” Bharadia said. “These two sensing modalities are complementary—cameras can help characterize the context of a scene, whereas radar can tell you where other things are.”
The scientists detailed their findings at the SenSys conference on Nov. 18.
Photo credit: Kshitiz Bansal.