LAS VEGAS—Automobiles and trucks are increasingly equipped with a variety of sensors, many of which can be harnessed to aid in navigation for autonomous vehicles, Hexagon officials said at the recent HxGN Live event here.
While the classic pairing of GNSS/INS can reliably guide vehicles most of the time, there are instances when GNSS isn’t available, leaving INS to carry the load, said Terry Lamprecht, portfolio manager of third-party sensors at Hexagon.
“GNSS struggles sometimes, like in parking structures, near big buildings, like in Vegas,” he said. Inertial guidance can handle the job when GPS or other satellite signals aren’t available, but it drifts over time, and its errors can grow to as much as 15 meters, which wouldn’t work on most roads.
That’s where other sensor suites on board cars and trucks can step in. Cameras can also be used, Lamprecht said, and are “great, cost effective, but they struggle. Lighting is a big concern, it has to be illuminated well for them to work,” and they have trouble determining how far away potential obstacles are.
Radar, LiDAR and optical sensors can together provide 360-degree coverage around a vehicle. There are other sensors as well, including wheel speed sensors, electronic stability control, steering angle sensors, throttle pedal sensors, anti-lock brakes and even ground-penetrating radar, all of which can feed information to the INS to aid in positioning.
“Our approach to sensor fusion is that the IMU [inertial measurement unit] is at the heart. IMU always available. Every other sensor will have periods where it’s unavailable,” said Ryan Dixon, senior manager of sensor fusion and autonomy. “You take that core engine that’s really well established, GNSS/INS, and feed in the others.”
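The architecture Dixon describes can be sketched in one dimension: the IMU propagates the state every step, while an aiding sensor such as GNSS corrects it only when a measurement is available. This is a minimal illustrative toy, not Hexagon's implementation; the function names, gain and step size are assumptions.

```python
# Toy 1-D sketch of IMU-at-the-heart fusion: the IMU is integrated every
# step, and GNSS (or any aiding sensor) corrects the estimate only when
# a fix is present. Names and constants are illustrative assumptions.

def fuse(imu_accels, gnss_positions, dt=0.1, gain=0.5):
    """Dead-reckon from IMU accelerations; blend in GNSS fixes when present.

    imu_accels:     one acceleration sample per step (always available)
    gnss_positions: one position fix per step, or None during an outage
    """
    pos, vel = 0.0, 0.0
    track = []
    for accel, fix in zip(imu_accels, gnss_positions):
        vel += accel * dt            # IMU propagation (prediction)
        pos += vel * dt
        if fix is not None:          # aiding correction, when available
            pos += gain * (fix - pos)
        track.append(pos)
    return track
```

With fixes present the error stays bounded; hand the loop `None` fixes and a small accelerometer bias, and the position estimate drifts unchecked, which is the failure mode the other vehicle sensors are meant to cover.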
All those other sensors need to have a common reference point, such as the middle of an axle, and need to be time synchronized. If the vehicle gets bumped in an accident, they would all need to be recalibrated.
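The two requirements above, a shared reference point and time synchronization, can be illustrated with a small sketch: the sensor's mounting offset (its "lever arm") is rotated by the vehicle heading and removed from the measurement, and timestamped samples are interpolated to a common epoch. The frame conventions and helper names here are assumptions for illustration only.

```python
import math

# Hedged sketch of referencing a sensor to a common vehicle point (e.g.
# the middle of an axle). The lever arm is the sensor's mounting offset
# in the vehicle body frame; conventions here are illustrative assumptions.

def to_reference_frame(sensor_xy, lever_arm_xy, heading_rad):
    """Shift a sensor's position measurement to the vehicle reference point."""
    lx, ly = lever_arm_xy
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    # rotate the body-frame lever arm into the navigation frame
    off_x = c * lx - s * ly
    off_y = s * lx + c * ly
    return (sensor_xy[0] - off_x, sensor_xy[1] - off_y)

def sync_to_time(t, t0, v0, t1, v1):
    """Linearly interpolate two timestamped samples to a common epoch t."""
    w = (t - t0) / (t1 - t0)
    return v0 + w * (v1 - v0)
```

An accident that shifts a sensor changes its lever arm, which is why Lamprecht notes the suite would need recalibration after a bump.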
Hexagon did some testing in its sensor-equipped Lexus RX450h, including by taking GNSS away for up to half an hour, which Dixon said is a “ridiculous case of taking GNSS away, way longer than you would normally see.”
INS by itself was “quite far off,” he said. Adding in a single odometer reduced the error rate, but adding in data from the vehicle’s sensors reduced it even more. “It makes a pretty big difference just to add the stuff that’s already on the vehicle.”
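Why a single odometer helps so much can be seen in a toy model: an uncorrected accelerometer bias makes INS position error grow quadratically, while a wheel-speed measurement directly observes velocity and caps that growth. The bias value, gain and speeds below are arbitrary assumptions, not Hexagon's test numbers.

```python
# Toy comparison of dead-reckoning error: INS alone versus INS aided by
# a wheel-speed odometer. Model and constants are illustrative assumptions.

def drift_demo(steps=100, dt=0.1, accel_bias=0.05, true_speed=10.0):
    """Return (INS-only position error, odometer-aided position error)."""
    pos_ins = pos_aided = truth = 0.0
    vel_ins = vel_aided = true_speed
    for _ in range(steps):
        truth += true_speed * dt
        vel_ins += accel_bias * dt                    # biased IMU integration
        pos_ins += vel_ins * dt
        vel_aided += accel_bias * dt
        vel_aided += 0.5 * (true_speed - vel_aided)   # odometer correction
        pos_aided += vel_aided * dt
    return abs(pos_ins - truth), abs(pos_aided - truth)
```

The aided error stays far smaller because the velocity error is bounded rather than accumulating, mirroring Dixon's point that sensors already on the vehicle make “a pretty big difference.”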
Adding LiDAR and radar, including ground-penetrating radar for mapping, “vastly improved INS performance,” he said. Adding a network of radio frequency transmitters around the test parking lot meant the vehicle could operate without GNSS indefinitely.
LiDAR mapping is very precise, Lamprecht said, and radar is getting better, so that “we’re starting to be able to make out features like with LiDAR.”
Ground-penetrating radar has benefits as well, as it’s picking up infrastructure and other features under the surface that typically don’t change.
Dixon said that at the moment, some of the sensors Hexagon is working with are more likely to be found on off-road vehicles, but the company has repeatedly said the road to autonomy will begin off-road.