Autonomous vehicles have a lot to learn about the tricks of the on-road driving trade—but perhaps even more about the challenges of off-road driving. And the lessons learned may carry over from one environment to another.
Researchers from the Oxford Robotics Institute (ORI) have loaded a Range Rover with sensors and cameras and taken it to the rugged Highlands, driving around Ardverikie Estate in central Scotland.
The vehicle is equipped with vision, LiDAR and other sensors, gathering data to test localization and perception algorithms for autonomy in challenging environments.
“Nearly all the main driving data sets are in towns and cities,” said Oliver Bartlett of ORI, “and you can train your car to drive those. But if you take them outside that environment, you’ve got no idea how they are going to perform.”
“A lot of the time we assume that the robot moves on a flat world, like a two-dimensional plane,” said Matthew Gadd, one of the researchers on the Sense-Assess-eXplain (SAX) project. “All those assumptions just break down when you’re going up a scree, or down a muddy slope.”
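To make the flat-world point concrete, the sketch below contrasts a planar pose with the full six-degree-of-freedom pose a vehicle actually has on rough terrain. It is purely illustrative and not drawn from the SAX codebase; the names PlanarPose, FullPose and flatten are invented for the example.

```python
import math
from dataclasses import dataclass

@dataclass
class PlanarPose:
    """Flat-world assumption: position and heading on a 2-D plane."""
    x: float
    y: float
    yaw: float  # heading, radians

@dataclass
class FullPose:
    """Full 6-DoF pose needed once the terrain is no longer flat."""
    x: float
    y: float
    z: float      # elevation change on a slope
    roll: float   # tilt across the slope
    pitch: float  # tilt along the slope
    yaw: float

def flatten(pose: FullPose) -> PlanarPose:
    """Project a 6-DoF pose onto the plane, discarding slope information.

    On a scree or muddy slope the discarded z, roll and pitch are exactly
    what perception and planning need, which is why the planar assumption
    breaks down there.
    """
    return PlanarPose(pose.x, pose.y, pose.yaw)

# Example: 100 m driven up a 30-degree slope projects to only about 86.6 m
# on the map, so planar odometry under-reports the distance travelled.
print(100.0 * math.cos(math.radians(30.0)))
```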
Weather can also interfere with autonomous vehicle perception and performance. “When you are out and about in a car in the Highlands, it looks very different to the south of England, and what the environment looks like is also useful for us,” added Gadd.
The SAX vehicle “sense[s] the world through a set of unconventional but complementary sensors – including Frequency-Modulated Continuous-Wave (FMCW) scanning radar and acoustic sensors – that will allow us to perceive and interpret the environment in novel ways beyond the current state-of-the-art. These alternative sensing methods will allow us to make robust perceptions where traditional sensing modalities might fail under severe weather conditions. We take the view that these new additional modalities, so rarely used, offer both an axis of assurance and validation vis-à-vis conventional established techniques and an expansion of the operating envelope. In particular, we focus on the perception of driving surfaces in on-road and off-road scenarios under various weather (including torrential rain and snow) and lighting conditions using radar as well as the interpretation of complex, unstructured environments using auditory sensing. Finally, to increase the vehicle’s environmental awareness we sense the environment through a set of available data services such as rain radar (from weather services) and satellite imagery.”
— from “Sense–Assess–eXplain (SAX): Building Trust in Autonomous Vehicles in Challenging Real-World Driving Scenarios,” by Matthew Gadd, Daniele De Martini, Letizia Marchegiani, Paul Newman and Lars Kunze. In Proceedings of the IEEE Intelligent Vehicles Symposium (IV), Workshop on Ensuring and Validating Safety for Automated Vehicles (EVSAV), 2020.
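As a rough illustration of the redundancy the authors describe, the sketch below shows one way complementary modalities could back each other up when weather degrades the camera. It is a hypothetical toy example, not the SAX system; SurfaceEstimate, pick_surface and the modality labels are invented for the illustration.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SurfaceEstimate:
    """One sensor's guess at the driving surface, with a confidence score."""
    label: str         # e.g. "tarmac", "gravel", "mud", "snow"
    confidence: float  # 0.0 to 1.0
    modality: str      # "camera", "radar", "audio", ...

def pick_surface(estimates: list[SurfaceEstimate],
                 degraded_modalities: set[str]) -> Optional[SurfaceEstimate]:
    """Prefer whichever modality is still trustworthy.

    In heavy rain or snow the camera is treated as degraded and the radar
    and audio estimates carry the decision; in clear conditions they act
    as a cross-check on the camera instead.
    """
    usable = [e for e in estimates if e.modality not in degraded_modalities]
    if not usable:
        return None
    return max(usable, key=lambda e: e.confidence)

# Example: torrential rain degrades the camera, so the radar estimate wins.
estimates = [
    SurfaceEstimate("tarmac", 0.9, "camera"),
    SurfaceEstimate("gravel", 0.7, "radar"),
    SurfaceEstimate("gravel", 0.6, "audio"),
]
print(pick_surface(estimates, degraded_modalities={"camera"}).label)  # gravel
```

In good weather the extra modalities simply corroborate the camera; in bad weather they keep the surface estimate alive, which is the sense in which the quote describes them as both an axis of assurance and an expansion of the operating envelope.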
The SAX project is part of the Assuring Autonomy International Programme (AAIP), which is funded by the Lloyd’s Register Foundation, and is led by Professor Paul Newman and Dr Lars Kunze of ORI. Part of Oxford University’s Department of Engineering Science, ORI comprises collaborating and integrated groups of researchers, engineers and students dedicated to research in robotics, artificial intelligence, systems engineering, and related fields.