Self-driving cars lack the understanding of social cues in traffic that let human drivers decide whether to give way or drive on, a recent study finds.
Autonomous driving research has focused largely on how self-driving vehicles can analyze what other vehicles are doing. Traffic, however, has long been a social domain, which suggests that a poor understanding of social interactions may cause problems for autonomous vehicles.
“Self-driving cars are causing traffic jams and problems in San Francisco because they react inappropriately to other road users,” study lead author Barry Brown, a research professor of human-computer interaction at Stockholm University and a professor of human-centered computing at the University of Copenhagen, said in a statement.
To learn more about how autonomous vehicles actually perform on the road, Brown and his colleagues performed an in-depth analysis of 18 hours of footage of Waymo and Tesla self-driving cars, drawn from 70 YouTube videos filmed by enthusiasts testing the vehicles from the back seat. They focused on one core element of social interaction on the road: when people yield to each other and when they drive on.
The scientists found that self-driving cars have a particularly tough time understanding when to halt.
“The ability to navigate in traffic is based on much more than traffic rules,” Brown said in a statement. “Social interactions, including body language, play a major role when we signal each other in traffic. This is where the programming of self-driving cars still falls short. That is why it is difficult for them to consistently understand when to stop and when someone is stopping for them, which can be both annoying and dangerous.”
One video example showed a family of four standing by the curb of a residential street in the United States. There is no pedestrian crossing, but the family would like to cross the road. As the driverless car approaches, it slows, causing the two adults in the family to wave their hands as a sign for the car to drive on. Instead, the car stops right next to them for 11 seconds. Then, as the family begins walking across the road, the car starts moving again, causing them to jump back onto the sidewalk, whereupon the person in the back seat rolls down the window and yells, “Sorry, self-driving car!”
“The situation is similar to the main problem we found in our analysis and demonstrates the inability of self-driving cars to understand social interactions in traffic,” Brown said in a statement. “The driverless vehicle stops so as to not hit pedestrians, but ends up driving into them anyway because it doesn’t understand the signals. Besides creating confusion and wasted time in traffic, it can also be downright dangerous.”
In another instance, a self-driving car at an intersection let five cars go ahead of it even though it had the right of way. In that case, the autonomous vehicle failed to grasp the timing of the interaction, such as how the other cars' slowing was meant to signal an intention to yield, Brown explained in a presentation.
One reason that it may be proving so difficult to program self-driving cars to understand social interactions in traffic “is that we take the social element for granted,” Brown said in a statement. “We don’t think about it when we get into a car and drive — we just do it automatically.”
However, when it comes to designing autonomous driving systems, “you need to describe everything we take for granted and incorporate it into the design,” Brown said in a statement. “The car industry could learn from having a more sociological approach. Understanding social interactions that are part of traffic should be used to design self-driving cars’ interactions with other road users, similar to how research has helped improve the usability of mobile phones and technology more broadly.”
The scientists detailed their findings in April at the 2023 CHI Conference on Human Factors in Computing Systems in Hamburg, Germany, where the study won the conference’s best paper award.