Smartphone App May Help Seniors, People With Disabilities Enjoy Robo-Taxis

by Charles Choi

A new smartphone app under development may one day help seniors and people with disabilities enjoy rides with self-driving cars.

Self-driving cars promise on-demand taxi service without a human behind the wheel. However, many people with visual impairments who currently use ride-hailing and ride-sharing services rely on human drivers to safely escort them to their vehicles, and robo-taxis will have no driver to serve as such a guide.

Now scientists at the University of Maine are developing a smartphone app to help seniors and people with disabilities better enjoy ride-hailing and ride-sharing with both driverless and standard vehicles. The researchers are developing the app with teams at Northeastern University and Colby College, using $300,000 in funding from the U.S. Department of Transportation awarded through its Inclusive Design Challenge.

“I’m just excited in general about trying to make autonomous vehicles more accessible and inclusive,” said experimental psychologist Nicholas Giudice, founder and chief research scientist at the University of Maine’s Virtual Environments and Multimodal Interaction Laboratory, who himself is congenitally blind. “This work can really end up helping not just blind people and older people, but other people too. Instead of calling for an autonomous ride from a concert and not knowing which of 20 cars is yours since they all look the same, you can have them beep their horn or flash their lights.”

With the app, dubbed the Autonomous Vehicle Assistant (AVA), users will first create a profile that reflects their needs and existing methods of navigation. The app can then match them to a suitable vehicle for transport and determine if one is available.
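The team has not published AVA’s data model, but the profile-and-match step might look something like the minimal Swift sketch below; the field names and the `findMatch` helper are hypothetical, not AVA’s actual code.

```swift
// Hypothetical rider profile; the fields are illustrative, not AVA's schema.
struct RiderProfile {
    let usesWheelchair: Bool
    let lowVision: Bool
}

struct Vehicle {
    let id: String
    let wheelchairAccessible: Bool
    let available: Bool
}

// Return the first available vehicle that satisfies the rider's needs, if any.
func findMatch(for rider: RiderProfile, in fleet: [Vehicle]) -> Vehicle? {
    fleet.first { vehicle in
        vehicle.available && (!rider.usesWheelchair || vehicle.wheelchairAccessible)
    }
}
```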

When a vehicle arrives, AVA will use GPS, the smartphone’s camera and augmented reality (AR) to guide the user to it, superimposing high-contrast arrows and lines over the phone’s display of the environment to highlight the path the user needs to take. AVA will also supply verbal guidance, such as compass directions, street names, addresses and nearby landmarks, and use smartphone vibrations to help lead users where they should go. “We want to use more multi-sensory information instead of just vision,” Giudice said.
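The article does not say how AVA implements these cues, but on iOS the verbal and vibration channels map onto standard frameworks: AVFoundation speech synthesis and UIKit haptics. A minimal sketch, with a hypothetical `announceTurn` helper:

```swift
import AVFoundation
import UIKit

let synthesizer = AVSpeechSynthesizer()
let haptics = UIImpactFeedbackGenerator(style: .heavy)

// Speak a direction aloud and pulse the phone: the two non-visual
// channels described above, alongside the AR arrows on screen.
func announceTurn(_ instruction: String) {
    synthesizer.speak(AVSpeechUtterance(string: instruction))
    haptics.prepare()
    haptics.impactOccurred()
}

announceTurn("Head north on Main Street; your vehicle is 40 feet ahead.")
```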

The app will also pinpoint environmental hazards, such as hard-to-see curbs, by emphasizing them with contrasting lines and vibrating when users approach them. It will then use image-recognition algorithms to identify the handle on a vehicle’s door to help users enter the taxi.
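The article names image recognition without specifying a stack; a common approach on iOS is Apple’s Vision framework running a custom-trained Core ML detector. In the sketch below, `DoorHandleDetector` stands in for such a model and is an assumption, not something the team has confirmed:

```swift
import CoreGraphics
import CoreML
import Vision

// Run a (hypothetical) door-handle detector over a camera frame and
// report where the handle sits in the image.
func locateDoorHandle(in image: CGImage) throws {
    // "DoorHandleDetector" stands in for a custom-trained Core ML model.
    let model = try VNCoreMLModel(for: DoorHandleDetector(configuration: MLModelConfiguration()).model)
    let request = VNCoreMLRequest(model: model) { request, _ in
        guard let results = request.results as? [VNRecognizedObjectObservation] else { return }
        for observation in results where observation.confidence > 0.8 {
            // boundingBox uses normalized coordinates (0...1) within the frame.
            print("Door handle at \(observation.boundingBox)")
        }
    }
    try VNImageRequestHandler(cgImage: image).perform([request])
}
```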

“Instead of just providing guidance to go down the block, it can give landmarks based on the GPS location, so it can say the car is, for instance, out by the west entrance in front of the McDonald’s sign,” Giudice said. “We are also working on having the system talk with the car for the car to improve guidance once a person gets close enough to it.”
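Landmark callouts of that kind presumably key off the phone’s GPS fix. A minimal CoreLocation sketch of the proximity check; the coordinates, the 30-meter threshold and the announcement text are made-up values:

```swift
import CoreLocation

// Illustrative pickup point; coordinates are made up.
let pickup = CLLocation(latitude: 44.9012, longitude: -68.6672)

// Announce a landmark once the rider is within range of the pickup point.
func checkProximity(of user: CLLocation) {
    let meters = user.distance(from: pickup)
    if meters < 30 {
        print("Your car is by the west entrance, in front of the McDonald's sign.")
    } else {
        print(String(format: "Keep walking toward the pickup point: %.0f meters to go.", meters))
    }
}
```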

The scientists are also investigating ways to make the cabins of autonomous vehicles more accessible to everyone. “Very often they’re touchscreen-based, which is obviously a problem for people who have vision loss like me,” Giudice said. “We want to see how voice and gesture interfaces can lead to more natural interactions, which is something that can help lots of people, but particularly under-represented groups often completely neglected in driving.”
