VEMI Lab researchers earn federal prize

Invitation to the White House for software that makes self-driving cars more accessible

The VEMI Lab at the University of Maine earned third place in a national competition and an invitation to the White House for developing an inclusive smartphone software platform that will provide navigational assistance to people with visual impairments and seniors who want to use self-driving cars for ride-sharing and ride-hailing services.

VEMI will receive $300,000 for its third-place finish in the second phase of the U.S. Department of Transportation’s Inclusive Design Challenge for its Autonomous Vehicle Assistant (AVA) smartphone technology. VEMI leads the Autonomous Vehicle Research Group (AVRG), which is designing the AVA platform and also includes collaborators from Northeastern University and Colby College.

For its challenge, the DOT sought proposals for accessible and inclusive design solutions that would help people with disabilities use autonomous vehicles for employment and essential services. VEMI was invited to participate in Stage II of the challenge after being named a semifinalist in the first phase.  

First- and second-place winners in the competition were Purdue University and AbleLink Smart Living Technologies, respectively.

The prizes in the Inclusive Design Challenge were announced July 26 as part of the DOT’s celebration of the 32nd anniversary of the Americans with Disabilities Act, known as the ADA. VEMI director Richard Corey and chief research scientist Nicholas Giudice are participating in a ceremony at the department’s headquarters in Washington, D.C., followed by a policy session hosted by the Office of Public Engagement and the Office of Science and Technology Policy on July 27 at the White House.

“This is exciting national recognition of one of the outstanding, ongoing research and development initiatives from our VEMI Lab,” says UMaine President Joan Ferrini-Mundy. “Such innovation addressing needs, including those for equity and inclusion, and providing critical advancements in technology are among our R1 research university achievements.”

“The whole VEMI Lab team is ecstatic to receive the support and national recognition for our innovation,” Corey says. “This prize will help support our students who are leading the charge in human vehicle collaboration research and accessible technology at VEMI. Working with the IDC team at the U.S. DOT has been a delight and we are deeply honored to have been selected as top tier winners.” 

Designing the project and earning the prize was a collaborative effort among VEMI staff, students and external partners: Grant Beals, Paul Fink, Aubree Nygaard and Raymond Perry from VEMI; Xue (Shelley) Lin, Pu Zhao and Yushu Wu from Northeastern University; and Stacy Doore and Matthew Maring from Colby College.

AVA will help users request a vehicle, find it, enter it, exit it and travel to their chosen destination. It provides a multisensory interface that guides users through audio and haptic, or touch-based, feedback and high-contrast visual cues. To provide that functionality, the researchers used GPS, LiDAR, gyroscope and accelerometer data; real-time computer vision via the smartphone camera; machine learning; artificial intelligence; and other software.
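To give a sense of how a multisensory interface of this kind might route a single piece of guidance to audio, haptic and visual channels, the sketch below is purely illustrative: the names GuidanceCue, Channel and dispatch are hypothetical and are not drawn from the AVA codebase, and the real platform would call the phone's text-to-speech, vibration and rendering services rather than printing strings.

from dataclasses import dataclass
from enum import Enum, auto


class Channel(Enum):
    """Output channels a multisensory guidance cue might use (illustrative only)."""
    AUDIO = auto()    # spoken directions
    HAPTIC = auto()   # vibration patterns
    VISUAL = auto()   # high-contrast overlay cues


@dataclass
class GuidanceCue:
    message: str    # e.g. "Vehicle is 20 meters ahead on your right"
    channels: set   # which channels should present this cue
    urgency: float  # 0.0 (informational) to 1.0 (immediate hazard)


def dispatch(cue: GuidanceCue) -> list[str]:
    """Fan a single cue out to each requested channel.

    Hypothetical sketch: a production app would call platform APIs for
    speech, vibration and on-screen overlays; here we return strings so
    the example runs anywhere.
    """
    outputs = []
    if Channel.AUDIO in cue.channels:
        outputs.append(f"[speak] {cue.message}")
    if Channel.HAPTIC in cue.channels:
        # Stronger vibration for more urgent cues (assumed scaling).
        outputs.append(f"[vibrate] intensity={cue.urgency:.1f}")
    if Channel.VISUAL in cue.channels:
        outputs.append(f"[overlay] high-contrast cue: {cue.message}")
    return outputs


if __name__ == "__main__":
    cue = GuidanceCue(
        message="Vehicle arriving: white sedan, 20 meters ahead",
        channels={Channel.AUDIO, Channel.HAPTIC, Channel.VISUAL},
        urgency=0.3,
    )
    for line in dispatch(cue):
        print(line)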

Users will create a profile in AVA that reflects their accessibility needs and existing methods of navigation so the software can find suitable transportation for them. When the vehicle arrives, AVA will guide the user to it using the camera and augmented reality (AR), superimposing high-contrast lines over the image on the smartphone screen to highlight the path, along with verbal guidance such as compass directions, street names, addresses, nearby landmarks and other indicators. The software also will pinpoint environmental hazards, including low-contrast curbs, traffic cones and overhanging obstructions like branches and guy wires, emphasizing them with contrasting lines and vibrations as users approach. It will then help users find the door handle of the waiting vehicle, and it uses the same functions to help users find their destination after exiting.
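As a rough sketch of how proximity-triggered hazard alerts like these could work, the snippet below compares the user's GPS position against a list of detected hazards and emits a warning inside a small alert radius. Everything in it is assumed for illustration: the hazard list, coordinates, alert radius and the helper names haversine_m and hazards_near are invented, and a real system would take hazard positions from live computer-vision detections rather than a hard-coded table.

import math

# Illustrative hazard list: (label, latitude, longitude). In a real system
# these would come from computer-vision detections, not hard-coded values.
HAZARDS = [
    ("low-contrast curb", 44.90120, -68.66890),
    ("traffic cone", 44.90150, -68.66855),
]

ALERT_RADIUS_M = 5.0  # hypothetical distance at which a warning fires


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6_371_000  # Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


def hazards_near(user_lat, user_lon):
    """Return warnings for hazards inside the alert radius."""
    warnings = []
    for label, lat, lon in HAZARDS:
        d = haversine_m(user_lat, user_lon, lat, lon)
        if d <= ALERT_RADIUS_M:
            warnings.append(f"Caution: {label} about {d:.0f} meters ahead")
    return warnings


if __name__ == "__main__":
    # User position a few meters from the first hazard (made-up coordinates).
    for warning in hazards_near(44.90123, -68.66892):
        print(warning)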

AVA will offer accessible modules with simulations that train users not only in how to use the application, but also in how to interact with ride-sharing and ride-hailing services that use self-driving vehicles once a human driver is no longer available to provide assistance. In future projects, researchers plan to develop additional software that will allow riders to connect with the vehicle control systems while riding. These tools will include multisensory maps, context-aware gesture interactions and application programming interfaces, all of which will support in-cabin accessibility.

Watch the video presentation about AVA prepared by VEMI Lab researchers for the DOT’s Inclusive Design Challenge to learn more. 

“Autonomous vehicles have the potential to be a truly game changing, disruptive technology for improving accessible, inclusive transportation for people with visual impairments and older adults,” says Giudice, also a UMaine professor of spatial computing and congenitally blind. “However, to succeed, there are a lot of challenges to overcome first. Our initial research and development of AVA in the first IDC semi-finalist round has made significant progress in addressing current limitations, but I am most excited about our future development made possible by this finalist IDC prize, which will lead to a robust, end-to-end inclusive travel solution that integrates with other accessible apps and platforms.” 

The AVA project builds on a National Science Foundation grant led by Giudice and Corey on trust building and human-vehicle collaboration with autonomous vehicles, as well as a seed grant-funded joint effort between UMaine and Northeastern University to improve accessibility, safety and situational awareness within self-driving vehicles. Research on both projects aims to develop a new model of human-AI vehicle interaction so that people with visual impairments and seniors can better understand what their autonomous vehicle is doing during their travels, and so the vehicles can communicate with them effectively. That work will be instrumental in informing future AVA development under this Inclusive Design Challenge prize.

AVA serves as one example of the broad AI, computing and information systems research VEMI scientists and others are conducting at UMaine. Their work exemplifies the research and public service missions of the top-tier R1 university, a designation UMaine earned earlier this year from the Carnegie Classification of Institutions of Higher Education. 

VEMI, co-founded by Corey and Giudice in 2008, develops technological solutions to unmet challenges. Its primary areas of research and development include self-driving vehicles, bio-inspired tools that improve human-machine interaction and functionality, and new information-access technology that improves inclusive environmental awareness, spatial learning and wayfinding for both sighted and visually impaired navigators.

Contact: Marcus Wolf, 207.581.3721; marcus.wolf@maine.edu