Autour

Autour is an eyes-free mobile system designed to give blind users a better sense of their surroundings. The goal of Autour is to use audio to reveal the kind of information that visual cues such as neon signs provide to sighted users. Once users notice a point of interest, additional details are available on demand. The design employs spatialized audio rendering to convey relevant location-based content, which may include information about the user's immediate surroundings, such as restaurants, cultural sites, public transportation, and other points of interest.
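
As a rough illustration of the underlying idea (and not Autour's actual implementation), the sketch below shows one way a point of interest could be placed in the listener's auditory space: compute its bearing and distance from the user's position and heading, then map these to a stereo pan and a distance-based gain. The function names, example coordinates, and roll-off constant are hypothetical.

    import math

    def bearing_and_distance(user_lat, user_lon, poi_lat, poi_lon):
        """Approximate bearing (degrees from north) and distance (metres)
        between two nearby lat/lon points, using a flat-earth approximation."""
        d_lat = math.radians(poi_lat - user_lat)
        d_lon = math.radians(poi_lon - user_lon) * math.cos(math.radians(user_lat))
        distance = 6371000.0 * math.hypot(d_lat, d_lon)
        bearing = math.degrees(math.atan2(d_lon, d_lat)) % 360.0
        return bearing, distance

    def spatialize(bearing, distance, user_heading):
        """Map a POI's bearing and distance to constant-power stereo gains.
        Azimuth is measured relative to the direction the user is facing."""
        azimuth = (bearing - user_heading + 180.0) % 360.0 - 180.0  # -180..180 degrees
        pan = max(-1.0, min(1.0, azimuth / 90.0))                   # -1 = left, +1 = right
        left = math.cos((pan + 1.0) * math.pi / 4.0)
        right = math.sin((pan + 1.0) * math.pi / 4.0)
        attenuation = 1.0 / (1.0 + distance / 50.0)                 # simple distance roll-off
        return left * attenuation, right * attenuation

    # Example: a cafe slightly ahead and to the right of a north-facing user.
    bearing, distance = bearing_and_distance(45.5048, -73.5772, 45.5052, -73.5765)
    print(spatialize(bearing, distance, user_heading=0.0))

In practice, a full implementation would use binaural rendering rather than simple panning, but the same bearing-to-azimuth mapping is what lets a nearby restaurant sound as if it is "off to the right" as the user turns.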

As part of our development efforts, I had the opportunity to collaborate with Dalia El-Shimy, Florian Grond, and Jeff Blum on the design of a technique to represent points of interest surrounding the user through spatialized audio. To test our model, we conducted an experiment comparing two methods of rendering the spatialized audio content (described in El-Shimy et al., 2011). Ten visually impaired users were asked to navigate around a map by moving a finger along the surface of an iPod Touch, and to position themselves as accurately as possible between two target locations. Users found the spatialized audio content useful in helping them locate various places of interest on the map, and expressed a strong interest in using Autour in the future to help them gain awareness of their surroundings.

For this project we also ran a short ethnographic study with the blind community, designed testing protocols, and ran usability test sessions in the streets of Montreal. Some of the usability testing sessions (done in collaboration with Sabrina Panëels) examined Autour as a stand-alone mobile application. This helped us refine various aspects of the system, including the gestural interactions and the types of auditory display used to describe places of interest. To me, the following quotes from our participants capture what Autour represents to a blind person, and also reflect some of the challenges to be addressed in future iterations:

"I love the fact that there is a lot of information that I am missing without it. I think in the future if I am going to explore a new area, I definitely what to know what is around me, so that I know where to go… like everybody else, you know…"

"I felt a bit more pressure at the street crossings… but in terms of walking, … with it, I will feel more at ease because I know where I am, and would encourage me to walk more..."

 

This project has been developed at the Shared Reality Lab, led by Prof. Jeremy R. Cooperstock at McGill University. The project is funded primarily by the Secrétariat du Conseil du trésor of Québec through its "Appui au passage à la société de l'information" program, with extensions supported by a Google Faculty Research Award.

Publications

Panëels, S., Olmos, A., Blum, J., and Cooperstock, J. R. (2013). "Listen to It Yourself! Evaluating Usability of 'What's Around Me?' for the Blind". In Human Factors in Computing Systems (CHI).

El-Shimy, D., Grond, F., Olmos, A., and Cooperstock, J. (2011). "Eyes-Free Environment Awareness for Navigation". Springer Journal on Multimodal User Interfaces, Special Issue on Interactive Sonification.