
An Assistive EyeWear Prototype That Interactively Converts 3D Object Locations into Spatial Audio

Proceedings of the 2014 ACM International Symposium on Wearable Computers (ISWC '14), pages 119--126. ACM, New York, NY, USA, 2014.
DOI: 10.1145/2634317.2634318

Abstract

We present an end-to-end prototype for an assistive EyeWear system aimed at vision-impaired users. The system uses computer vision to detect objects on planar surfaces and sonifies their 3D locations using spatial audio. A key novelty of the system is that it operates in real time (15 Hz), allowing the user to interactively affect the audio feedback by actively moving a head-worn sensor. A quantitative user study was conducted on 12 blindfolded subjects performing an object localisation and placement task using our system. This detailed study of near-field interactive spatial audio for users operating at around arm's length departs from existing studies focused on far-field audio and non-interactive systems. The object localisation accuracy achieved by naive users suggests that the EyeWear prototype has strong potential as a real-world assistive device. User feedback collected from exit surveys and mathematical modelling of user errors provide several promising avenues to further improve system performance.
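The paper does not publish source code, so as a rough illustration of the sonification step the abstract describes, here is a minimal Python sketch that maps a head-relative 3D object location to simple spatial audio cues. All names and parameters (the coordinate convention, the constant-power panning, the elevation-to-pitch mapping) are assumptions for illustration, not the authors' actual rendering pipeline, which could instead use full HRTF-based spatialisation.

```python
import numpy as np

def location_to_audio_cues(p_head):
    """Map a head-relative 3D object location (metres) to simple audio cues.

    Assumed convention: x = right, y = up, z = forward of the head-worn sensor.
    Returns azimuth/elevation (radians), left/right gains, and a pitch cue.
    """
    x, y, z = p_head
    dist = max(np.linalg.norm(p_head), 1e-6)  # avoid division by zero
    azimuth = np.arctan2(x, z)                # positive = object to the right
    elevation = np.arcsin(y / dist)           # positive = object above ear level

    # Constant-power stereo panning as a crude stand-in for HRTF rendering.
    pan = np.clip((azimuth / (np.pi / 2) + 1) / 2, 0.0, 1.0)  # [-90°, 90°] -> [0, 1]
    gain_l = np.cos(pan * np.pi / 2)
    gain_r = np.sin(pan * np.pi / 2)

    # Attenuate with distance; encode elevation as a pitch offset.
    amplitude = 1.0 / max(dist, 0.1)
    pitch_hz = 440.0 * 2.0 ** elevation       # higher tone for higher objects

    return azimuth, elevation, gain_l * amplitude, gain_r * amplitude, pitch_hz

# Example: object 30 cm to the right, at ear height, 40 cm ahead of the sensor.
print(location_to_audio_cues(np.array([0.3, 0.0, 0.4])))
```

In an interactive loop like the one described (15 Hz), such a function would be re-evaluated every frame from fresh object detections, so head movement immediately changes the cues the user hears.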
