Real-time Perception-level Translation from Audio Signals to Vibrotactile Effects

Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13), pages 2567–2576. New York, NY, USA: ACM, 2013.
DOI: 10.1145/2470654.2481354

Abstract

In this paper, we propose a real-time perception-level audio-to-vibrotactile translation algorithm. Unlike previous signal-level conversion methods, our algorithm considers only perceptual characteristics, such as loudness and roughness, of audio and tactile stimuli. This perception-level approach allows for designing intuitive and explicit conversion models with a clear understanding of their perceptual consequences. Our current implementation is tailored to the accurate detection of special sound effects, providing well-synchronized audio-tactile feedback in immersive applications. We also assessed the performance of our translation algorithm in terms of the detection rate of special sound effects, computational performance, and user preference. All experimental results confirmed that our algorithm works as intended, with better performance than signal-level conversion methods, especially for games. Our algorithm can be easily realized in current products, including mobile devices, gaming devices, and 4D home theater systems, for a richer user experience.
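To illustrate the perception-level idea described above, the sketch below maps an estimated loudness of each audio frame to a normalized vibration amplitude. This is a minimal, hypothetical illustration, not the paper's algorithm: it substitutes a crude RMS-based loudness proxy for a true perceptual loudness model, ignores roughness entirely, and the dB floor/ceiling values are arbitrary assumptions.

```python
import numpy as np

def frame_loudness(frame, eps=1e-12):
    # Crude loudness proxy: RMS energy in dB. (The paper uses perceptual
    # loudness/roughness models; this RMS estimate is a simplification.)
    rms = np.sqrt(np.mean(frame ** 2))
    return 20.0 * np.log10(rms + eps)

def loudness_to_amplitude(l_db, floor_db=-60.0, ceil_db=0.0):
    # Map estimated loudness to a normalized vibration amplitude in [0, 1],
    # clipping below a floor so quiet audio produces no vibration.
    # floor_db/ceil_db are illustrative values, not from the paper.
    x = (l_db - floor_db) / (ceil_db - floor_db)
    return float(np.clip(x, 0.0, 1.0))

# Example: a loud 200 Hz burst vs. near-silence, 10 ms frames at 44.1 kHz.
sr = 44100
t = np.arange(int(0.01 * sr)) / sr
loud = 0.8 * np.sin(2 * np.pi * 200 * t)
quiet = 0.001 * np.sin(2 * np.pi * 200 * t)
print(loudness_to_amplitude(frame_loudness(loud)))   # strong vibration
print(loudness_to_amplitude(frame_loudness(quiet)))  # no vibration
```

In a real-time pipeline, such a mapping would run per audio frame and drive the vibrotactile actuator's amplitude, which is what makes a perception-level model easy to reason about: each perceptual input dimension maps explicitly to a perceptual output dimension.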
