Multimodal Language Processing for Mobile Information Access

Proceedings of ICSLP - Interspeech 2002: 7th International Conference on Spoken Language Processing, Denver, CO, USA, pages 2237-2240. (2002)

Abstract

Interfaces for mobile information access need to allow users flexibility in their choice of modes and interaction style in accordance with their preferences, the task at hand, and their physical and social environment. This paper describes the approach to multimodal language processing in MATCH (Multimodal Access To City Help), a mobile multimodal speech-pen interface to restaurant and subway information for New York City. Finite-state methods for multimodal integration and understanding enable users to interact using pen, speech, or dynamic combinations of the two, and a speech-act based multimodal dialogue manager enables mixed-initiative multimodal dialogue.
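To make the finite-state idea in the abstract concrete, the sketch below walks a toy machine whose arcs each pair a speech word, a gesture symbol, and a meaning symbol, consuming the speech and gesture streams in parallel and emitting a combined meaning. It is a minimal illustration only: the grammar, state numbers, and symbols (EPS, "Gr", the phone-for-this-restaurant command) are invented here, and the greedy deterministic walk stands in for the weighted finite-state transducer composition described in the paper.

# Toy sketch of finite-state multimodal integration: arcs carry a
# (speech, gesture, meaning) triple; "eps" marks an empty slot on a tape.
# Grammar and symbols are illustrative inventions, not the MATCH grammar.

EPS = "eps"

# Transitions: state -> list of (speech, gesture, meaning, next_state).
# Accepts e.g. speech "phone for this restaurant" combined with a pen
# gesture selecting a restaurant entity, and emits a meaning string.
TRANSITIONS = {
    0: [("phone", EPS, "<cmd:phone>", 1)],
    1: [("for", EPS, EPS, 2)],
    2: [("this", EPS, EPS, 3)],
    3: [("restaurant", "Gr", "<obj:restaurant(Gr)>", 4)],
}
FINAL_STATES = {4}


def integrate(speech_tokens, gesture_tokens):
    """Walk the machine, consuming speech and gesture streams in parallel
    and collecting the meaning symbols emitted along the way."""
    state, si, gi, meaning = 0, 0, 0, []
    while state not in FINAL_STATES:
        for (w, g, m, nxt) in TRANSITIONS.get(state, []):
            w_ok = w == EPS or (si < len(speech_tokens) and speech_tokens[si] == w)
            g_ok = g == EPS or (gi < len(gesture_tokens) and gesture_tokens[gi] == g)
            if w_ok and g_ok:
                si += 0 if w == EPS else 1
                gi += 0 if g == EPS else 1
                if m != EPS:
                    meaning.append(m)
                state = nxt
                break
        else:
            return None  # no arc matched: the inputs do not integrate
    # require both input streams to be fully consumed
    if si == len(speech_tokens) and gi == len(gesture_tokens):
        return " ".join(meaning)
    return None


if __name__ == "__main__":
    # Speech "phone for this restaurant" plus a pen gesture on a restaurant.
    print(integrate(["phone", "for", "this", "restaurant"], ["Gr"]))
    # -> "<cmd:phone> <obj:restaurant(Gr)>"

Because speech, gesture, and meaning live on parallel tapes of one machine, the same grammar also licenses unimodal input (speech-only or pen-only paths) simply by leaving the other tape's symbols empty, which is what gives users the flexibility of mode choice the abstract emphasizes.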

Tags

community