
A User Interface Framework for Multimodal VR Interactions

M. E. Latoschik. Proceedings of the 7th International Conference on Multimodal Interfaces, ICMI 2005, pages 76-83. ACM, (2005)

Abstract

This article presents a User Interface (UI) framework for multimodal interactions targeted at immersive virtual environments. Its configurable input and gesture processing components provide an advanced behavior graph capable of routing continuous data streams asynchronously. The framework introduces a Knowledge Representation Layer which augments objects of the simulated environment with Semantic Entities as a central object model that bridges and interfaces Virtual Reality (VR) and Artificial Intelligence (AI) representations. Specialized node types use these facilities to implement required processing tasks such as gesture detection, preprocessing of the visual scene for multimodal integration, or translation of movements into multimodally initialized gestural interactions. A modified Augmented Transition Network (ATN) approach accesses the knowledge layer as well as the preprocessing components to integrate linguistic, gestural, and context information in parallel. The overall framework emphasizes extensibility, adaptivity, and reusability, e.g., by utilizing persistent and interchangeable XML-based formats to describe its processing stages.
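To make the abstract's two central ideas more concrete, below is a minimal, illustrative Python sketch: a Semantic Entity that attaches AI-readable attributes to a scene-graph node, and a small ATN-style integrator that consumes a speech stream and a pointing gesture in parallel to resolve a multimodal reference such as "take that red box". This is not the framework's actual API; all class, method, and identifier names here are assumptions made for illustration.

```python
from dataclasses import dataclass, field


@dataclass
class SemanticEntity:
    """Knowledge-layer wrapper around a scene-graph node (illustrative only)."""
    node_id: str                                     # handle into the VR scene graph
    attributes: dict = field(default_factory=dict)   # AI-facing facts, e.g. {"color": "red"}

    def matches(self, constraints: dict) -> bool:
        return all(self.attributes.get(k) == v for k, v in constraints.items())


class ReferenceATN:
    """ATN-like integrator: accumulates linguistic constraints and gesture
    evidence, then resolves the referenced Semantic Entity."""

    def __init__(self, knowledge_layer):
        self.knowledge_layer = knowledge_layer   # list of SemanticEntity
        self.constraints = {}                    # constraints gathered from speech
        self.pointed_at = None                   # candidate node from the gesture channel
        self.state = "START"

    def feed_speech(self, token: str):
        # A toy lexicon standing in for real linguistic analysis.
        if token in ("red", "blue"):
            self.constraints["color"] = token
        elif token in ("box", "sphere"):
            self.constraints["type"] = token
        elif token in ("that", "this"):
            self.state = "AWAIT_DEIXIS"          # deictic word: expect a pointing gesture

    def feed_gesture(self, pointed_node_id: str):
        self.pointed_at = pointed_node_id

    def resolve(self):
        # Combine both channels: speech filters by attributes, gesture pins the candidate.
        candidates = [e for e in self.knowledge_layer if e.matches(self.constraints)]
        if self.pointed_at is not None:
            candidates = [e for e in candidates if e.node_id == self.pointed_at]
        return candidates[0] if len(candidates) == 1 else None


# Usage: "take that red box" uttered while pointing at node "box-2".
scene = [
    SemanticEntity("box-1", {"color": "blue", "type": "box"}),
    SemanticEntity("box-2", {"color": "red", "type": "box"}),
]
atn = ReferenceATN(scene)
for word in ["take", "that", "red", "box"]:
    atn.feed_speech(word)
atn.feed_gesture("box-2")
print(atn.resolve().node_id)  # -> box-2
```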
