Abstract

Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze in an effective flow of communication. Recent initiatives such as perceptual and attentive user interfaces put these natural human behaviors at the center of human-computer interaction (HCI). We've designed a generic modeling framework for specifying multimodal HCI using the Object Management Group's Unified Modeling Language. Because UML is a well-known and widely supported standard (computer science departments typically cover it in undergraduate courses, and many books, training courses, and tools support it), it makes it easier for software engineers unfamiliar with multimodal research to apply HCI knowledge, broadening its practical impact. Standardization also provides a significant driving force for further progress because it codifies best practices, enables and encourages reuse, and facilitates interworking between complementary tools.
