Abstract
In this paper we demonstrate how foot gestures can be used to perform navigation tasks in interactive 3D environments and how a World-In-Miniature view can be manipulated through multi-touch gestures, simplifying the way-finding task in such complex environments. Geographic Information Systems (GIS) are well suited as a complex test-bed for evaluating user interfaces based on multi-modal input. Recent developments in the area of interactive surfaces enable the construction of low-cost multi-touch sensors, and relatively inexpensive technology for detecting foot gestures makes it possible to explore these input modalities in virtual reality environments. We describe an intuitive 3D user interface setup that combines multi-touch hand gestures and foot gestures for interaction with spatial data.