Abstract
In this paper, we adapt an existing VR framework for handwriting and sketching on physically aligned virtual surfaces to AR environments using the Microsoft HoloLens 2. We demonstrate a multimodal input metaphor that controls the framework’s calibration features using hand tracking and voice commands. Our technical evaluation of fingertip/surface accuracy and precision on physical tables and walls is in line with existing measurements on comparable hardware, albeit considerably lower than in previous work using controller-based VR devices. We discuss design considerations and the benefits of our unified input metaphor, which is suitable for both controller-tracking and hand-tracking systems. We encourage extensions and replication by providing a publicly available reference implementation (https://go.uniwue.de/hci-otss-hololens).