Abstract
This dissertation investigates the feasibility and applicability of repurposing consumer-grade XR controllers as controller styluses and evaluates their impact on the performance and user experience of virtual tap and swipe keyboards and handwriting text input in XR environments. Text input is a core feature of many XR applications, enabling tasks such as documentation, note-taking, chatting, and web browsing. However, XR, encompassing VR, AR, and MR, presents distinct challenges that limit traditional text input methods such as physical keyboards or handwriting with pen and paper. As an alternative, prior research has explored virtual keyboards and handwriting text input in VR and OST AR, utilizing XR controllers held in the conventional power grip or hand tracking. Yet, fundamental research gaps remain: the feasibility and applicability of repurposing consumer-grade XR controllers as controller styluses by holding them in a pen-like posture, such as the precision grip; the integration of diverse XR devices and input modalities; the comparison of the performance and user experience of text input methods in VR and VST AR; and the impact of mid-air versus physically aligned virtual surfaces. To address these gaps, this dissertation introduces OTSS, a modular and extensible framework for repurposing consumer-grade XR controllers as controller styluses equipped with self-made or 3D-printed stylus accessories. OTSS also incorporates virtual-to-physical alignment and refinement techniques to align virtual surfaces with physical counterparts or to place them freely in mid-air. Additionally, this dissertation presents the RSIO framework, an intermediate layer designed to simplify and unify cross-device and cross-platform XR application development. A series of user studies and technical evaluations demonstrates the applicability and versatility of the OTSS and RSIO frameworks.
Building on these frameworks, two user studies with a total of 136 participants provide detailed insights into the performance and user experience of virtual tap and swipe keyboards and handwriting text input in VR and VST AR. The findings underscore the potential of controller styluses for precise touch-based interaction on mid-air and physically aligned virtual surfaces, particularly when equipped with pressure-sensitive stylus tips for physical contact detection. Moreover, the results indicate that visual incongruities are a distinct challenge in VST AR and suggest that, while physical surfaces are desirable for text input in XR, they are not indispensable in mobile XR scenarios. Publicly available reference implementations are provided to establish a foundation for future research and for the development of XR text input methods in professional, educational, and personal environments.