Abstract
Disabled people typically use methods of 'sensory translation' to access a Web page via assistive technology. These technologies conventionally render screen content, under the direction of the user, into a form that the user can perceive -- in effect, the interface and content are adapted to suit their sensory requirements -- but simple sensory translation is not enough.
Why is this, and how can things be better? In this talk we touch on accessibility, sensory transcoding, multi-talker systems, auditory perception, and neuroscience in our search for equivalent interactive experiences tailored to the sensory modality of the user.