Using qualitative eye-tracking data to inform audio
presentation of dynamic Web content.
A. Brown, C. Jay, and S. Harper. New Review of Hypermedia and Multimedia, 16(3):
281--301 (2010)
Abstract
Presenting Web content through screen readers can be a challenging task, but this is the only means of access for many blind and visually impaired users. The difficulties are more acute when the information forms part of an interactive process, such as the increasingly common "Web 2.0 applications". If the process is to be completed correctly and efficiently, it is vital that appropriate information is given to the user at an appropriate time. Designing a non-visual interface that achieves these aims is a non-trivial task, for which several approaches are possible. The one taken here is to use eye-tracking to understand how sighted users interact with the content and to gain insight into how they benefit from the information, then to apply this understanding to the design of a non-visual user interface. This paper describes how this technique was applied to develop audio interfaces for two common types of interaction: auto-suggest lists and pop-up calendars. Although the resulting interfaces were quite different, one largely mirroring the visual representation and the other not, evaluations showed that the approach was effective, with both audio implementations proving usable and popular with participants.
%0 Journal Article
%1 nrhm2010
%A Brown, Andrew
%A Jay, Caroline
%A Harper, Simon
%D 2010
%J New Review of Hypermedia and Multimedia
%K Accessibility, eye-tracking, visually impaired
%N 3
%P 281--301
%T Using qualitative eye-tracking data to inform audio presentation of dynamic Web content
%U http://www.informaworld.com/10.1080/13614568.2010.542253
%V 16
%X Presenting Web content through screen readers can be a challenging task, but this is the only means of access for many blind and visually impaired users. The difficulties are more acute when the information forms part of an interactive process, such as the increasingly common "Web 2.0 applications". If the process is to be completed correctly and efficiently, it is vital that appropriate information is given to the user at an appropriate time. Designing a non-visual interface that achieves these aims is a non-trivial task, for which several approaches are possible. The one taken here is to use eye-tracking to understand how sighted users interact with the content and to gain insight into how they benefit from the information, then to apply this understanding to the design of a non-visual user interface. This paper describes how this technique was applied to develop audio interfaces for two common types of interaction: auto-suggest lists and pop-up calendars. Although the resulting interfaces were quite different, one largely mirroring the visual representation and the other not, evaluations showed that the approach was effective, with both audio implementations proving usable and popular with participants.
@article{nrhm2010,
abstract = {Presenting Web content through screen readers can be a challenging task, but this is the only means of access for many blind and visually impaired users. The difficulties are more acute when the information forms part of an interactive process, such as the increasingly common "Web 2.0 applications". If the process is to be completed correctly and efficiently, it is vital that appropriate information is given to the user at an appropriate time. Designing a non-visual interface that achieves these aims is a non-trivial task, for which several approaches are possible. The one taken here is to use eye-tracking to understand how sighted users interact with the content and to gain insight into how they benefit from the information, then to apply this understanding to the design of a non-visual user interface. This paper describes how this technique was applied to develop audio interfaces for two common types of interaction: auto-suggest lists and pop-up calendars. Although the resulting interfaces were quite different, one largely mirroring the visual representation and the other not, evaluations showed that the approach was effective, with both audio implementations proving usable and popular with participants.},
author = {Brown, Andrew and Jay, Caroline and Harper, Simon},
journal = {New Review of Hypermedia and Multimedia},
keywords = {Accessibility, eye-tracking, visually impaired},
number = 3,
pages = {281--301},
title = {Using qualitative eye-tracking data to inform audio presentation of dynamic Web content},
url = {http://www.informaworld.com/10.1080/13614568.2010.542253},
volume = 16,
year = 2010
}