Are Users the Gold Standard for Accessibility Evaluation?
DESCRIPTION
User testing is considered a key part of web accessibility evaluation. However, little is known about how effective it is for identifying accessibility problems. Our experience, informed by a series of studies with blind users, corroborates that a website with a significant number of guideline violations can be perceived as accessible and, conversely, that some participants may not perceive a highly accessible website as accessible. Accessibility guidelines are often criticised for their partial coverage and questionable validity. However, we should be very careful about making categorical statements in this regard, as a number of variables may introduce biases into user tests. We identify sources of bias related to user expertise, the experimental setting, the language employed and reporting that, if not adequately controlled, may influence the validity and reliability of the evaluation results. We discuss the limitations and practical implications of user testing with blind users for web accessibility evaluation.
TRANSCRIPT
Are Users the Gold Standard
for Accessibility Evaluation?
7th April - W4A’14
Amaia Aizpurua1, Myriam Arrue, Simon Harper2, Markel Vigo3
U. of the Basque Country | University of Manchester
http://dx.doi.org/10.1145/2596695.2596705
1: @amaiaaizpurua 2: @sharpic 3: @markelvigo
Motivation
• User testing for accessibility evaluation
• Encouraged by the community
• No established procedures
– What?
– When?
– How?
In practice
• Evaluating web accessibility for blind users
– User testing method
• Effectiveness in identifying accessibility problems [Mankoff et al. 2005]
– Accessibility guidelines
• Partial coverage of user problems [Power et al. 2012]
Study results
• Accessibility: conformance vs. perceived

Website | Satisfied SC | Non-satisfied SC | Median | Mode | SD
A       | 73%          | 27%              | 6      | 7    | 1.95
B       | 69%          | 31%              | 6      | 6    | 1.42
C       | 52%          | 48%              | 2      | 1    | 1.62
D       | 36%          | 64%              | 6      | 6    | 1.95

Left columns: AA conformance (success criteria satisfied vs. non-satisfied). Right columns: perceived accessibility, rated on a 7-point scale (1: very inaccessible, 7: very accessible).
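The perceived-accessibility figures above are standard descriptive statistics over 7-point ratings. As a minimal sketch, they can be computed with Python's statistics module; the ratings below are illustrative, not the study's actual data:

```python
from statistics import median, mode, stdev

# Illustrative 7-point ratings for one website
# (1 = very inaccessible, 7 = very accessible);
# these are NOT the study's data, which the slides do not include.
ratings = [7, 6, 7, 4, 6, 2, 7, 5]

print("median:", median(ratings))        # central tendency -> 6.0
print("mode:", mode(ratings))            # most frequent rating -> 7
print("SD:", round(stdev(ratings), 2))   # sample standard deviation -> 1.77
```

A high median with a large SD, as in websites A and D above, signals that participants disagreed considerably about how accessible the site was.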
Why this mismatch?
• Guidelines vs. user problems
• Eliciting users’ problems
– Experience/report problems differently: web expertise
– Interaction context, task type
• Identifying accessibility issues
– Combination of causes, contextual information
– Under the influence of the evaluator effect
Discussion
• Inclusive participatory evaluation
– Following the considerations of co-operative evaluation
– Evaluator and user evaluating a website collaboratively
– Instead of interpreting data afterwards, work on problems during the session
• Improve effectiveness
Conclusions
• User testing may not be the gold standard for web accessibility evaluation
• Users are a gold mine
• Co-operative accessibility evaluation
• Bridge the gap?
Thank you!
Contact
[email protected] | @amaiaaizpurua
[email protected] | @sharpic
[email protected] | @markelvigo