Pushing the Boundaries of User Experience Test Automation, by Julian Harty


DESCRIPTION

One of my current responsibilities is to find ways to automate as much as practical of the ‘testing’ of the user experience (UX) of complex web-based applications. In my view, full test automation of UX is impractical and probably unwise; however, we can use automation to find potential problems in the UX even of rich, complex applications. I and others are working on ways to use automation to discover various types of these potential problems. Here’s an overview of some of the points I made. The work is ongoing and is likely to have demonstrated significant business benefits by the time of the EuroStar conference.

In my experience, heuristics are useful in helping to identify potential issues, and various people have managed to create test automation that essentially automates those heuristics. All the examples I’ve provided are available as free open-source software (FOSS). I’ve learnt to value open source because it reduces the cost of experimentation and allows us to extend and modify the code, e.g. to add new heuristics, relatively easily (you still need to be able to write code, but the code is freely and immediately available).

Automation is (often) necessary, but not sufficient. Automation and automated tests can be beguiling, and paradoxically increase the chances of missing critical problems if we choose to rely mainly or even solely on the automated tests. Even with state-of-the-art automated tests (the best we can do across the industry), I still believe we need to ask additional questions about the software being tested. Sadly, in my experience, most automated tests are poorly designed and implemented, which increases the likelihood of problems eluding them.

TRANSCRIPT

Page 1

Pushing the boundaries of User Experience Test Automation

@EuroStar 2011

JULIAN HARTY

15th October 2011

Page 2

What we want to achieve

To automatically detect (some) issues that may adversely affect the User Experience

Page 3

So what are the business problems?

• Quality-in-Use:
– UX, cross-browser & accessibility issues on the live site

• Engineering Productivity:
– Time taken from feature-complete to live site

• Quality Engineering:
– Over-reliance on manual testing
– Existing test automation under-performing

Page 4

And some UX Test Automation problems…

• Minimal existing work in industry
– Existing tools tend to test static aspects

• Academic work appears to be unsuitable

• How to automatically measure and test UX at all!

Page 5

UX = User Experience

• Dynamic, based on using the system

• We focus on the Human + Computer Interactions

• Holistic: includes perceptions

• Facets: Usefulness, Findability, Accessibility, Desirability, Usability, Credibility

Page 6

Using Heuristics*

1. When I navigate by tabbing I should return to where I started in a reasonable number of key-presses (see the sketch after this list)

2. Flow should be consistent through web-forms

3. Look-alike & work-alike across web browsers

4. I should not be able to bypass the security of the site from links in the UI

* Fallible, but useful, guide/approach/practice
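Heuristic 1 lends itself to automation with Selenium WebDriver. Below is a minimal sketch, not our production code: the target URL is a placeholder and the limit of 50 key-presses is an assumed threshold for "reasonable".

    import org.openqa.selenium.Keys;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.WebElement;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class TabCycleHeuristic {
        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            try {
                driver.get("http://www.example.com/"); // placeholder page under test
                WebElement start = driver.switchTo().activeElement();
                final int reasonableLimit = 50; // assumption: tune per page
                int presses = 0;
                do {
                    driver.switchTo().activeElement().sendKeys(Keys.TAB);
                    presses++;
                } while (presses < reasonableLimit
                        && !start.equals(driver.switchTo().activeElement()));
                if (presses >= reasonableLimit) {
                    System.out.println("Potential UX problem: tabbing did not return"
                            + " to the starting element within " + reasonableLimit
                            + " key-presses.");
                } else {
                    System.out.println("Tab cycle closed after " + presses + " presses.");
                }
            } finally {
                driver.quit();
            }
        }
    }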

Page 7

Equivalence of input paths

[Diagram: the same flow driven by two input paths]

Mouse (clicks): New, Dropdown, Select, Send = 4 inputs

Keyboard (keystrokes): Tab, Space, Down-arrow, Enter, Down-arrow, Down-arrow, Down-arrow, Down-arrow, Tab, Enter = 10 inputs

From “Designing and Engineering Time” by Steven Seow. A sketch of scripting both paths follows.
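Both input paths can be scripted with WebDriver so their step counts can be compared automatically. A minimal sketch; the element IDs and the exact key sequence are illustrative assumptions based on the diagram above.

    import org.openqa.selenium.By;
    import org.openqa.selenium.Keys;
    import org.openqa.selenium.WebDriver;

    public class InputPathEquivalence {

        // Mouse path: the flow takes 4 direct interactions.
        static void mousePath(WebDriver driver) {
            driver.findElement(By.id("newButton")).click();  // New
            driver.findElement(By.id("dropdown")).click();   // Dropdown
            driver.findElement(By.id("option")).click();     // Select
            driver.findElement(By.id("sendButton")).click(); // Send
        }

        // Keyboard path: the same flow takes 10 key-presses.
        static void keyboardPath(WebDriver driver) {
            Keys[] presses = {
                Keys.TAB, Keys.SPACE,                       // reach and activate New
                Keys.DOWN, Keys.ENTER,                      // open the dropdown
                Keys.DOWN, Keys.DOWN, Keys.DOWN, Keys.DOWN, // move to the item
                Keys.TAB, Keys.ENTER                        // reach and press Send
            };
            for (Keys key : presses) {
                driver.switchTo().activeElement().sendKeys(key);
            }
        }
    }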

Page 8

A real example: Information for Speakers

Mouse: mouseover + click = 2 steps. Keyboard: 13 Tabs + 22 Tabs = 35 steps.

Page 9

Automated exploration and discovery*

• Automating heuristic tests
– Web accessibility testing
– Fighting layout bugs (see the sketch below)
– BiDi checker

• Using crawling tools
– Crawljax

* We’re also working on complementary interactive exploration and discovery
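As one example, fighting-layout-bugs can be pointed at a page in a few lines. This sketch assumes the usage documented on the project page at the time; check the project for the current API and package names.

    import java.util.Collection;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;
    import de.michaeltamm.fightinglayoutbugs.FightingLayoutBugs;
    import de.michaeltamm.fightinglayoutbugs.LayoutBug;

    public class LayoutBugsCheck {
        public static void main(String[] args) {
            WebDriver driver = new FirefoxDriver();
            try {
                driver.get("http://www.eurostarconferences.com/");
                FightingLayoutBugs flb = new FightingLayoutBugs();
                Collection<LayoutBug> bugs = flb.findLayoutBugsIn(driver);
                for (LayoutBug bug : bugs) {
                    System.out.println(bug); // each finding includes a screenshot
                }
            } finally {
                driver.quit();
            }
        }
    }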

Page 10

Man & Machine

• Fully automated execution, human inspection

• Interactive testing, aided by tools

Page 11

The moving parts…

[Architecture diagram] Our end-to-end tests and crawljax drive the application through WebDriver DECORATORS [1]. The decorators wrap checks around each interaction: fighting-layout-bugs, web-accessibility-testing, the W3C validator, security checkers, and cross-browser checking, all running against a grid of web browsers. A minimal sketch of the decorator idea follows.

[1] http://en.wikipedia.org/wiki/Decorator_pattern
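As a minimal sketch of the decorator idea (not our actual code), Selenium’s EventFiringWebDriver can run a stand-in check after each navigation; a real decorator would invoke the tools listed above.

    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;
    import org.openqa.selenium.support.events.AbstractWebDriverEventListener;
    import org.openqa.selenium.support.events.EventFiringWebDriver;

    public class DecoratedDriverExample {
        public static void main(String[] args) {
            EventFiringWebDriver driver = new EventFiringWebDriver(new FirefoxDriver());
            driver.register(new AbstractWebDriverEventListener() {
                @Override
                public void afterNavigateTo(String url, WebDriver d) {
                    // Stand-in check: real decorators would run layout,
                    // accessibility, validation and security checks here.
                    String title = d.getTitle();
                    if (title == null || title.trim().isEmpty()) {
                        System.out.println("checkTitleIsNotEmpty failed for: " + url);
                    }
                }
            });
            driver.get("http://www.eurostarconferences.com/");
            driver.quit();
        }
    }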

Page 12

DEMO

Page 13

Sample screenshots

http://adsense.google.com

http://www.google.co.in

http://www.eurostarconferences.com/Registration.aspx?type=delegate&id=4

Page 14

And what about Static Analysis?

• Don’t overlook the value of static analysis (but don’t rely on it either…)

• Find ways to filter the reported results

• We’ve created automated Accessibility Tests, which check the contents of pages using WCAG guidelines.

Page 15

Problems detected:

{"testName":"checkMouseEventHaveKeyboardEvent","description":"Check that elements that have mouse actions also have keyboard actions.",

"severity":"error","elementCode":"<a href=\"#\" onclick=\"window.open('/Content/Terms-and-conditions.aspx', 'termsAndConditions','width=1000,height=600,resizable=yes,toolbars=0,menubar=no,scrollbars=yes,status=no'); return false;\"> Click here to read terms and conditions</a>"}

“checkTitleIsNotEmpty” for the entire page

"testName":"checkAltTextOnImage","description":"Check that visible images have alt text","severity":"error","elementCode":"<img src=\"/themes/Eurostar/images/ico-linkedin.png\" />"

Page 16

Now what?

“Sure, Accessibility is important: file it with the rest of the things we should do...”

What should be done seldom gets done...

Page 17

3 Improvements for the price of 1

• Accessibility + Testability
– Both need to interpret the contents in the browser
– Improving one often improves the other

• Accessibility + SEO*
– Both find value in descriptive labels

[*] SEO: Search Engine Optimization

Example (a product photo of jeans with three candidate alt texts): alt="", alt="picture", and alt="Mens Black Levi 501 Jeans 32" Waist / 32" Leg Excellent Condition". Only the descriptive label helps screen-reader users and search engines.

Page 18

Happiness

• Improved Accessibility: Users are happier

• Improved Testability: We’re happier

• Improved SEO: Business is happier

Page 19

Beware of (over-)trusting test automation

• Automation as servant, not master, of our software delivery
– Inaccurate results
– “Beware of Automation Bias” by M.L. Cummings [1]

• Automated tests and checks miss more than they find

• Make behavior easier to assess (see the sketch below)
– Screenshots and contents of the DOM to verify after tests ran
– Automated video recordings

[1] http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.2634&rep=rep1&type=pdf
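Capturing that evidence is cheap. A minimal sketch of saving a screenshot and the DOM after a test, for later human assessment (the file naming is an arbitrary choice):

    import java.io.File;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;
    import org.openqa.selenium.OutputType;
    import org.openqa.selenium.TakesScreenshot;
    import org.openqa.selenium.WebDriver;

    public class EvidenceCapture {
        // Save what the test saw so a human can assess it afterwards.
        static void captureEvidence(WebDriver driver, String name) throws Exception {
            File shot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
            Files.copy(shot.toPath(), Paths.get(name + ".png"),
                    StandardCopyOption.REPLACE_EXISTING);
            Files.write(Paths.get(name + ".html"),
                    driver.getPageSource().getBytes("UTF-8"));
        }
    }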

Page 20

What we think we’re testing…

Page 21

What our automated test actually interacts with…

[Illustration: the bare elements the test interacts with: an ‘Enter text here’ field and a ‘click’ target]

Page 22

(Some of the) things our automated tests may miss…

• Promotions

• Navigation

• History

• Layout, rendering & formatting

• Problem JavaScript, CSS and HTML

Page 23

Beware of poor-quality automated 'tests'

• AssertTrue(true);

• No/Inadequate/Too many Asserts

• Poor code design

• Falsehoods (False positives / negatives)

• Focus on:
– Improving the quality of your automated tests (contrast the asserts in the sketch below)
– Finding ways to improve the quality, and testability, of the code being tested
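To contrast a vacuous check with a meaningful one (JUnit; the expected title and the stubbed page load are made-up examples):

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;
    import org.junit.Test;

    public class AssertQuality {

        @Test
        public void vacuous() {
            assertTrue(true); // always passes: checks nothing, inspires false confidence
        }

        @Test
        public void meaningful() {
            String title = loadHomePageTitle();          // imagine a WebDriver call here
            assertEquals("EuroSTAR Conferences", title); // fails when behaviour changes
        }

        private String loadHomePageTitle() {
            return "EuroSTAR Conferences"; // stub so the sketch compiles
        }
    }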

Page 24

Increasing the signal-to-noise of test results

• Suppress unwanted ‘warnings’
– C.f. static analysis tools

• Increase the potency of tests

• Consider dumping ineffective tests

Page 25

Further reading and research

The open-source project: http://code.google.com/p/web-accessibility-testing

Finding Usability Bugs with Automated Tests: http://queue.acm.org/detail.cfm?id=1925091

Fighting Layout Bugs: http://code.google.com/p/fighting-layout-bugs/

Experiences Using Static Analysis to Find Bugs: http://www.google.com/research/pubs/pub34339.html

My blog: http://blog.bettersoftwaretesting.com/

“Beware of Automation Bias” by M.L. Cummings: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.91.2634&rep=rep1&type=pdf

“Designing and Engineering Time” by Steven Seow, ISBN 978-0-321-50918-5

Page 26

Questions now?

Questions later…

[email protected]

[email protected]

Acknowledgements: Thanks to Bart Knaak and Jacob Hooghiemstra at StarEast 2011, who helped me improve the material with their comments and ideas; and to Damien Allison and Jonas Klink, my colleagues on the open-source project.