Mobile Test Automation with Big Data Analytics
DESCRIPTION
Organizations with a mobile presence today face the major challenge of building robust automated tests around their mobile applications. However, organizations often have limited testing resources for these increasingly complex projects, and stakeholders worry about the quality of the product. So how do you plan a mobile test automation project, recognizing the failure rate of such efforts? Discover how Tarun Bhatia used big data analytics to understand where customers spend most of their time in their apps out in the wild. See how his team analyzed massive amounts of mobile usage data to create an operational model of carriers, devices, networks, countries, and OS versions. They then developed automation strategies that resulted in better tests, created with the right priorities. Learn how you can apply mobile automation capabilities in the areas of continuous integration, performance, benchmark, compatibility, and stress testing based on analytics data.
TRANSCRIPT
T22 Concurrent Class
10/3/2013 3:00:00 PM
"Mobile Test Automation with
Big Data Analytics"
Presented by:
Tarun Bhatia
Microsoft
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com
Tarun Bhatia
Microsoft
Tarun Bhatia is a Program Manager in charge of driving best-of-breed performance
measurements and analysis for the Microsoft Online Office Division. In this role, Tarun leads
innovative strategies—analytics, performance, benchmark, compatibility—and guides the team
to create an effective, reliable, and robust monitoring architecture. With more than six years of
software development experience in quality and service assurance, Tarun shows that taking
initiative and thinking outside the box can deliver big results—both personally and for the
company.
9/19/2013
1
Mobile Test Automation
Using Big Data Analytics
Tarun Bhatia
Microsoft Corp.
Introduction
quality as·sur·ance: A program for the systematic monitoring and evaluation of the various aspects of a project, service, or facility to ensure that standards of quality are being met.
Source: http://www.merriam-webster.com/dictionary/quality%20assurance
“If you think you can, or if you think you can’t, you are correct.” – Henry Ford
Question
How It Starts!
Stage 1
• Company needs a mobile presence
• They hire mobile devs and testers (usually manual)
Stage 2
• The app becomes too complex to cover all the permutations via manual testing
• Company hires automation engineers (SDETs), who are told to “automate everything”!
Stage 3
• Full-on effort to catch up and automate all features
• Extreme SDET burnout!
Creating a Plan
Successful automation plan:
• Device lab
• Automation framework
• Prioritized feature test cases
• Stress / performance / other additional testing
Create a Device Lab
Create a Device Lab
• Total # of devices
• Devices with the most reported bugs
• Your most popular devices
• Time-box and add the bug to your backlog
• Buy/loan/rent the device and bring it in-house
Create a Device Lab
[Chart: device breakdown — Apple, LG MS770, Samsung Galaxy SIII, Microsoft, Coolpad Quatro 4G, ZTE N9210, Samsung Galaxy Admire 4G, Droid RAZR, Samsung Galaxy Note II, LG Esteem, LG MS870, Samsung Admire, Samsung Epic 4G, Samsung Galaxy SII, Samsung Omnia II, Other]
Total # of devices > 1,850!
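A device breakdown like this can drive which devices to buy, loan, or rent for the lab. A minimal sketch, assuming hypothetical usage counts and an 80% coverage target (the numbers and threshold are illustrative, not from the talk):

```python
# Hypothetical usage counts per device model (illustrative numbers only).
usage_counts = {
    "Samsung Galaxy SIII": 412_000,
    "Samsung Galaxy Note II": 280_000,
    "LG MS770": 150_000,
    "Droid RAZR": 95_000,
    "ZTE N9210": 12_000,
}

def pick_lab_devices(counts, coverage_target=0.80):
    """Return the smallest set of top devices that covers `coverage_target`
    of total observed usage; everything else gets time-boxed testing."""
    total = sum(counts.values())
    chosen, covered = [], 0
    for device, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
        if covered / total >= coverage_target:
            break
        chosen.append(device)
        covered += n
    return chosen

lab = pick_lab_devices(usage_counts)
# The long tail (here, the ZTE N9210) stays out of the lab and is
# handled by time-boxed sessions on borrowed hardware.
```

The coverage-target approach keeps the lab small while still matching what most real users carry.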
Pick an Automation Test Framework
Prioritize
• KPIs
• Usage data
• Revenue stream data
• Marketing data
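These signals can be folded into a single priority score per feature to decide what gets automated first. A minimal sketch, where the features, signal values, and weights are all hypothetical:

```python
# Hypothetical per-feature signals, normalized to 0..1 (illustrative only).
features = {
    "checkout": {"usage": 0.90, "revenue": 0.95, "marketing": 0.40},
    "search":   {"usage": 0.80, "revenue": 0.30, "marketing": 0.20},
    "settings": {"usage": 0.10, "revenue": 0.05, "marketing": 0.05},
}

# Assumed weighting: usage matters most, then revenue, then marketing.
WEIGHTS = {"usage": 0.5, "revenue": 0.35, "marketing": 0.15}

def priority(signals):
    """Weighted sum of the normalized signals for one feature."""
    return sum(WEIGHTS[k] * v for k, v in signals.items())

ranked = sorted(features, key=lambda f: priority(features[f]), reverse=True)
```

The output is an ordered automation backlog: write tests for the top-ranked features first.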
Real User Usage Pattern
[Chart: share of user time by screen — Home Screen, Detail Screen 1, Detail Screen 2, Detail Screen 3, All Other Values]
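One way to act on a usage pattern like this is to allocate automated test runs in proportion to where users actually spend their time. A minimal sketch, assuming hypothetical usage shares and a small floor so rarely used screens are never skipped entirely:

```python
# Hypothetical share of user time per screen, from production analytics.
screen_usage = {
    "Home Screen": 0.55,
    "Detail Screen 1": 0.20,
    "Detail Screen 2": 0.12,
    "Detail Screen 3": 0.08,
    "All Other": 0.05,
}

def allocate_test_runs(usage, total_runs=100, floor=2):
    """Allocate test runs proportionally to observed usage, with a
    floor so low-traffic screens still get some coverage."""
    return {screen: max(floor, round(share * total_runs))
            for screen, share in usage.items()}

plan = allocate_test_runs(screen_usage)
```

The heavily used home screen gets the bulk of the automation effort, which mirrors the talk's point about prioritizing by real user behavior.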
Tests
[Diagram: real user data and production data feed the test suite — stress, server vs. UI data, new features, performance; the tests run against the system under test, and the test results flow into a quality assessment]
Stress Testing
• Find resource leaks
• Find the app’s capacity and capabilities
• Find memory and battery consumption trends
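A crude heuristic for spotting a memory-consumption trend during a stress run is to sample memory after each iteration and flag steady growth. A minimal sketch; the threshold and sample values are assumptions for illustration, not from the talk:

```python
def leak_suspected(samples_mb, min_growth_mb=5.0):
    """Flag a possible resource leak when memory grows by more than
    `min_growth_mb` over the run and never dips between samples."""
    if len(samples_mb) < 3:
        return False  # too few samples to call it a trend
    growth = samples_mb[-1] - samples_mb[0]
    steadily_rising = all(b >= a for a, b in zip(samples_mb, samples_mb[1:]))
    return growth > min_growth_mb and steadily_rising

# Memory (MB) sampled after each of six stress iterations (illustrative).
trend = leak_suspected([102, 108, 115, 121, 130, 141])
```

On Android the samples could come from `adb shell dumpsys meminfo <package>` between iterations; the analysis itself stays platform-neutral.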
Server vs. UI Testing
[Diagram: server ↔ client test framework]
• Verify data is in sync during testing
• Ensure no data loss during test progress
• Detect UI TTL (time to load) on devices under various conditions
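The server-vs-UI check amounts to comparing what the backend returns with what the UI displays, while timing the round trip. Everything below is a hypothetical stand-in; a real harness would call the actual API and read the screen through a UI driver such as Appium:

```python
import time

# Hypothetical stand-ins for the real data sources (assumptions, not
# the talk's actual framework).
def fetch_server_items():
    """Would call the backend API in a real harness."""
    return ["item-1", "item-2", "item-3"]

def read_ui_items():
    """Would scrape the rendered list via a UI driver in a real harness."""
    return ["item-1", "item-2", "item-3"]

def verify_sync_and_ttl(max_ttl_s=2.0):
    """Check server/UI consistency and that the UI loaded within budget."""
    start = time.monotonic()
    server = fetch_server_items()
    ui = read_ui_items()
    ttl = time.monotonic() - start
    assert ui == server, "UI out of sync with server data (possible data loss)"
    return ttl <= max_ttl_s

ok = verify_sync_and_ttl()
```

Running the same check under throttled networks or low-end devices surfaces the "various conditions" TTL variance the slide mentions.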
Performance Testing (Analyze and Record KPIs)
Effective Testing
• Write once, test anywhere
• Active monitoring
• Test re-use
• Performance
• Availability
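"Write once, test anywhere" amounts to parameterizing a single test body over a device matrix. A minimal sketch with hypothetical device entries; a real suite would hand each entry to a cross-platform driver rather than a stub:

```python
# Hypothetical device matrix (entries are illustrative).
DEVICE_MATRIX = [
    {"name": "Samsung Galaxy SIII", "os": "Android 4.1"},
    {"name": "iPhone 5", "os": "iOS 7"},
    {"name": "Lumia 920", "os": "Windows Phone 8"},
]

def login_test(device):
    """One test body; the device entry only affects setup, never the
    test logic, so the test is written once and reused everywhere."""
    # A real implementation would launch the app on `device` here.
    return {"device": device["name"], "passed": True}

results = [login_test(d) for d in DEVICE_MATRIX]
```

The same pattern supports test re-use for active monitoring: the identical parameterized test runs on a schedule against production to track performance and availability.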
Conclusion