mobile application qa & testing · 2019-01-21 · “procuring, managing and maintaining...
TRANSCRIPT
Mobile Application QA & Testing
Agenda
• Who is this guy?
• Mobile Disruption & Adoption
• Unique challenges of mobile testing
• Mobile testing best practices
• Cognitive test design & script-less automation in the mobile space
• Continuous testing and Sentiment Analysis
About me
Mobile is here and making a difference
2011: the year mobile platforms surpassed traditional platforms (IDC), four years earlier than predicted in 2009
10 Billion devices by 2020
24 Million Smartphones in UK - up from 30% to 52% of population in 1 year
Smartphone users: 48% under 30 yrs, 39% aged 30-49 yrs
8x Faster adoption than the internet and if tablets are included, even faster Morgan Stanley
31% of smartphone users have transacted online; 56% of these users do so every month
Mobile is creating new opportunities
Mobile is about transacting
60% of e-commerce traffic and 44% of sales are from mobile devices as of 2016
Mobile enables the Internet of Things
91% of mobile users keep their device within arm’s reach 100% of the time
5 Trends with significant implications for the enterprise
Mobile is primary
90% of users use multiple screens as channels come together to create integrated experiences
Mobile must create a continuous brand experience
Global Machine-to-machine connections will increase from 2 billion in 2011 to 18 billion at the end of 2022
Insights from mobile data provide new opportunities: 75% of mobile shoppers take action after receiving a location-based message
These opportunities translate to client preference for high value applications such as finance & insurance
Application Engagement (Flurry, October 2012, http://blog.flurry.com/ ) The state of mobile technology adoption (Forrester, October 2012) Digital Data Gems (H2 2012 Europe), (comScore MMX, June 2012)
Extension of the current offerings
Customers
Consumerization of IT
[Diagram: Line of Business, Development Team and Operations Team across the Analysis, Build and Deploy phases: Analyze, Design, Implement, Test, Manage, Run, Monitor, Optimize]
But complexities of these applications create their own unique pain points…
• Fragmentation of mobile devices and platforms: ensure quality on a combinatorial number of configurations of devices, platforms, carriers, etc.; may need access to a large library of mobile devices for testing
• Native programming models are not portable across devices
• More direct involvement from users/stakeholders (Marketing, Customer Service, Sales); LoB demands mobile apps as a way to drive brand value
• Higher expectations of user experience
• Governance and best practices: lack of best-practice guidance on how to deliver mobile applications
• The mobile landscape evolves at a much faster pace, with more frequent releases and updates under urgent time-to-market demands
• Connecting to enterprise back-end services in a secure and scalable manner
And makes end-to-end testing longer
Mobile Complexity is “Below the Glass”
Technology is complex and fast-changing, with few standards: significantly more complex than the internet when it was new. Multiple platforms, handsets & tools create testing and maintenance issues.
Pressure to launch means many tactical decisions are taken, often leaving technical issues undiscovered or unresolved and creating potential constraints for the future.
Key to success for mobile solution is to clearly understand business requirements and align them with the IT components and architecture
Perceived app quality is influenced as much by design quality as it is by functional quality
• User experience is critical for mobile applications
• Mobile applications typically require you to rethink how your customers interact with your business
• Line of business expects these applications to improve customer satisfaction and drive engagement and loyalty
• Planning tests against all combinations of devices, OSes and carriers in a fragmented market yields an exponential number of test cases
• Testing is complicated by unconventional ways of interacting with mobile devices (camera, accelerometer, gestures, speech)
• Maintaining large library of devices in-house is cost prohibitive
Goal: deliver apps that align with business goals and are perceived as high quality – both from a user experience and functional point of view
Many challenges with Mobile Testing are unique …
“There are potentially thousands of combinations of devices and operating systems that access our mobile app or website, we can’t test them all”
“The mobile market changes quickly, how do we keep track of what is going on and update our test scope accordingly?”
“Procuring, managing and maintaining devices for testing can be expensive and burdensome”
“How do I validate location-based services and network connections within test without sending my test team out of the office?”
“Is there an approach to implementing automated testing for mobile? What are the risks?”
“What device-specific behavior should be validated during test for native and hybrid apps?”
“The visual interface and app performance are important for user experience, but how do we test this with so many combinations of devices?”
Establishment
• Initiation, transition planning, preparation and governance
• Operational model planning and knowledge transfer planning
• Tool and infrastructure planning
• Establish work intake
• Deliverables: operational model plan, transition plan, knowledge transfer plan, process review, enterprise test strategy, estimating process

Implementation (Execution)
• Implementation of testing processes and methods
• Tool and infrastructure setup, with setup verification (functionality, security, connectivity, etc.)
• Start automation; set up standard templates & methods
• Set up delivery centers and the GR centre; review existing processes; onboard resources
• Define automation, metrics & reporting; establish program metrics
• Deliverables: automation strategy, implemented processes & methods, defined program metrics, delivered mobile testing environment, RACI

Steady State
• Standard operational phase with KPIs: measurement period, operation services stabilization, KPI review
• Test planning, design and execution
• Ongoing work prioritization and intake; ongoing governance
• Lessons learned review; implement improvements
• Deliverables: work intake (steady state), plan and execute, measure releases, deployments, lessons learned, implemented improvements
But the fundamentals of software testing still prevail
We can drive innovation into mobile testing using new test case optimization techniques that control coverage and test case proliferation
• Systematic approach to modeling the things that need to be tested
• Reduce the number of test cases while ensuring coverage of conditions & interactions
• Create a model of the application by examining points of variation, including:
• Attributes: points of variability
• Restrictions: rules that determine which combinations of values are included in and excluded from the model
• Interaction levels: the level of combinations of attribute values that should be tested together
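The attributes/restrictions/interaction-level model above can be sketched with a greedy pairwise (interaction level 2) reduction. The attribute values and the restriction below are illustrative examples, not the deck's actual model, and IBM's CTD tooling uses a more sophisticated algorithm than this greedy loop.

```python
from itertools import combinations, product

# Hypothetical test model: attributes (points of variability),
# one restriction, and an interaction level of 2 (pairwise).
attributes = {
    "os":      ["iOS 12", "Android 9", "Android 8"],
    "device":  ["phone", "tablet"],
    "network": ["wifi", "4g", "3g"],
    "locale":  ["en_GB", "fr_FR"],
}

def allowed(combo):
    """Restriction (assumed for illustration): 3G tablets are out of scope."""
    return not (combo["device"] == "tablet" and combo["network"] == "3g")

def pairs_of(combo):
    """All attribute-value pairs present in one full combination."""
    return set(combinations(sorted(combo.items()), 2))

# Every valid full combination: the exhaustive plan we want to shrink.
names = sorted(attributes)
full = [dict(zip(names, values))
        for values in product(*(attributes[n] for n in names))]
full = [c for c in full if allowed(c)]

# Greedy pairwise reduction: repeatedly pick the combination that
# covers the most still-uncovered pairs until every pair is covered.
uncovered = set().union(*(pairs_of(c) for c in full))
plan = []
while uncovered:
    best = max(full, key=lambda c: len(pairs_of(c) & uncovered))
    plan.append(best)
    uncovered -= pairs_of(best)

print(f"{len(full)} exhaustive cases reduced to {len(plan)} pairwise cases")
```

Every pair of attribute values still appears in at least one selected case, which is why pairwise plans shrink test counts sharply while retaining coverage of two-way interactions.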
[Diagram: the IBM CTD Solution takes existing tests, a test model and described test data requirements, and can find gaps and add new test cases to close them, generate a brand new test plan, or remove overlapping test cases. Outputs: fewer test cases, new test cases that close gaps, a brand new test plan, and condensed, dedicated test data.]
Canadian Bank Results – Mobile App
• Total 40% test case reduction
• Improved coverage to 100%
• 22 modules, 5,000 test cases analyzed, reduced to 3,200 test cases
• BPT component rationalization, process & governance improvement
• Addressed key mobile-specific testing requirements (e.g. screen size)
And we can rethink management of devices to avoid uncontrolled proliferation
[Diagram: three device-sourcing models, each connected to the same back-end & host systems through a Worklight server, web services and a web server]
• Test Lab: a test-environment device basket of physical devices with local testers. Involves physical device acquisition & management, device storage, physical infrastructure & site set-up, Perfecto cradle set-up, site management, device management and cradle swaps.
• Device Cloud & Simulators/Emulators: field testing in test and production environments with local or remote testers.
• Crowdtesting: independent devices tested in the wild (production environment) by remote, dispersed testers. Involves device procurement.
Building the right mobile test lab necessitates thinking around test types and phases

Legend: P = Cloud / Simulators, D = Device Basket, A = Crowdtesting; E = existing, N = new; * = requires access to new devices, a new OS, or both

| Test type (OS / Device / App) | E/E/E | E/N/E | N/E/E | N/N/E | E/E/N | E/N/N | N/E/N | N/N/N |
| Functional Testing            |       |       |       |       |       |       |       |       |
| Sanity/Shakedown Testing      | P-D   | D*    | D*    | D*    | D     | D*    | D*    | D*    |
| User Interface Testing        | A-D   | A-D*  | A-D*  | A-D*  | D     | D*    | D*    | D*    |
| Usability Testing             | A-D   | A-D*  | A-D*  | A-D*  | D     | D*    | D*    | D*    |
| Interruption Testing          | A-D   | A-D*  | A-D*  | A-D*  | D     | D*    | D*    | D*    |
| Business Processes Testing    | D     | D*    | D*    | D*    | D     | D*    | D*    | D*    |
| Automation                    | P     | P*    | P*    | P*    | P     | P*    | P*    | P*    |
| Regression Testing            | P     | P*    | P*    | P*    | P     | P*    | P*    | P*    |
| Non-Functional Testing        |       |       |       |       |       |       |       |       |
| Performance Testing           | P     | P*    | P*    | P*    | P     | P*    | P*    | P*    |
| Installation and Deployment   | A-D   | A-D*  | A-D*  | A-D*  | D     | D*    | D*    | D*    |
| Submission Guidelines Testing | A-D   | A-D*  | A-D*  | A-D*  | D     | D*    | D*    | D*    |
| Security Testing              | A-D   | A-D*  | A-D*  | A-D*  | D     | D*    | D*    | D*    |
Key benefits
• Sufficient, effective regression testing of apps within short release windows in agile sprints
• Lowered costs of testing across regular and rapid releases
• Enabled movement towards continuous integration and testing and the dev-ops benefits (scalable application-on-demand capability and consumption models)
• Capability to meet business demands to interact dynamically with clients and the market
Key Performance Indicators (KPIs)

Time Perspective
• Time to validate build: execute time (hrs)
• Time to smoke test: execute time (hrs)
• Time to regression test: execute time (hrs)
• Time to capture cases: days after release

Economy Perspective
• Effort per run: hrs / test case
• Time per case capture: hrs / case
• Ongoing support cost: hrs / year
• Maintainability: hrs / impacted case
• Number of cases run: cumulative # per annum

Coverage & Effectiveness Perspective
• Coverage of cases: % of test cases automated
• Defects not detected: defect leakage
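Two of the coverage-and-effectiveness KPIs above reduce to simple ratios. A minimal sketch, using the Canadian bank figures only as sample inputs (the defect counts are invented for illustration):

```python
def automation_coverage(automated_cases: int, total_cases: int) -> float:
    """Coverage of cases: percentage of test cases automated."""
    return 100.0 * automated_cases / total_cases

def defect_leakage(found_in_prod: int, found_in_test: int) -> float:
    """Defects not detected: share of all defects that escaped to production."""
    total = found_in_prod + found_in_test
    return 100.0 * found_in_prod / total

print(f"coverage: {automation_coverage(3200, 5000):.1f}%")  # 64.0%
print(f"leakage:  {defect_leakage(6, 114):.1f}%")           # 5.0%
```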
In mobile, automation is an essential lever for success, reducing the risk of delays while controlling cost and protecting quality
Continuous testing & devops enablement for rapid throughput
• Selective suites of automated scripts to deal with changes
• Foundational automation smoke tests
• Automated regression suite for end-to-end testing
• Script once, run on as many devices as possible (compatibility)
• Data-driven testing by mechanisation (acceleration)
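The "script once, run on many devices" and data-driven points above can be sketched as one test script crossed with a device list and a data table. The device names and `login()` stub are hypothetical stand-ins for a real device-cloud driver such as Appium:

```python
# Hypothetical device pool and login data table (illustrative only).
DEVICES = ["iPhone 6s / iOS 12", "Pixel 2 / Android 9", "Galaxy S8 / Android 8"]
LOGIN_DATA = [("alice", "s3cret", True), ("alice", "wrong", False), ("", "", False)]

def login(device: str, user: str, password: str) -> bool:
    """Stub; a real suite would drive the app on `device` via a cloud driver."""
    return user == "alice" and password == "s3cret"

def run_suite():
    """One script, executed once per (device, data) combination."""
    results = []
    for device in DEVICES:                           # compatibility: many devices
        for user, password, expected in LOGIN_DATA:  # data-driven: many inputs
            results.append(login(device, user, password) == expected)
    return results

results = run_suite()
print(f"{sum(results)}/{len(results)} checks passed")  # 9/9 checks passed
```

The point is leverage: one maintained script yields 3 x 3 = 9 executions, and adding a device or a data row scales the suite without new scripting.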
Mass adoption of mobile applications makes large scale performance testing essential
Commonly Used Tools: Rational Performance Tester (RPT) Perfecto Mobile w/ HP LoadRunner
• Component Performance Testing: evaluates and optimizes the performance and scalability of a single component or application
• Baseline Reference Testing: a controlled performance test for comparative analysis as changes are introduced
• Stress Testing: stresses the system up to the “knee in the curve” to establish the maximum capability at the component level or end-to-end
• End-to-end Performance Testing: provides a measure of response times, TPS and the resources utilized as the workload is increased
• Endurance Testing: a medium workload for a prolonged period of time to ensure resources are not depleted over time
• Scalability Testing: tests how the system scales horizontally or vertically by establishing the scaling factor
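The stress-testing "knee in the curve" idea above can be sketched numerically: step up the offered load and flag the point where added load stops producing proportional throughput. The simulated service (a flat 50 req/s capacity) is an illustrative stand-in for a real system under test:

```python
def measure_throughput(offered_load: float, capacity: float = 50.0) -> float:
    """Simulated system: throughput tracks load until capacity, then flattens."""
    return min(offered_load, capacity)

def find_knee(loads, threshold: float = 0.95):
    """Return the first load step where extra load yields <95% extra throughput."""
    prev = measure_throughput(loads[0])
    for lo, hi in zip(loads, loads[1:]):
        cur = measure_throughput(hi)
        gain = (cur - prev) / (hi - lo)   # marginal throughput per unit of load
        if gain < threshold:
            return hi
        prev = cur
    return None

loads = [10, 20, 30, 40, 50, 60, 70]
print("knee detected at load step:", find_knee(loads))  # 60, first step past capacity
```

Against a real system, `measure_throughput` would drive concurrent requests and measure completions per second; the detection logic stays the same.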
With all the new testing in mobile comes new types of defects
Crashing and unexpected terminations
Features not functioning correctly (improper implementation)
Using too much memory, not freeing memory or releasing resources appropriately and not stopping worker threads when tasks are finished.
Inadequate input validation (typically, button mashing)
State management problems (startup, shutdown, suspend, resume, power off)
Responsiveness problems (slow startup, shutdown, suspend, resume)
Inadequate state change testing (failures during inter-state changes, such as an unexpected interruption during resume)
Usability issues related to input methods, font sizes and cluttered screen real estate
Cosmetic problems that cause the screen to display incorrectly
Pausing or “freezing” on the main UI thread (failure to implement asynchronous threading)
Feedback indicators missing (failure to indicate progress)
Integration with other applications on the device causing problems
Application “not playing nicely” on the device (draining battery, disabling power-saving mode, overusing networking resources, incurring extensive user charges, obnoxious notifications)
Not conforming to third-party agreements, such as Android SDK License Agreement, Google Maps API terms, marketplace terms or any other terms that apply to the application
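Two of the defect classes above, leaked worker threads and work blocking the main UI thread, share one fix: run background tasks on a worker that the app can stop cleanly. A minimal sketch with Python's standard threading primitives (the task names are illustrative):

```python
import queue
import threading
import time

stop = threading.Event()
tasks: "queue.Queue[str]" = queue.Queue()
done = []

def worker():
    # Poll with a timeout so the thread can observe the stop event
    # instead of blocking forever (the "not stopping worker threads" defect).
    while not stop.is_set():
        try:
            task = tasks.get(timeout=0.05)
        except queue.Empty:
            continue
        done.append(f"processed {task}")  # stand-in for real background work

t = threading.Thread(target=worker, daemon=True)
t.start()

tasks.put("sync contacts")    # the "UI" thread stays free while these run
tasks.put("upload crash log")
time.sleep(0.2)               # give the worker time to drain the queue

stop.set()                    # release the worker when tasks are finished
t.join(timeout=1.0)
assert not t.is_alive()       # no leaked worker thread
print(done)
```

On Android or iOS the equivalents are a cancellable coroutine/WorkManager job or an operation queue, but the shape is the same: hand work off the UI thread and tear the worker down deterministically.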
[Diagram: continuous quality feedback loop among Tester, Developer, End Users and LOB/Digital Marketer:
1. Over-the-air build distribution (builds)
2. In-app bug reporting (bugs)
3. In-app user feedback (user feedback)
4. Crash log reporting (crash logs; bugs vs. crashes)
5. User sentiment
all feeding a Quality Dashboard with Sentiment Analysis]
Continuous testing helps us stay on top of all these challenges and ensures we deliver high quality mobile apps
“When someone leaves a negative review in the app store, it scars your app for life, you can’t respond to it, and you can’t learn more about the problem in order to fix it quickly.”
- Mobile Orchard
Sentiment Analysis provides rapid, specific client feedback and comparative industry analysis
Sentiment Analysis
• Rapid feedback loop for subjective mobile and web user measures: satisfaction, usability, security, performance, stability, pricing, interoperability, content, elegance, privacy
• Evaluation against 10 critical quality levers / app store reviews / comparative analysis (above)
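The review-to-quality-lever mapping above can be sketched with a keyword lexicon. The levers, keywords and sample review below are illustrative; a production sentiment pipeline would use a trained model rather than keyword matching:

```python
# Hypothetical lexicon mapping quality levers to indicator keywords.
LEVERS = {
    "performance": ["slow", "fast", "lag", "responsive"],
    "stability":   ["crash", "freeze", "reliable"],
    "usability":   ["confusing", "intuitive", "easy"],
}
NEGATIVE = {"slow", "lag", "crash", "freeze", "confusing"}

def score_review(text: str):
    """Map one app-store review to (quality lever, sentiment) hits."""
    words = text.lower().split()
    hits = []
    for lever, keywords in LEVERS.items():
        for kw in keywords:
            if kw in words:
                hits.append((lever, "negative" if kw in NEGATIVE else "positive"))
    return hits

print(score_review("App is slow and it will crash on login"))
# [('performance', 'negative'), ('stability', 'negative')]
```

Aggregating these hits across a store's reviews gives the per-lever scores that feed the comparative dashboard.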
[Timeline: forward-compatibility testing for a software upgrade or device release, from product development through GA/MA, by time, scope and stakeholder:
• "x" weeks to release: stakeholder vendor; scope: simulators, old devices
• Days to release: stakeholder carrier; scope: old devices, new devices, network
• In-market: stakeholder lab & crowdtesting; scope: simulators, old devices, new devices, network]
As new devices are released, the need for forward compatibility testing begins
How can we be better prepared for new major releases?
• Validate beta versions of the software on simulators and old devices
• Alignment between application provider QA and device manufacturer requirements
• Validate old devices against gold beta version of the software
• Utilise relationship with carrier to provide testing on pre-release devices
• Validate new release against simulators, old devices & new devices
• Utilise crowd testing to increase device-combination coverage, obtain a test team that comes ready-made with the devices you need to test with, and validate the actual behaviour of targeted users in the live environment
Succeeding in the mobile testing environment requires consideration of all these factors
Enterprise Mobile App Test Services
• Engagements: engagement approach (test strategy, test design and execution); delivery options (on-site, offshore, hybrid)
• Processes: testing approach (agile, test-driven, risk-based, iterative); testing guidelines and strategies, process maturity, best practices, asset repository, mobile quality metrics
• Infrastructure: fully equipped test lab, cloud platform, test servers with preconfigured test software, defined test architecture
Mobile Test Tools - Snapshot