User Acceptance Testing in the Testing Center of Excellence
TRANSCRIPT
T20 Test Techniques
10/16/2014 3:00:00 PM
User Acceptance Testing in the Testing Center of Excellence
Presented by:
Deepika Mamnani
Capgemini
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073 888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com
Deepika Mamnani
Capgemini

Deepika Mamnani heads the solutions arm of the Quality Assurance and Testing Services group at Capgemini. She is responsible for devising testing solutions and creating improvement roadmaps for testing organizations across industry verticals. Deepika's core competency is conducting assessments of testing processes across software development lifecycles. An expert at defining organization and governance structures for testing organizations, she has helped organizations implement centralized testing centers of excellence. Deepika speaks at international testing conferences and conducts webinars on QA topics. She holds CSTE and CSQA certifications and is a Certified ScrumMaster.
UAT in the TCoE
September 2014
Best practice 1: UAT in a nutshell
The choices we make today impact our future
Risk profiling → UAT test planning → Estimation → Execution
Best practice 2: Risk profiling factors
! Criticality: internal versus external, user base
! Revenue: ROI
! Complexity: business rules, technology, interfaces, data
If there is no risk there is no reward
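The factor groups above can be combined into a weighted risk score per application to decide where UAT effort goes first. A minimal sketch, assuming a 1-5 rating scale and illustrative weights and application names that are not part of the original material:

```python
# Illustrative risk-profiling sketch: rate each application on the
# slide's factor groups (criticality, revenue, complexity) on a 1-5
# scale, then combine with assumed weights to rank UAT candidates.

WEIGHTS = {"criticality": 0.4, "revenue": 0.3, "complexity": 0.3}  # assumed

def risk_score(ratings: dict) -> float:
    """Weighted average of 1-5 factor ratings."""
    return sum(WEIGHTS[f] * ratings[f] for f in WEIGHTS)

apps = {  # hypothetical applications
    "Order Portal": {"criticality": 5, "revenue": 5, "complexity": 4},
    "Reporting":    {"criticality": 2, "revenue": 1, "complexity": 3},
}

# Rank applications by descending risk: higher score => more UAT focus.
ranked = sorted(apps, key=lambda a: risk_score(apps[a]), reverse=True)
print(ranked)  # ['Order Portal', 'Reporting']
```

Whatever scale is chosen, the point is that the ranking, not the absolute number, drives the depth of UAT coverage per application.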
Best practice 3: Methodology for determining team size
! A combination of top-down and bottom-up techniques
! Percentage of overall budget (15-40%)
UAT CoE project roles (per application):

Application   | Test Lead | BA/QA
Application 1 | 2         | 3
Application 2 | 1         | 5
Application 3 | 2         | 7
Application 4 | 1         | 4

Shared CoE roles: Program Manager 1, Automation Tester 2, Environment Analyst 5
Successful repetition is the foundation of a methodology
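The two techniques can be cross-checked against each other: the top-down figure bounds the budget, while the bottom-up figure rolls up the role counts per application. A sketch, assuming illustrative budget and hours-per-head figures (only the 15-40% band comes from the slide; the role counts mirror the sizing table as reconstructed above):

```python
# Top-down: UAT effort as a share of the overall budget (the slide
# cites 15-40%). Bottom-up: roll up role allocations per application.
# All hour figures are assumed for illustration.

overall_budget_hours = 20_000                  # assumed programme size
top_down_low = 0.15 * overall_budget_hours     # 3,000 h
top_down_high = 0.40 * overall_budget_hours    # 8,000 h

roles_per_app = {  # (Test Lead, BA/QA) headcount per application
    "Application 1": (2, 3),
    "Application 2": (1, 5),
    "Application 3": (2, 7),
    "Application 4": (1, 4),
}
shared_roles = 1 + 2 + 5  # Program Manager, Automation Testers, Environment Analysts

hours_per_head = 160  # assumed one-month release window
bottom_up = (sum(lead + ba for lead, ba in roles_per_app.values())
             + shared_roles) * hours_per_head

# Cross-check: the bottom-up estimate should land inside the top-down band.
print(bottom_up, top_down_low <= bottom_up <= top_down_high)  # 5280 True
```

If the two figures disagree badly, either the scope (bottom-up) or the budget assumption (top-down) needs revisiting before the team size is committed.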
Best practice 4: UAT test type breakup
! Functional: 60-70%
! Usability: 10-15%
! Non-functional (performance) and other: 15-20%
There is no point in digging shallow wells in a thousand places
Best practice 5: Techniques to determine acceptance criteria
As a user, I would like to create an order on an e-commerce portal.

Technique 1: Mind maps. Branch the story into areas to probe, e.g. negative flows, number of devices, orders.

Technique 2: Process workflow. Design acceptance criteria along the flow: create an order → create an order line → process order → deliver order.
Repeat and Repeat till it becomes second nature
Best practice 5: Techniques to determine acceptance criteria (continued)

Technique 3: Brainstorming on testing quadrants, e.g. platforms (iOS, Windows, Android), positive flow, orders.

Technique 4: Decision tables, driving test data plus unit and automated tests:

Input 1 | Input 2         | Outcome
User 1  | Items > 100     | 20% discount
User 2  | Items > 400 USD | Free shipping
Repeat and Repeat till it becomes second nature
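A decision table like the one in Technique 4 maps directly onto data-driven tests: each row is both the test data and the expected outcome. A minimal sketch, where the rule logic in `apply_rules` is an assumed illustration standing in for the system under test:

```python
# Decision-table rows from the slide, used both as test data and as
# expected outcomes for an assumed pricing function.

DECISION_TABLE = [
    # (user, order inputs, expected outcome)
    ("User 1", {"items": 101},     "20% discount"),
    ("User 2", {"value_usd": 401}, "free shipping"),
]

def apply_rules(order: dict) -> str:
    """Assumed rules engine mirroring the decision table."""
    if order.get("items", 0) > 100:
        return "20% discount"
    if order.get("value_usd", 0) > 400:
        return "free shipping"
    return "no offer"

for user, order, expected in DECISION_TABLE:
    actual = apply_rules(order)
    assert actual == expected, f"{user}: expected {expected}, got {actual}"
print("all decision-table cases pass")
```

Because the table is plain data, business users can extend it without touching the test harness, which is exactly what makes decision tables useful as shared acceptance criteria.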
Best practice 6: Degree of automation
[Diagram: UAT automation draws on the unit, API, functional regression, performance, and security test layers]
! Subset of earlier tests
! Test data
! Environment
Automation is the means and not the end
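One way to keep UAT automation a "subset of earlier tests" is to tag each automated case with the layers it covers and promote only the UAT-relevant slice. The tagging scheme below is an assumed sketch (plain attributes standing in for, e.g., pytest-style markers), with hypothetical test names:

```python
# Sketch: mark automated regression tests with the layers they cover,
# then pick only those flagged for reuse in UAT (a subset of earlier
# tests, run against UAT test data and the UAT environment).

REGRESSION_SUITE = [
    {"name": "test_login",        "layers": {"api", "uat"}},
    {"name": "test_order_flow",   "layers": {"functional", "uat"}},
    {"name": "test_db_migration", "layers": {"unit"}},
    {"name": "test_rate_limiter", "layers": {"performance"}},
]

def uat_subset(suite):
    """Select the tests promoted into the UAT automation pack."""
    return [t["name"] for t in suite if "uat" in t["layers"]]

print(uat_subset(REGRESSION_SUITE))  # ['test_login', 'test_order_flow']
```

The selection logic is deliberately trivial; the discipline is in maintaining the tags so the UAT pack stays a curated subset rather than a second full suite.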
Best practice 7: Risk-based testing based on data
[Diagram: Rules matrix crossing the Profile Engine (X) against the Entitlement Engine (Y), feeding the Rules Engine]

Risk-based approach:
! Functional criticality
! Frequency of use
! Business impact
Exhaustive testing is impossible
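Since the rules matrix grows multiplicatively, the risk-based approach scores each Profile × Entitlement cell on the three factors above and tests only the top cells. A sketch, where the profile/entitlement names and all ratings are illustrative assumptions:

```python
from itertools import product

# Assumed per-cell ratings (1-5) for the slide's three factors:
# (functional criticality, frequency of use, business impact).
profiles = ["Retail", "Corporate"]
entitlements = ["View", "Trade"]
ratings = {
    ("Retail", "View"):     (2, 5, 2),
    ("Retail", "Trade"):    (4, 4, 5),
    ("Corporate", "View"):  (2, 2, 2),
    ("Corporate", "Trade"): (5, 2, 5),
}

def cell_risk(cell):
    crit, freq, impact = ratings[cell]
    return crit * freq * impact  # multiplicative risk exposure

# Exhaustive testing is impossible: keep only the riskiest cells.
top_cells = sorted(product(profiles, entitlements),
                   key=cell_risk, reverse=True)[:2]
print(top_cells)  # [('Retail', 'Trade'), ('Corporate', 'Trade')]
```

The cutoff (here, the top two cells) is a budget decision; the scoring makes the trade-off explicit and defensible instead of arbitrary.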
Best practice 8: RACI Matrix
Activity                   | UAT team | Client SMEs | SIT team | Infrastructure team | Development team
Prepare UAT Strategy       | R        | C           | C        | C                   | I
Prepare test scenarios     | R        | C           | C        | -                   | -
Test Data generation       | R        | C           | -        | -                   | C
UAT Release Notes          | I        | C           | R        | -                   | -
Test Environment readiness | R        | I           | C        | A                   | C
Test Execution             | R        | C           | A        | A                   | A
Defect Management          | R        | A           | A        | C                   | A
Metrics management         | R        | I           | I        | I                   | I
UAT Summary Report         | R        | C           | C        | I                   | I
There is no right without responsibility
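A RACI matrix can also be checked mechanically, e.g. that every activity has exactly one Responsible party. A minimal sketch using a few rows of the matrix above, with team names shortened:

```python
TEAMS = ["UAT", "Client SMEs", "SIT", "Infra", "Dev"]

RACI = {  # a few rows from the slide; '-' means not involved
    "Prepare UAT Strategy":   ["R", "C", "C", "C", "I"],
    "Prepare test scenarios": ["R", "C", "C", "-", "-"],
    "UAT Release Notes":      ["I", "C", "R", "-", "-"],
}

def responsible(activity):
    """Team(s) holding the R for an activity."""
    return [t for t, code in zip(TEAMS, RACI[activity]) if code == "R"]

# Sanity check: exactly one Responsible team per activity.
for activity in RACI:
    assert len(responsible(activity)) == 1, activity
print({a: responsible(a)[0] for a in RACI})
```

The same pattern extends to other governance rules, such as requiring at least one Accountable party or flagging activities where a team is both R and A.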
Best practice 9: UAT SLAs

SLA                    | Definition                           | Norm
Defect Detection Ratio | Ability to detect severity 1 defects | >= 95%
Defect Rejection Ratio | Ability to capture true defects      | <= 5%

Reverse SLA            | Definition                   | Norm
Environment downtime   | Availability of environment  | <= 5%
Defect turnaround time | Availability of defect fixes | Severity 1 <= 1 day
What is measured improves
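The two quality SLAs can be computed directly from defect counts. The slide gives only the definitions and norms, so the formulas below are common interpretations and should be treated as assumptions; the counts are illustrative:

```python
def defect_detection_ratio(found_in_uat: int, leaked_to_prod: int) -> float:
    # Assumed formula: share of defects caught in UAT before production.
    return found_in_uat / (found_in_uat + leaked_to_prod)

def defect_rejection_ratio(rejected: int, raised: int) -> float:
    # Assumed formula: share of raised defects rejected as invalid.
    return rejected / raised

ddr = defect_detection_ratio(found_in_uat=98, leaked_to_prod=2)  # 0.98
drr = defect_rejection_ratio(rejected=4, raised=100)             # 0.04

print(f"DDR {ddr:.0%} (norm >= 95%), DRR {drr:.0%} (norm <= 5%)")
assert ddr >= 0.95 and drr <= 0.05  # both within the slide's norms
```

Tracking these two together matters: a high detection ratio with a high rejection ratio usually means the team is logging noise rather than finding true defects.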
© 2009 Capgemini - All rights reserved
Case Study: UAT TCoE for a credit card services group

Snapshot – Business Challenges:
! Bandwidth bottlenecks of business users due to production responsibilities
! Dated UAT scripts
! Inadequate time for testing
! High number of post-production defects
! Lack of testing processes
! High operational costs

Solution:
! Acquired application knowledge through close interaction with business users
! Rewrote outdated manual scripts and created additional scripts for improved test coverage
! Created repository of regression test scripts
! Established test management plan
! Performed UAT for each release with measured continuous improvement in testing effectiveness
! Set up dashboard

Measurable Results:
! Application defects cut by 63%
! Reduced testing cycle time
! Scripts repository enabled script reusability
! Faster application rollout enabled launching new products more quickly
! Reduction in cost of operations
Benefits – Decreased business user involvement in UAT

Year 1: Testing FTE 60%, Business Testers 40%
Year 2: Testing FTE 64%, Business Testers 36%
Year 3: Testing FTE 80%, Business Testers 20%

! End-user perspective
! Reduction in business user involvement in the UAT process
Benefits – Reduction in post-production defects

! Year-on-year quantifiable decrease in production incidents
! Decrease in overall cost
! Freeing up client resources for future projects
! Optimal leverage of existing domain knowledge/skills
! Increase in testing maturity across the board
! 100% visibility into testing health and compliance from the metrics-driven Testing Dashboard

[Chart: Decrease in production incidents, Year 1 through Year 4]
[Chart: Number of projects done by the Capgemini CoE team as Test Managers – Year 1: 0, Year 2: 10, Year 3: 23]