plan vs reality software engineering...
TRANSCRIPT
Lund University / Faculty of Engineering/ Department of Computer Science / Software Engineering Research Group
Lecture 4: Monitor & Control - Ch 9 except 9.6, 12.4 [Hughes]; Quality Management - Ch 13, except 13.7 and 13.9-10
SPI P4-Sect 3.2, P5-Sect 1-3, P6
Software Engineering Process – Economy & Quality
ETSF01, http://cs.lth.se/etsf01
Plan vs Reality
Roadworks!
Accident!
Sick passenger
• Plan ≠ Reality. The plan is not an answer key ('facit')!
• Stuff happens along the way!
• The aim is to reach:
  • Scope: a good product
  • Budget: cost
  • Timeliness: market window, commitments, etc.
• Aims also change!
►Monitor & Control
Schedule vs Reality
Monitoring: checking whether the project is on track relative to the plan

Based on data = measurements!
– Reports
– Subjective data on completion rate
– Actual cost vs planned cost
– Actual value vs planned value
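Comparing actual against planned cost and value is the basis of earned-value-style tracking. A minimal sketch in Python; the function name and the figures are invented for illustration, not taken from the lecture:

```python
# Minimal earned-value-style comparison (illustrative values).
# PV = planned value, EV = earned (actual) value, AC = actual cost.

def variances(pv, ev, ac):
    """Return schedule and cost variance plus the derived indices."""
    sv = ev - pv          # schedule variance: negative => behind plan
    cv = ev - ac          # cost variance: negative => over budget
    spi = ev / pv         # schedule performance index (< 1 => behind)
    cpi = ev / ac         # cost performance index (< 1 => over budget)
    return sv, cv, spi, cpi

sv, cv, spi, cpi = variances(pv=100, ev=80, ac=120)
print(sv, cv, round(spi, 2), round(cpi, 2))  # -20 -40 0.8 0.67
```

This makes the slide's point concrete: the project above is both behind plan (SV < 0) and over budget (CV < 0), which neither cost nor time tracking alone would reveal.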
Reporting
Formal – Informal, Oral – Written, Regular – Ad Hoc

                   Formal                             Informal
Regular
  Oral             Recurring progress meetings,       Around the coffee machine
                   e.g. stand-ups
  Written          Job sheets, progress reports       Emails to known colleagues
Ad hoc
  Oral             Review meetings                    Ad hoc meetings
  Written          Issue reports, change requests     Emails for issue investigation
Gantt charts
Slip charts
Cost vs Time
Timesheet example: quantitative cost & remaining effort. Thin on qualitative info, e.g. issues and the actual state of progress.
Actual cost relative to planned cost ≠ actual progress relative to planned time
• A project can be behind time but under budget
  Example: delayed due to not deploying committed staff
• A project can be on time but over budget
  Example: additional resources have been added to cope with the workload
→ Need to monitor both achievements and costs
Collecting progress details
Need to collect data about:
• Achievements: value
• Effort spent: costs

Exercise on partial completion
A developer has produced 250 lines of Java code for a task estimated at 500 LOC in total.
Questions
1. Is it reasonable to assume the task is 50% complete?
2. Why / why not?
3. How can the PM deal with this?
Possible answers
1. Not necessarily.
2. Possible issues:
   • Incorrect estimate
   • Non-typical or uneven progress so far
   • Testing & debugging not started yet
3. Gather more (qualitative) knowledge of progress & remaining sub-tasks.
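One way for the PM to get a better completion estimate than the LOC ratio is to base it on effort spent plus a fresh re-estimate of the remaining effort. A small illustrative sketch; the effort figures are invented:

```python
# Effort-based percent complete: spent / (spent + re-estimated remaining).
# The person-hour figures below are invented for illustration.

def percent_complete(effort_spent, remaining_estimate):
    """Completion based on effort, not produced artifacts like LOC."""
    return 100 * effort_spent / (effort_spent + remaining_estimate)

# 250 of an estimated 500 LOC are written, but test & debugging lie ahead,
# so the developer re-estimates 40 h remaining against 20 h spent:
print(round(percent_complete(effort_spent=20, remaining_estimate=40)))  # 33
```

The re-estimate forces the qualitative discussion the slide asks for: what sub-tasks actually remain?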
Burn-Down Charts (Scrum)
[chart: remaining effort plotted over time]
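A burn-down chart compares the reported remaining effort against an ideal linear burn from the sprint total down to zero. A minimal text-based sketch; the sprint data below is invented:

```python
# Text-based burn-down sketch: ideal line vs reported remaining effort.
# The 10-day sprint and its daily figures are invented for illustration.

def ideal_burndown(total_points, days):
    """Ideal remaining effort at the end of each day (linear burn)."""
    return [total_points * (1 - d / days) for d in range(days + 1)]

sprint_days = 10
actual = [40, 40, 36, 35, 30, 28, 28, 22, 15, 8, 0]  # remaining per day
for day, (ideal, real) in enumerate(zip(ideal_burndown(40, sprint_days), actual)):
    flag = "behind" if real > ideal else "on/ahead"
    print(f"day {day:2}: ideal {ideal:5.1f}, actual {real:2}  ({flag})")
```

Days where the actual line sits above the ideal line flag the team as behind, which is exactly the visual signal the chart gives at a glance.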
Fever Chart (Critical Chain)
[chart plotted over time]
Example from a SW Porting Project
Kim: a developer, 27 years old, has worked in the team for 3 months.
Kim's task is going from "yellow" into "red".
What should the PM do?
Monitor via Reporting
[Organisation chart: Customer and Project Sponsor/Director at the top, then Steering Committee and Project Manager; a PM for the subproject Legacy; Team Media with a Team Leader for Media Player and team members (MM dev, UX design, tester); Team Storage with a Team Leader for File System and team members (FileS dev, tester)]
Formal reporting via organisational structures
Reporting
• Details relating to project progress have to originate with the people actually doing the work
• They are then fed up through the management structure
• At each management level there is some summarising and commentary before information is passed up to the next level
• Danger of 'information overload' as information passes from the many to the few
Assessing progress
At predetermined checkpoints
• Event driven
• Time driven
or continuously with dashboards

Frequency and level of reporting
• Corresponding to organizational level
• Higher risk → more frequent
Red/Amber/Green or Traffic Lights
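A Red/Amber/Green status can be derived mechanically from measured slippage. A small sketch with invented thresholds; in practice each project sets its own limits:

```python
# Red/Amber/Green sketch. The 5% and 15% thresholds are invented
# for illustration; a real project would define its own limits.

def rag_status(planned_days, forecast_days, amber_pct=5, red_pct=15):
    """Classify an activity by its forecast slippage relative to plan."""
    slip_pct = 100 * (forecast_days - planned_days) / planned_days
    if slip_pct > red_pct:
        return "red"
    if slip_pct > amber_pct:
        return "amber"
    return "green"

print(rag_status(20, 20))   # green
print(rag_status(20, 22))   # amber  (10% slip)
print(rag_status(20, 25))   # red    (25% slip)
```

Automating the classification keeps the traffic-light report objective; the judgment then goes into the forecast, not into the colour.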
Project vs Line Organisation [12.4]

Task organisation (usually Project): PM over Feature A, Feature B, System Test, …
• Short-term, day-to-day tasks
• Report task status etc.

Functional organisation (usually Line): Manager over Analysis, Coding, Testing, …
• Long-term / employment issues, e.g. salary, competence development
• Report working hours, sick leave etc.
• The line manager is a potential Project Sponsor
Matrix Organisation (Sv. 'matris')

Line managers (rows) × project managers (columns):

                               Nougat (Porting)   Product X                  App A   …
Max A – System architects      Arch Nils          Arch Nils                  Arch Stina
Peter O – System testing       Tester Anna        Tester Anna
Maria R – SW Dev: Audio        Dev Daniel         Dev Daniel, Marie, Oscar
Helen P – SW Dev: Call
[Diagram: the Project organisation (Customer; Sponsor; Steering Committee; Project Mgr; Subproject PM; Team Media and Team Storage with team leads for Media Platform and File System extension and their members: MM dev, UX designer, FileS dev, testers) mapped against the Line organisation (CEO; Business: Customer Accounts, Roadmaps; Software Dev: Multimedia devs, Filesystem devs, Testing with functional and system testers, Project Mgt; User Experience: Project Mgt, UX Design with UX designers for MM and general)]
Line                                          Project
Long term, employed                           Short term, assigned
Competence development                        Engineering tasks, development
Usually a low % of the job                    Main work effort
Report working time, vacation etc.            Report status, remaining effort etc.
to the line manager                           to the team leader or project manager
Prioritized monitoring [Hughes 9.7]
Monitoring and reporting has a $-tag!
Focus monitoring based on risk:
• Critical path activities: if delayed, later dependent activities are delayed
• Activities with less than a specified float
• High-risk activities: e.g. the top 5-10 risks
• Activities using critical resources
+ activities with external dependencies
+ resource allocations
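Critical path activities and float can be computed with a forward and a backward pass over the activity network. A sketch on a small invented network; the activity names and durations are made up:

```python
# Critical-path sketch: forward pass gives earliest finish times, backward
# pass gives latest finish times; zero total float marks critical activities.
# The network below is invented, listed in dependency (topological) order.

activities = {            # name: (duration in days, predecessors)
    "A": (3, []),
    "B": (5, ["A"]),
    "C": (2, ["A"]),
    "D": (4, ["B", "C"]),
}

def total_float(acts):
    """Total float per activity; assumes acts is in dependency order."""
    ef = {}
    for a, (dur, preds) in acts.items():                  # forward pass
        ef[a] = dur + max((ef[p] for p in preds), default=0)
    end = max(ef.values())
    lf = {a: end for a in acts}
    for a, (dur, preds) in reversed(list(acts.items())):  # backward pass
        for p in preds:
            lf[p] = min(lf[p], lf[a] - dur)
    return {a: lf[a] - ef[a] for a in acts}

floats = total_float(activities)
critical = [a for a, f in floats.items() if f == 0]
print(critical, floats["C"])  # ['A', 'B', 'D'] 3
```

Here A → B → D is the critical path (zero float), while C has 3 days of float, so monitoring effort is best spent on A, B and D.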
Control, or Getting Back on Track [Hughes 9.8]
• Renegotiate deadlines
• Shorten the critical path
• Reconsider activity dependencies
Understand the issues! E.g. missing information, weak tools, a non-functioning team, etc.
Control, or Getting Back on Track [Hughes 9.8]
• Renegotiating deadlines: staggered deliveries according to customer value
• Try to shorten the critical path by adding resources
  – Overtime
  – Re-allocate existing staff to more critical activities
  – Get more staff
• Reconsider activity dependencies
  – Overlap activities to avoid waiting for the completion of another
  – Split activities to remove dependencies on other activities / critical resources
! Possible aspects for practical exam
Q…
Control by Modifying the Target
• Re-negotiate the commitment: time (deadline), cost or scope
  – Reduce scope and/or quality
  – Increase cost or deliver later. Consider incremental delivery!
For changes that affect the business case (scope-cost-time):
• Involve all affected stakeholders
• Inform and gain approval from the steering committee and project sponsor
• Ensure affected parties are informed
→ Change control [Ch 9.9]
CASE PROJECTS: Monitor & Control
Weekly status reports collected by the PM from teams & tracking systems:
- Progress relative to delivery scope & timeline
- Software quality status (performance)
- Risks and actions
Presented at the project meeting & to the steering group & sponsor
[Chart: PPSPi2 integration statistics – number of bubbles per week, from "up to week 339" through weeks 340-352 and 401-405 to "rest of 2004"; series: MS2 plan, Current plan, Integrated, Number of CCB bubbles]
SW Porting Project: Monitor and control
Summary
The project is in the execution phase and includes 97 SW deliveries (bubbles) in the Anatomy. 28 SW deliveries (bubbles) are now delivered. A checkpoint is scheduled in week 48.5 in order to summarize the status and decide how to move forward.
Critical areas are:
1. Deliveries are pushed forward, see the integration statistics.
2. Graphical performance: only 25% of the expected performance. Root cause analysis is ongoing.
3. Quality problems with the FM-Radio RDS chip. Re-planning is ongoing.
SW Porting Project: Monitor & Control
Resource allocation is monitored on a monthly basis
[Chart: November – Requested versus Allocated persons (0-8) per function group]
SW Porting Project: Monitor & Control of COST

Cost monitoring – SYSTEM Project, Order & Cost (KEUR)

                          Total    Total      October  October
                          Actual   Forecast   Actual   Forecast
Man Months                92       130        19       27
Labour hours              12 989   18 302     2 685    3 779
Labour costs              1 348    1 830      290      378
Material/Consumables      100      60         10       20
Travel & Living           11       39         3        5
Consultants               10       20         5        7
Misc                      2        5          1        2
- Reported once a month & at checkpoints to the steering group
- Extracted from internal systems
- Shows progress, not just spending
Application Dev Project: Monitor & (Team) Control
• Regular feedback & knowledge sharing
  – Daily stand-up meetings
  – Sprint demos & planning, sprint retrospectives (SPI)
• Burn-down charts used to monitor progress & "remaining work"
• Dependent projects
  – Status reporting delivered to the SW Platform & Product projects
  – Status reports received from, e.g., the SW porting project; info on dependent functionality & deliveries is considered in sprint planning as part of backlog prioritization
Summary
Two main PM tasks
• PLAN
• GUIDE and FACILITATE: monitor & control a project to meet agreed goals and targets

Monitor and control
• Monitor by analysing and compiling incoming reports
• Take corrective and adjusting action when needed
• Manage change in a structured way
• Coordinate roles and people (project meetings etc.)

Recommended exercises: 9.1 and 9.2
Updated weekly – see course page
Revised Deadlines
Draft 2 in Moodle Friday 28 April
• Classes A, B: latest 8.00; Class C: latest 11.00; Class D: latest 13.00
Review in Moodle 2 h before exercise class on Wed 3 May
Presentation material: e-mail to Johan/Daniel 26 h before Ex 4 (Tue 16/5)
• 8 min presentation per group on 1 SPM area (assigned by TA)
• Another group acts as opponent (5 min) (assigned by TA)
• Feedback on presentation technique
• Questions on the presentation
NOTE: Part of project assessment => active participation required for a pass
Final report: Fri 19/5 at 8:30 (strict!)
Submit via email to [email protected], [email protected]
Use subject line 'Report' + <Group ID> + <student IDs of group>, e.g. 'Report from Group Z: ain09aha, jur10eib…'
Magnus Lidholm
Senior Project Manager, Sony Mobile Communications
Magnus has 20 years of experience of software development and has worked for Ericsson, MA-System (PipeChain), Sony Ericsson and Sony. Based on his experience as a technical project manager, Magnus will talk about how to handle people in different situations.
Lecture TOMORROW
Tue 25/4 at 13:00
Managing People
Theory session + guest lecture
QUALITY MANAGEMENT & SPI
Hughes Ch 13, except 13.7 and 13.10; P4 Sect 3.2; P5 Sect 1-3
From the syllabus: to pass the course, the student shall
• understand how quality work at the organisational level is carried out
• be able to describe how software process improvement is done
Course material for Quality Management and SW Process Improvement (SPI)
• Hughes Ch 13 except 13.7 & 13.10
• Section 3.2 of P4 – Overview/Summary
  Bjarnason, Sharp, Regnell, "Gap Finder: Assessing and Improving the Integration of Requirements and Testing", in PhD Thesis, LU-CS-DISS: 2013-02, Lund University, 2013.
• Sections 1-3 of P5 – Intro to CMMI
  Höggerl, Sehorz, "An Introduction to CMMI and its Assessment Procedure", Seminar Paper, Dept. of Computer Science, University of Salzburg, Feb 2006.
Quality in a Project
• Process (L4): SW project management, organisation & people
• Requirements (ETS170): specify quality requirements (QR)
• Testing (ETS200): achieved the QRs?
→ Product Quality
Process – Product Quality
• Manufacturing: obvious, e.g. set-up of automated tools and product checking processes.
• Software: less obvious – a much more creative and design-intense activity; innovative. =>
The PEOPLE involved have a big influence on quality.
Project success
Primarily four types of shortfall
• Delays
• Inadequate product functionality
• Inadequate product quality
• Cost overruns
[Diagram: product scope – time – cost triangle, annotated with Quality & Process management, Testing (ETS200), and Requirements (ETS170)]
[Diagram: Step Wise planning – 0. Select project; 1. Identify project objectives; 2. Identify project infrastructure; 3. Analyse project characteristics; 4. Identify products and activities; 5. Estimate effort for activity; 6. Identify activity risks; 7. Allocate resources; 8. Review/publicize plan; 9. Execute plan; 10. Lower level planning (steps 5-8 repeated for each activity)]

Quality Concerns in SPM (numbered by planning step)
1. Quality-related objectives
2. Installation standards and procedures affect quality
3. Quality requirements identified → suitable process
4. Necessary activities, incl. their inputs/outputs/process, to reach the desired product quality
5. Ensure that quality is included in estimates
6. Risks often affect quality + low quality is a risk!
7. The right competence & experience?
9. Measure & manage for quality
(Steps annotated with RE (ETS170), Test (ETS200), and Process.)
What is Product Quality?
How is (Software) Quality Defined?
• Quality is "fitness for use" – Joseph Juran
• Quality is "conformance to requirements" – Philip B. Crosby

The quality of a product or service is its ability to satisfy the needs & expectations of the customer.

Quality is about meeting the minimum standard required to satisfy customer needs. High-quality products meet the standards set by customers.
Quality is different things for different people
• Banker: safe & reliable service• Healthcare worker: safe & timely quality health care• Hotel employee: customer satisfaction• Development engineer: bug-free product
Balancing Cost vs Benefit (Value) of Quality

[QUPER diagram, benefit view: user benefit vs quality level, divided into useless, useful, competitive advantage, and excessive regions by three breakpoints – utility, differentiation, and saturation. Cost view: implementation cost vs quality level, with cost barriers.]

QUPER: Quality Performance Requirements Model. Regnell, B., Berntsson Svensson, R., Olsson, T., "Supporting Roadmapping of Quality Requirements", IEEE Software, Vol. 25(2), pp. 42-47, 2008.

What counts as high vs low quality changes over time, e.g. start-up time, battery time, screen clarity, etc.
Quality is relative
Examples
• A high-quality washing-up liquid can claim that one squirt is sufficient – the customer is prepared to pay more
• A poor-quality washing-up liquid requires several squirts – the customer accepts this if it is cheap
ISO 9126 software qualities
functionality     does it satisfy user needs?
reliability       can the software maintain its level of performance?
usability         how easy is it to use?
efficiency        relates to the physical resources used during execution
maintainability   relates to the effort needed to make changes to the software
portability       how easily can it be moved to a new environment?

These are the quality attributes to be covered by the evaluation framework!
ISO 9126 software qualities – sub-characteristics

Functionality: suitability, accuracy, interoperability, functionality compliance, security
Reliability: maturity, fault-tolerance, recoverability, reliability compliance
Usability: understandability, learnability, operability, attractiveness, usability compliance
Efficiency: time behaviour, resource utilization, efficiency compliance
Maintainability: analysability, changeability, stability, testability, maintainability compliance
Portability: adaptability, installability, co-existence, replaceability, portability compliance
Quality Management: a holistic approach is needed
The QM gurus (Deming, Juran, Crosby) agree that:
• Inspection is never the answer to quality improvement, nor is "policing"
• Involvement of leadership and top management is essential to the necessary culture of commitment to quality
• A programme for quality requires organization-wide effort and long-term commitment, accompanied by the necessary investment in training
[Diagram: product quality influenced by RE, testing, process, plan, estimates, and people]
Total Quality Management - TQM
Doing things RIGHT… the FIRST time, every time!
PUST – a failed IT project
The PUST project: 2 instances
• Vision/goal
  1. A tailor-made solution for field work. USABILITY
  2. Siebel: save money with a standard solution (cheaper maintenance) => adapt the organisation to the technical possibilities. Unrealistic & biased pre-study (Oracle). Measuring usability => the requirement was dropped!
• Way of working
  1. Incremental, user interaction, user training
  2. A "closed" project, no user training
• Result
  1. Satisfied users, a proprietary system
  2. Dissatisfied users, system "hacked to pieces" -> expensive maintenance/development
TQM Approach
Committed leadership as the foundation, supporting:
• Focus on customers
• Focus on processes
• Base decisions on fact (measurements)
• Improve continuously
• Let everybody be committed

B. Bergman and B. Klefsjö, Quality: from customer needs to customer satisfaction, Studentlitteratur, 2003
TQM
• Focus on customers: find out what customers want, internal and external
• Base decisions on fact: systematic (not random) info, relative to customer needs
• Focus on processes: incl. project interfaces; measure process performance
• Improve continuously: the PDCA cycle (Deming); improve the quality of goods, services and products, processes and methodologies while using fewer resources
• Let everybody be committed: opportunities to be committed and involved in decision-making and improvement work; delegation of responsibility and authority
TQM – Total Quality Management
A total concept, where values, methodologies and tools combine to attain higher customer satisfaction with less resource consumption.
Origin
• Deming (an American in Japan) during the 1950s
• By the mid-1970s Japan was beginning to seriously undermine its American and other western competitors
B. Bergman and B. Klefsjö, Quality: from customer needs to customer satisfaction, Studentlitteratur, 2003
The Deming Cycle – PDCA (Sv: PUMA)
• Plan: plan process changes to improve results
• Do: implement the plan / change
• Check: check/study/assess the outcome; measure and report
• Act: decide on needed changes
Software Processes
F Pettersson, M Ivarsson, T Gorschek, A Practitioner’s Guide to Light Weight Software Process Assessment andImprovement Planning. Journal of Systems and Software 81(6):972-995, 2008
Software Process Improvement
General steps for SPI
1. Evaluation of current practices (Check)
2. Planning for improvements (Act & Plan)
3. Implementing improvements (Do)
4. Evaluation of effects (Check)
SPI Approaches
Inductive (bottom-up): identify problems in current practices and improve
E.g.
• QIP, iFLAP, Lean Six Sigma
• Process modelling and simulation
• Information flow analysis
• Retrospective reflection, aka lessons learnt / project post-mortem

Prescriptive (top-down): compare current practice with best practice
E.g. CMMI, SPICE (ISO 15504)
Prescriptive (top-down) SPI Frameworks [Hughes 13.8, P5 Sect 1-3]
CMM/CMMI
SPICE/ ISO/IEC 15504
SPICE: Software Process Improvement and Capability dEtermination
• Developed from CMM etc.
• Parts of SPICE are defined as an ISO standard (ISO/IEC 15504, parts 1-4)
• The model consists of
  – a process dimension: assessment per process area
  – a capability dimension: how processes are implemented and managed
• Particularly appropriate for small organisations (ability to focus on process areas)
The Capability Maturity Model for Software (SW-CMM)
• Software Engineering Institute
  – a US Defense Dept. funded institute associated with Carnegie Mellon University
• Its mission is to promote software technology transfer, particularly to defense contractors
• The maturity model was proposed in the mid-1980s and refined in the early 1990s (into CMMI)
• The work has been very influential in process improvement
CMMI Maturity Levels (Staged)
Level 1 - Initial
Level 2 - Repeatable
Level 3 - Defined
Level 4 - Managed
Level 5 - Optimizing
Level 1: Initial
• Frequent difficulties in making commitments
• Crises are common
• Success depends entirely on having exceptional managers and developers
• However, level 1 companies can deliver products
Level 2: Repeatable
• Policies for managing software projects are implemented
• Realistic commitments
• Capability: disciplined; earlier successes can be repeated
Level 3: Defined
• A typical process for developing and maintaining software in the organisation is defined
• A Software Engineering Process Group (SEPG) is defined
• Capability: standard and consistent – both software engineering and management are stable and repeatable
Level 4: Managed
• The organization sets quantitative quality goals for both products and processes
• Software products are of high, predictable quality
• Organization-wide metrics database
• Meaningful variations can be distinguished from noise
• Capability: predictable
Level 5: Optimizing
• The whole organization is focused on continuous process improvement
• The organisation has the means to identify process weaknesses and take action
• Cost-benefit analysis is possible
CMMI [P5]
• Contains 25 process areas, e.g. requirements management, quality assurance, etc.
• Each process area contains goals and practices
• Two types of models
  – Staged: grades the overall development process
  – Continuous: grades each process area
[www.sei.cmu.edu/cmmi]
CMMI: Staged vs Continuous
Staged
• Assesses the entire organisation
• Measured by the lowest level
• Static improvement order
• Used in marketing

Continuous
• Assessed per process area
• Fine-grained measures – more detail
• Flexible improvement per process area
• More useful for internal process improvement
Key process areas (Staged)

Level 1: Initial
Level 2: Repeatable
• Project tracking and oversight
• Project planning
• Requirements management
• Software quality assurance
• Configuration management
• Subcontract management
Level 3: Defined
• Organization has a defined process
• Peer reviews
• Training program
• …
Level 4: Managed
• Software quality management
• Quantitative process management
Level 5: Optimizing
• Process change management
• Technology change management
• Defect prevention
Assessments
• Evaluation of processes against a maturity model
• The objectives are to improve and to certify
• Typical activities
  – Archive analysis
  – Interviews, questionnaires, workshops
  – Process modelling
Inductive (bottom-up) SPI
• Quality Improvement Paradigm (QIP) [Basili, 1984]
  http://herkules.oulu.fi/isbn9514265084/html/x287.html
• iFLAP [Pettersson, 2008]
  Pettersson F., Ivarsson M., Gorschek T., "A Practitioner's Guide to Light Weight Software Process Assessment and Improvement Planning", Journal of Systems and Software, vol. 81(6), 2008, pp. 972-995.
• Lean Six Sigma (LSS)
  Michael George and Robert Lawrence Jr., "Lean Six Sigma: Combining Six Sigma with Lean Speed", McGraw-Hill, 2002
• Process modelling, simulation
• Information flow analysis
• Retrospective reflection, aka lessons learnt / project post-mortem [P4 – Sect 3.2]
QIP
• Basili's QIP (1984), a NASA and academic cooperation: improvements based on a thorough understanding of the current process – all processes are different. Solutions are tailored based on actual critical issues. No general "baselining" against a pre-defined set of practices.
• Part of a larger system model: the Experience Factory [Basili]
• Improvement at the organizational level (after project completion) and at the project level (during execution)

iFLAP (BTH)
Elicits issues and improvement suggestions from the organization, then prioritizes and plans together with the organization.
Lean Six Sigma (LSS)
• Focus: elimination of waste + improved performance & quality
• A combination (2002) of
  – Lean (1990s, mostly the Toyota Production System): eliminate waste, quality first, etc.; measures & optimizes process flow
  – Six Sigma (Motorola, 1986): 99.99966% of products statistically expected to be free of defects (6 σ)
• A process for process improvement
• LSS 'belts' – training and certification: Master, Black belt, Green belt, Yellow belt
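The 99.99966% figure corresponds to the normal-distribution yield at 6 σ with the conventional 1.5 σ long-term shift, i.e. about 3.4 defects per million. A quick check in Python:

```python
# Six Sigma yield check: the standard tables assume a 1.5-sigma long-term
# process shift, so "6 sigma" quality is the normal CDF evaluated at 4.5.
from math import erf, sqrt

def yield_at_sigma(sigma_level, shift=1.5):
    """One-sided normal yield at a sigma level, with the 1.5-sigma shift."""
    z = sigma_level - shift
    return 0.5 * (1 + erf(z / sqrt(2)))   # standard normal CDF at z

y = yield_at_sigma(6)
print(f"{y * 100:.5f}% good, {(1 - y) * 1e6:.1f} defects per million")
# 99.99966% good, 3.4 defects per million
```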
Lean Six Sigma
• Focuses on i) elimination of the eight kinds of waste (Lean): defects, overproduction, waiting, non-utilized talent, transportation, inventory, motion, extra-processing; and ii) improved capability of performance
• The Lean Six Sigma concept dates from 2002. Six Sigma [Motorola, 1986] focuses on quality through RCA (root cause analysis), with a focus on processes and the flow/throughput of work
• Utilises the DMAIC phases, as in Six Sigma
• Belt-based training system (from Six Sigma), similar to karate:
  – Master Black belt: trained, with at least 2 years of experience; teaches LSS
  – Black belt: full-time project leader
  – Green belt: focus on tool usage; application of DMAIC and Lean principles
  – Yellow belt: Lean Six Sigma awareness
Retrospective Analysis, or Lessons Learnt / Project Post-mortem
• Consider the past in order to identify problems and improvements – individually, but mostly in groups
• Often applied after project completion
• An important SPI method for self-governing agile teams
  – Sprint (iteration) retrospectives

Collier B, DeMarco T, Fearey P (1996), "A Defined Process for Project Postmortem Review", IEEE Software, vol. 13, issue 4, pp. 65-72.
Derby E, Larsen D (2006), Agile Retrospectives: Making Good Teams Great!, Pragmatic Bookshelf.
Drury M, Conboy K, Power K (2011), "Decision making in agile development: A Focus Group Study of Decisions and Obstacles", Proc. of Agile Conference 2011, pp. 39-47.
Retrospective Analysis (cont)
• Benefits
  – Team learning & improvement
  – Widened perspectives & insight into the bigger picture
• Challenges
  – Taking the time together!
  – Remembering… correctly and uniformly
  – Risk of incorrect conclusions from purely experience-based reasoning
Evidence-Based Timelines
[Example timelines: a large development project and a small research project]
Evidence-Based Timeline Retrospectives (EBTR)
E. Bjarnason, A. Hess, R. Berntsson Svensson, B. Regnell, J. Doerr, "Reflecting on Evidence-Based Timelines", accepted for publication in IEEE Software, June/July 2014. http://fileadmin.cs.lth.se/cs/Education/ETSF01/Material/Bjarnason_EBTR.pdf

[Process diagram: 1) Preparations; 2) Timeline construction by a facilitator and a process responsible, combining goals, aspects and evidence into an evidence-based timeline; 3) Retrospective meeting with the project team; 4) Validation at a follow-up meeting, producing an updated EBT and an EBTR summary]

The evidence-based timeline:
• prompts memory
• provides objective data
• stimulates & focuses discussion
• identifies connections
Benefits of EBTs
• Fuel reflective discussions
• Prompt memory
• Provide objective data
• Stimulate and focus discussions
• Provide an overview
• Facilitate identifying connections
• Identify improvements
CASE PROJECTS: Software Process Improvement
SW Porting (Trad)
• Post-project meeting: lessons learnt, post-mortem
• Lean Six Sigma improvement projects
• Driven by line management

App Dev (Agile)
• Sprint retrospectives
• Driven by the team
Quality Management & SPI: Summary
• Definition(s) of quality, incl. ISO standards
• Factors affecting quality
• Total Quality Management
• Prescriptive SPI: CMMI, SPICE
• Inductive SPI: Lean Six Sigma, Retrospectives

Recommended exercises: Ex 13.4-5
Project Completion
Successful? Delivered scope & quality
• on time
• within budget

Gains
• A quality product that meets customer needs
• New lessons learnt (for next time!)
  – Experience
  – Process improvements
[Diagram: scope + quality vs time (t) and cost (£)]