Small Projects, Big Issues
CrossTalk: The Journal of Defense Software Engineering, February 2008



Departments

From the Publisher

Coming Events

Call for Articles

Web Sites

SSTC 2008

CrossTalk Feedback

BackTalk

Small Projects, Big Issues

Navigating the Enterprise Forest
Smullin shows that getting integrated into the DoD enterprise and navigating the approval process is a challenge, but with the proper preparation, both are much easier to navigate successfully.
by Fred Smullin

Development Practices for Small Software Applications
Jones describes the origins, benefits, and differences of process improvement through the use of CMMI and Agile methods, as well as discusses the benefits of hybrid process improvement models.
by Capers Jones

Is CMMI Useful and Usable in Small Settings? One Example
This article presents the motivation, the processes used, and the major results of the CMMI for a Small Business pilot from the perspective of the team that worked on the pilot.
by Sandra Cepeda, Suzanne Garcia, and Jacquelyn Langhout

Why Do I Need All That Process? I’m Only a Small Project
This article shows how to fix the issue of having a variety of project sizes but a single set of processes originally built for larger projects.
by Mark Brodnik, Robyn Plouse, and Terry Leip

Small Project Survival Among the CMMI Level 5 Big Processes
Jost examines how his organization successfully learned to tailor its CMMI Level 5 processes for small projects.
by Alan C. Jost

Field Guide to Provide Step-by-Step Examples for Improving Processes in Small Settings
This article invites contributors to help develop a field guide for process improvement on small projects with how-to guidance, examples, templates, checklists, and other information.
by Caroline Graettinger, Suzanne Garcia, Christian Carmody, M. Lynn Penn, and William Peterson

Open Forum

ON THE COVER

Cover Design by Kent Bingham
Additional art services provided by Janna Jensen

CrossTalk

CO-SPONSORS:
DOD-CIO: The Honorable John Grimes
OSD (AT&L): Kristen Baldwin
NAVAIR: Jeff Schwalb
76 SMXG: Kevin Stamey
309 SMXG: Norman LeClair
DHS: Joe Jarzombek

STAFF:
MANAGING DIRECTOR: Brent Baxter
PUBLISHER: Elizabeth Starrett
MANAGING EDITOR: Kase Johnstun
ASSOCIATE EDITOR: Chelene Fortier-Lozancich
ARTICLE COORDINATOR: Nicole Kentta
PHONE: (801)
E-MAIL: [email protected]
CROSSTALK ONLINE: www.stsc.hill.af.mil/crosstalk

CrossTalk, The Journal of Defense Software Engineering, is co-sponsored by the Department of Defense Chief Information Office (DoD-CIO); the Office of the Secretary of Defense (OSD) Acquisition, Technology and Logistics (AT&L); U.S. Navy (USN); U.S. Air Force (USAF); and the U.S. Department of Homeland Security (DHS). DoD-CIO co-sponsor: Assistant Secretary of Defense (Networks and Information Integration). OSD (AT&L) co-sponsor: Software Engineering and System Assurance. USN co-sponsor: Naval Air Systems Command. USAF co-sponsors: Oklahoma City-Air Logistics Center (ALC) 76 Software Maintenance Group (SMXG) and Ogden-ALC 309 SMXG. DHS co-sponsor: National Cyber Security Division of the Office of Infrastructure Protection.

The USAF Software Technology Support Center (STSC) is the publisher of CrossTalk, providing both editorial oversight and technical review of the journal. CrossTalk’s mission is to encourage the engineering development of software to improve the reliability, sustainability, and responsiveness of our warfighting capability.

Subscriptions: Send correspondence concerning subscriptions and changes of address to the following address. You may e-mail us or use the form on p. 28.

517 SMXS/MXDEA
6022 Fir Ave
Bldg 1238
Hill AFB, UT 84056-5820

Article Submissions: We welcome articles of interest to the defense software community. Articles must be approved by the CrossTalk editorial board prior to publication. Please follow the Author Guidelines, available at <www.stsc.hill.af.mil/crosstalk/xtlkguid.pdf>. CrossTalk does not pay for submissions. Articles published in CrossTalk remain the property of the authors and may be submitted to other publications.

Reprints: Permission to reprint or post articles must be requested from the author or the copyright holder and coordinated with CrossTalk.

Trademarks and Endorsements: This Department of Defense (DoD) journal is an authorized publication for members of the DoD. Contents of CrossTalk are not necessarily the official views of, or endorsed by, the U.S. government, the DoD, the co-sponsors, or the STSC. All product names referenced in this issue are trademarks of their companies.

CrossTalk Online Services: See <www.stsc.hill.af.mil/crosstalk>, call (801) 777-0857, or e-mail <[email protected]>.

Back Issues Available: Please phone or e-mail us tosee if back issues are available free of charge.


From the Publisher

Good Things Come in Small Packages

“Good things come in small packages.” Most people have probably heard this saying from the time they were kids. Small things just don’t seem to get the respect they deserve – including small projects. Most of the articles in this month’s CrossTalk provide insightful advice for adapting large-project processes for small projects. But what about the other side of the story? Small projects have much to offer. Maybe larger projects should consider what they can adapt from small projects. Three areas where large projects may benefit from the lessons learned by smaller projects are Agile techniques, bottom-up estimation, and communication improvements.

There have been efforts to scale Agile practices to large projects. One such effort involves breaking large projects into smaller projects. This might work for the small pieces, but what about when the pieces need to be brought back together? A consistent approach is still needed to pull them together. However, portions of Agile programming can still work on large projects. Pair programming is one such idea: pairing software developers has been shown to decrease errors and, contrary to conventional logic, has also been shown to increase productivity. The cyclical development of Agile programming also scales to large projects and is discussed in depth in the variations of the Spiral Model now espoused in the software community. Barry Boehm and his co-authors discuss this thoroughly in back issues of CrossTalk.

Bottom-up software estimation is another technique used in many small projects that may apply to large projects. While it usually is not practical to use bottom-up estimates at the beginning of a large project, bottom-up software estimation is certainly applicable as a sanity check of the estimates as the project progresses, as work is divided into smaller pieces, and as it is assigned to smaller groups and individuals.
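As a concrete illustration of that sanity check (the work packages, hours, and tolerance below are hypothetical, not drawn from any CrossTalk project), the roll-up can be as simple as:

    # Hypothetical work-package estimates gathered from the teams (hours).
    work_packages = {
        "requirements_update": 120,
        "ui_module": 340,
        "database_changes": 260,
        "integration_testing": 180,
    }

    def bottom_up_total(packages):
        """Sum the individual estimates supplied by the people doing the work."""
        return sum(packages.values())

    def sanity_check(top_down_hours, packages, tolerance=0.20):
        """Flag the plan when the rolled-up estimate drifts beyond the tolerance."""
        bottom_up = bottom_up_total(packages)
        drift = abs(bottom_up - top_down_hours) / top_down_hours
        return bottom_up, drift, drift <= tolerance

    total, drift, ok = sanity_check(top_down_hours=800, packages=work_packages)
    print(f"bottom-up total = {total} h, drift = {drift:.0%}, within tolerance: {ok}")

The value of the check is the comparison itself: when the rolled-up figure drifts far from the original top-down estimate, that is the signal to revisit the plan.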

And, of course, there is communication. A 2002 CrossTalk article discusses new engineers in large companies sitting in their cubicles and accomplishing little because they don’t want to make a bother of themselves by asking a lot of questions. Communication tends to be easier on small projects, and someone just sitting will certainly be noticed more readily. Providing tools for communication such as white boards and open space can be implemented in large companies, and the resulting camaraderie has also been shown to improve productivity.

We start this month’s issue with Fred Smullin’s story of one small project finding its way in the big Department of Defense world in Navigating the Enterprise Forest. In Development Practices for Small Software Applications, Capers Jones shares his insights, comparing the benefits and drawbacks of the Capability Maturity Model Integration (CMMI) with Agile development for small projects. We next provide a series of articles with expertise on adapting large processes for small projects, including Is CMMI Useful and Usable in Small Settings? One Example by Sandra Cepeda, Suzanne Garcia, and Jacquelyn Langhout; Why Do I Need All That Process? I’m Only a Small Project by Mark Brodnik, Robyn Plouse, and Terry Leip; Small Project Survival Among the CMMI Level 5 Big Processes by Alan C. Jost; and Field Guide to Provide Step-by-Step Examples for Improving Processes in Small Settings by Caroline Graettinger, Suzanne Garcia, Christian Carmody, M. Lynn Penn, and William Peterson.

On a side note, CrossTalk will be celebrating the 20th anniversary of our first issue in August. We would like to celebrate this milestone with stories of how CrossTalk has helped our readers save time and money, improve processes, salvage projects, and make your jobs easier. We received several success stories with our survey in 2004 – those responses resulted in the continuation of this journal. Your responses now will enable us to thank our co-sponsors for their continued support and strengthen our position as we seek additional co-sponsors.

Large projects tend to fail more frequently than small projects for a variety of reasons. While small projects are figuring out how to benefit from the practices of large projects, large projects can also learn lessons from small projects. Meanwhile, CrossTalk will continue to offer strategies for both.

Elizabeth Starrett
Publisher


Small Projects, Big Issues

Navigating the Enterprise Forest

Fred Smullin, Integratable Technologies, LLC

What happens when you take a local depot solution and promote it for worldwide use? You suddenly find yourself facing the challenge of integrating into the Department of Defense (DoD) enterprise with few maps to guide you. I will share my lessons learned as the former chief software architect of the G200 Defense Repair Information Logistics System (DRILS), which started as a local solution and matured into an Air Force solution used in both the .com and .mil domains.

DRILS began as a local maintenance data collection (MDC) system in 2000 to capture the serialized repair of assets. It was commissioned by F-16 supply chain managers (SCMs) to connect them in real time to depot avionics repair activities at Hill Air Force Base. The objective was to collect and analyze vital repair shop information in order to increase the reliability and availability of F-16 avionics as well as decrease repair costs [1]. Most importantly, DRILS facilitated documentation of the individual chips, resistors, and other small parts being replaced within the aircraft avionics components. Other Air Force MDC systems were not able to provide this level of detail, which turned out to be the most important data to SCMs, as these parts were the ones that actually failed within the avionics components. The information analyzed from DRILS by the F-16 SCMs under their Falcon Flex program [2] has enabled $133 million in cost avoidance since 2000 and is projected to achieve approximately $678 million through the aircraft’s end of life [3].

I will admit that the DRILS stakeholders were naïve to the DoD approval requirements for an Information Technology (IT) system. It would take us more than a year after fielding our initial prototype to navigate our way and be recognized as a legitimate IT system.

The DRILS story is not unlike those of other locally grown data systems in the DoD. An information gap existed within the current mix of maintenance information systems, and the depot organization took steps to plug that gap, which eventually gave birth to DRILS. You do not have to look far to find information gaps in the DoD. We live in a data-rich environment, yet we are information poor; someone somewhere does not have access to the information they need. This basic hunger for dependable information, as well as the lead time and funding required to modify legacy systems, leads to creation of local stovepipe solutions funded and built by the user in that domain.

Tight budgets are impacting the ability to implement local solutions. A 2004 U.S. General Accounting Office (GAO) report found that the DoD requested approximately $19 billion for fiscal year 2004 to operate, maintain, and modernize 2,274 business systems. The report also identified that uncontrolled DoD spending resulted in stovepiped and duplicative systems that included more than 200 inventory and 450 personnel systems being procured and sustained. Very often these stovepipe solutions get thrown over the wall to DoD IT organizations to integrate and sustain within the enterprise [4]. The costs of the integration and sustainment efforts result in priorities getting shifted within existing budgets to accommodate these unplanned requirements.

The National Defense Authorization Act (NDAA) of 2005 was crafted to specifically address this issue [5]. It requires the DoD comptroller to determine that system improvements exceeding $1 million meet the criteria specified in the act. DoD portfolio management (PfM) initiatives have been introduced to comply with the NDAA requirement as well as limit the flow of local solutions that may be stovepiped or duplicative. However, the door is not completely closed to approval of local systems. You can get your local solution approved to operate if you know how to navigate your way through the forest. Based on my experience, if you take the time to address the following questions, you will increase your likelihood of surviving in the enterprise.

What Gaps Are You Filling?

Identify what gaps you are filling in the information food chain. Is there a reason why no one else is providing the information? This is your critical foundation upon which everything else is built. If you are merely churning out the same information that other systems are producing, you will not get far. Focus on those information gaps that prompted you to build the system.

Take those gaps and then describe what is in it for the user community, especially the person doing data entry. Are they getting more out of it than what they put in? If it is not useful to them, you are not going to get your dependable data. Make it worthwhile for them, and you will never be short of dependable data.

Any requirement, given enough time and money, can be integrated into any legacy system. One of the most common reasons a requirement is not integrated is that few organizations have the time and/or money legacy systems request to implement a new requirement. The user community finds it cheaper and faster in the near term to implement their requirements to fill the information gap, and hope to interface it with a legacy system down the road. Unfortunately, this is counterproductive as it just creates more stovepipes.

Try to identify if legacy systems have plans to plug this information gap and, if so, when. If there are plans but they are years out on the horizon, you may get interim authority to operate your gap-filling solution that could evolve into a long-term solution if the implementation is done well.

Develop your own comparison matrix of legacy systems that perform similar functions or may potentially interface to your system. Try to be impartial in your evaluation in order to maximize its credibility. Contact the legacy systems and educate them about what you are doing. Approach them with a partnership offer that assists them with improving the quality of data and information in their system. Document which systems you would interface with if you could and why. Estimate interface costs and return on investment where possible. Interfaces to legacy systems usually provide returns on investment in the areas of duplicate data entry reduction, minimization of data entry errors, improved data dependability, and near real-time data updates across the enterprise. Keep in mind that there are two costs to any interface: yours and the legacy system’s. Your user community may have to pay for both.
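A comparison matrix does not need to be elaborate; keeping it as structured data makes the return-on-investment reasoning above easy to revisit as costs change. The sketch below is purely illustrative (the systems, costs, and savings are made up, not taken from the article):

    from dataclasses import dataclass

    @dataclass
    class InterfaceCandidate:
        """One row of a legacy-system comparison matrix (all figures hypothetical)."""
        system: str
        overlapping_functions: list
        interface_cost: float      # covers both sides: yours and the legacy system's
        annual_savings: float      # duplicate-entry reduction, fewer errors, faster updates
        planned_gap_fix: str       # when, if ever, the legacy system plans to fill the gap

        def payback_years(self):
            return self.interface_cost / self.annual_savings if self.annual_savings else float("inf")

    matrix = [
        InterfaceCandidate("Legacy System A", ["repair history"], 250_000, 400_000, "none planned"),
        InterfaceCandidate("Legacy System B", ["asset master data"], 600_000, 150_000, "three years out"),
    ]

    for row in sorted(matrix, key=InterfaceCandidate.payback_years):
        print(f"{row.system}: payback in {row.payback_years():.1f} years; legacy fix: {row.planned_gap_fix}")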

In the case of DRILS, we saw that it could potentially interface with the Reliability and Maintainability Information System (REMIS) and the Core Automated Maintenance System (CAMS). Users were frustrated with data entry processes and business rules they perceived to be cumbersome as well as with challenges they encountered trying to analyze historical data. We focused our efforts on streamlining the data entry and data analysis processes for our users. On the depot shop floor, we were able to cut data entry time by 80 percent on average compared to the legacy system. The result was that the volume of depot maintenance data actually increased from the shop floor when compared to the legacy data system. Interfacing with the legacy systems turned out to be the greatest approval challenge. It took nearly six years of meetings, briefings, and requests for funding before the official system interface was allowed to be built with REMIS. A CAMS interface is still actively being pursued at this date.

One lesson learned is that headquarters is generally willing to entertain temporary solutions to initially fill information gaps. Locally developed temporary solutions are usually more agile and less costly than legacy systems to experiment with. Thus, temporary solutions are excellent proving grounds for defining requirements to be incorporated into the legacy system in the long term.

This proved true for DRILS: Our user community encompassed a relatively large portion of the F-16 avionics community but was still small when compared to Air Force-wide MDC legacy systems. Our development and support teams were also much smaller, which shortened our decision-making time. The architecture design allowed us to isolate experimental modules from all user communities except those participating. Thus, our cost to implement requested changes for experimental initiatives was significantly smaller and our time to value was also significantly shorter. This made DRILS an ideal system to support MDC experiments such as Air Force Serial Number Tracking initiatives sponsored by high-level champions.

Who Are Your Champions?

A key to successfully implementing any IT system in the DoD is to identify champions within the customer and user base. Champions identified within this community can help advocate the system at the various levels of review and approval. These champions need to be evangelistic because their support will be tested up the chain of approval. They will need to be able to communicate their need and why your solution is the best.

For example, in the case of DRILS, we were fortunate that the product concept and implementation sold itself. Several levels of champions sprang up during the product implementation. The SCMs wanted the repair data from the shop floor and convinced the repair shop supervision to give it a try. Supervision asked their data entry technicians to try the system. They did with some reluctance, but once they started entering data, they became believers.

Technicians in depot repair shops are graded on their production, and the overall organization is graded on its ability to produce quality assets on schedule and on budget. Any impediment to those goals can impact their customers and cost them workload in their competitive environment. Legacy MDC systems used until then had been deemed cumbersome by the technicians using them and did not provide any perceivable value in meeting quality, budget, and schedule goals. The DRILS development team lived on the shop floor for nearly four years working hand in hand with the using technicians to refine how the data was collected and displayed. This enabled the system to provide immediate payback to the person entering the data.

This focus on the technician at the point of maintenance enabled them to proactively identify issues that most likely would not have been detected with the legacy systems. A real-world example involved the F-16 multi-function displays, whose newly manufactured replacement power supplies, costing $5,000 each, were failing within five months of installation. The DRILS design enabled the technician to easily notice the trend, stop installing those parts, and alert the SCM, who triggered an investigation with the manufacturer. That investigation eventually led to the identification of a defect in the manufacturing process. Without DRILS, the trend may have gone on for many more months and possibly grounded aircraft due to failed parts clogging the supply chain and consuming financial resources.

Stories such as these enabled long-standing issues to start getting fixed. These success stories gained the attention of the warfighter customer, who then wanted the system adapted for their use. This sold the supervisor, who in turn sold their Colonel, who in turn sold his Brigadier General. The Brigadier General then raised awareness all the way to the Office of the Secretary of Defense. Word started to spread between the weapon systems and Major Commands when warfighters who had used the system moved from unit to unit. It was not long before we had obtained several levels of champions. But our most critical level continued to be the data entry person at the point of maintenance.

How Do You Align With the Mission?

Another key to winning acceptance in the DoD enterprise is to identify how you align with the overall IT mission of your agency and the DoD in general. Obtain copies of headquarters briefings in your domain and examine their road map and the issues they are trying to solve. How do you fit within that road map? If you can show how you fit within that road map, you can gain critical awareness and possibly acceptance at the headquarters level. Part of gaining acceptance is educating them about how you fit with current legacy systems.

Compliance With Standards

The IT industry continues to evolve toward a net-centric world where standards-based computing is pushing out proprietary products in order to facilitate easier integration in heterogeneous environments. The DoD continues to gravitate to these standards, although slower than industry, for the same reasons. It is important that you inventory applicable technology standards in your application domain as well as those of the mission you are supporting and remain consistent with those established standards.

DRILS is a Web-based Air Force maintenance data collection system; well-published standards for maintenance data existed in Technical Order 00-20-2 and other publications [6]. We had to remain consistent with those standards at a minimum in order to be able to feed data to the legacy MDC systems and allow a much broader Air Force audience to analyze the data. Technology-wise, we intentionally selected well-known commercial off-the-shelf products and kept those products to a minimum in order to avoid integration headaches while remaining consistent with the Global Combat Support System – Air Force requirements.

Can You Participate in a Pathfinder Initiative?

Pathfinder initiatives are very beneficial. Merriam-Webster’s online dictionary defines pathfinder as, “one that discovers a way; especially: one that explores untraversed regions to mark out a new route” [7]. My experience with pathfinder initiatives has involved a charter between a particular community of interest and the headquarters to solve a process or information gap. These pathfinders involve assembling members of the community of interest to review and improve processes and policies. In order to establish a baseline and measure the effect of change, the pathfinder members must collect data. Thus, appropriate data systems are selected as tools to provide the data.

For example, the Air Force decided to initiate a Reliability Pathfinder to study and define the benefits of Item Unique Identification and Automated Identification Technology (AIT) in regards to facilitating serial number tracking within the maintenance processes. The pathfinder team members analyzed the available data systems and chose to use DRILS as the tool with which to collect their data. They performed their analysis on selected B-52 avionics maintenance occurring on the flight line and at the depot. DRILS was used basically as is, with a few minor software modifications to facilitate specific data collection and analysis. The result is that the Air Force Reliability Pathfinder has proven very successful. Reports are currently being prepared that have the potential to positively impact the future of serialized asset tracking.

What I learned while participating in three Air Force and two joint Air Force and Army service pathfinder initiatives is that they are useful for unifying a vision. Headquarters depends on field users to define requirements for them. Users want headquarters to make decisions and investments that will improve their work environment, but often do not know how to effectively communicate requirements. I saw disconnects occurring on both sides.

A disconnect may occur in the understanding of the big picture at the user level, while the headquarters may not completely understand the detailed needs of the user. A pathfinder initiative provides an excellent forum for these two groups to collaborate in a closed environment, reach a common understanding, solve longstanding issues, and communicate those solutions to all parties.

To participate in a pathfinder, you need to apply your gap assessment, the backing and breadth of your champions, and your legacy system comparisons to make your case to headquarters of how you can help with a pathfinder effort. If you can team with an existing legacy system to solve the pathfinder needs, then you have significantly increased your odds of being approved.

Pathfinder efforts are as resource-challenged as any other program. Therefore, financial resources to support your efforts will be very limited. However, the exposure and lessons learned from a successful pathfinder effort are significant. Pathfinder progress reports are reviewed at the highest management levels. A successful pathfinder effort will often lead to other pathfinders that increase the exposure of your system as well as your acceptance within the DoD IT community.

Do You Have Portal Capability?

How many user names and passwords do you currently maintain? Do you think users will be willing to add your system to the list as well? I am personally aware of an office that did a Lean study and found they lost 1.5 hours of productivity per day logging in and out of 22 data systems to do their job. This frustrated the workers and decreased their overall job satisfaction.

You can increase your probability of user acceptance by checking to see if there is a portal such as the Air Force Portal or Army Knowledge Online (AKO) that you can integrate with to provide streamlined sign-on capability. Check with your portal for specific requirements. Interfacing with a portal will also demonstrate your ability to integrate within the enterprise, and decision makers will often sway your way when compared to a non-integrated system.

In the case of DRILS, we were able to integrate the application with the Air Force Portal, which streamlined sign-on for many of our DoD users. It also allowed us to extend use of the application to selected F-16 DoD repair contractors in the .com world, which facilitated increased visibility of F-16 avionics repairs worldwide.

The DRILS Air Force Portal integration went fairly smoothly, with only relatively minor edits to our authentication process. We did encounter policy challenges that we felt had to be overcome. We had several hundred users who depended on the application for depot production. If the portal went offline, we risked not collecting valuable data, as production would continue but data capture may not catch up. Thus, we still wanted our users to be able to access the system. The Air Force Portal policy was that our application authentication must be restricted to portal users and deny direct access and login via our non-portal Web address. It took a few e-mails and conference calls, as well as a formal waiver request, to gain approval for a hybrid security model that would allow both Air Force Portal and manual authentication. This allowed us to ensure maximum availability to at least our .mil users. Those in the .com world would have to remain dependent on the Air Force Portal availability.
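The hybrid security model described above is a policy outcome rather than a published interface, but the control flow is easy to sketch. The following is a minimal illustration only, using hypothetical stand-ins for the portal validator and the local credential store; it is not the actual Air Force Portal API.

    class PortalClient:
        """Stand-in for a portal single sign-on validator (hypothetical)."""
        def validate(self, token):
            return "jdoe" if token == "valid-token" else None

    class LocalUserStore:
        """Stand-in for the application's own credential store (hypothetical)."""
        def __init__(self, users):
            self._users = users
        def verify(self, username, password):
            return self._users.get(username) == password

    def authenticate(request, portal, local_users):
        """Prefer portal single sign-on; fall back to local login when the portal
        is unavailable, mirroring the hybrid model described above."""
        token = request.get("portal_token")
        if token:
            try:
                user = portal.validate(token)
                if user:
                    return {"user": user, "method": "portal_sso"}
            except ConnectionError:
                pass  # portal offline: fall through to the local login path
        creds = request.get("credentials")
        if creds and local_users.verify(creds["username"], creds["password"]):
            return {"user": creds["username"], "method": "local_login"}
        return None  # deny access

    store = LocalUserStore({"jdoe": "s3cret"})
    print(authenticate({"portal_token": "valid-token"}, PortalClient(), store))
    print(authenticate({"credentials": {"username": "jdoe", "password": "s3cret"}}, PortalClient(), store))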

Have You Done Your Paperwork?

I dislike doing paperwork as much as the next person. Unfortunately, paperwork is just part of the territory when it comes to building and fielding a DoD IT system. Recent NDAA legislation leaves you with little choice. You risk incurring stiff financial and judicial penalties if you do not complete your paperwork.

The following questions address the two main documentation areas that should be common across the DoD. Each agency may impose additional requirements. You will need to check with your respective agency for details.

Are You Registered With PfM?

PfM is your required first approval stop for any local or global data system. PfM is the management of selected groupings of investments using integrated strategic planning, integrated architectures, performance measures, risk-management techniques, transition plans, and portfolio investment strategies. The PfM process is driven by a number of legislative acts and DoD directives.

At the root of PfM is the Clinger-Cohen Act of 1996, which requires agencies to use a capital planning and investment control process to provide for selection, management, and evaluation of IT investments [8]. Consequently, the DoD published Directive 8115.01, Information Technology Portfolio Management, to establish policy and assign responsibility for the management of DoD IT investments as portfolios that focus on improving DoD capabilities and mission outcomes [9]. DoD Directive 8000.1, Management of DoD Information Resources and Information Technology, establishes the requirement for a Chief Information Officer (CIO) role in the agencies to manage these portfolios. The CIOs designate portfolio managers to manage their portfolios [10]. Portfolio managers interact with DoD IT system program managers to report the status of their programs.

The DoD Enterprise Information Technology Portfolio Repository (DITPR) is one system used to track portfolios. DITPR was selected by the DoD CIO as the enterprise shared space for IT PfM data for all DoD business IT systems. However, each branch has its own methods of IT registry that feed DITPR. The Air Force uses the Enterprise Information Technology Data Repository (EITDR), the Navy and Marines use the DITPR-DON (Department of the Navy) system, and the Army uses the Army Portfolio Management System (APMS) as their registry. All of these systems are used to record investment review and certification submission information, Federal Information Security Management Act (FISMA) of 2002 assessments, and more [11]. The IT Lean acquisition process and security, interoperability, supportability, sustainability, and usability processes are integrated into these systems as well.

These systems are necessary in order to provide portfolio managers access to information needed to do the following: maximize value of IT investments while minimizing risk, improve communication and alignment between IT and DoD leaders, facilitate team thinking versus individual commands or units, enable more efficient use of assets, reduce the number of redundant projects and eliminate non-value-added projects, and support an enterprise IT investment approach.

Portfolio managers depend on IT program managers to keep the portfolio data current for their respective systems in order to fulfill their goals. Participation in PfM is not an option. There are serious consequences for not complying with all of the PfM requirements.

If you are building a new system or expanding the capability of an existing system, you must submit a capability request to your respective portfolio manager to get authorization. This is where you once again tap into your foundational data that describes the gaps you are filling. You may need to arrange a meeting with your portfolio manager to describe why you need the capability and that a similar capability does not exist. The portfolio manager has to weigh a lot of criteria when making a decision to authorize your request and may request additional data to reach their decision. You may even be invited to a fly-off before a board that is evaluating similar systems within the portfolio.

Do You Meet the Information Assurance Requirements?

One of the first things that a portfolio manager will evaluate is whether you comply with mandatory information assurance (IA) requirements for your system. IA is more important today than it ever has been; information warfare attacks are a reality. FISMA requires each federal agency to develop, document, and implement an agency-wide program to provide information security for the information and information systems that support the operations and assets of the agency [11].

In order to comply with FISMA requirements, the DoD has created the Defense Information Assurance Certification and Accreditation Process (DIACAP), which replaced the Defense Information Technology Security Certification and Accreditation Process. DIACAP assigns, implements, and validates DoDI 8500.2 standardized IA controls and manages IA posture across DoD information systems consistent with FISMA legislative policy as well as DoD regulatory policy found in the 8500 series of directives [12].

You need to ensure that your system remains compliant with the IA requirements identified in the DoD IA 8500 series of directives. If you do not, then you will not be authorized to operate on the DoD network or interface to legacy systems. DoDI 8500.2 assigns IA controls to three Mission Assurance Categories (MAC) and three data sensitivity levels. You will need to evaluate your system and select one MAC and one data sensitivity level appropriate to your system and mission that will determine what your IA control requirements are.
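To make that selection step concrete, a project can record the pair it chose as simple enumerations and derive a label for the baseline it must satisfy. This is an illustrative sketch only; the actual IA control sets come from DoDI 8500.2 itself, and the category descriptions below are paraphrased.

    from enum import Enum

    class MissionAssuranceCategory(Enum):
        # Three MAC levels defined by DoDI 8500.2 (descriptions paraphrased).
        MAC_I = "vital to deployed or contingency forces"
        MAC_II = "important to the support of deployed or contingency forces"
        MAC_III = "necessary for day-to-day business"

    class SensitivityLevel(Enum):
        # Three data sensitivity (confidentiality) levels used by DoDI 8500.2.
        CLASSIFIED = "classified"
        SENSITIVE = "sensitive"
        PUBLIC = "public"

    def ia_baseline(mac, sensitivity):
        """Label the IA control baseline implied by the two selections; the
        directive attaches a specific set of controls to each combination."""
        return f"{mac.name} / {sensitivity.name}"

    # Example: a routine business system handling sensitive but unclassified data.
    print(ia_baseline(MissionAssuranceCategory.MAC_III, SensitivityLevel.SENSITIVE))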

Once you have shown that you comply with the IA control requirements, you must submit a Certification and Accreditation (C&A) package to your Designated Approval Authority for approval using the DIACAP workflow. When your package is approved, an Authority to Operate will be issued that is valid for three years from the date it is issued. DIACAP also requires annual security reviews of the C&A package, and those reviews are reported to the portfolio manager through the appropriate portfolio registry such as EITDR, DITPR-DON, or APMS.

Complying with mandatory IA requirements is just one piece of the security puzzle. You need to also ensure the system is programmed defensively using secure coding techniques to ensure that your system and its information are not compromised. Web application security is considered a weak point in an IT security wall and subject to information warfare attack.

The Defense Information Systems Agency has published a Security Technical Implementation Guide titled “Application Security and Development Security.” This can be downloaded at <http://iase.disa.mil>.

Conclusion

It is possible to grow a local product into an enterprise system with today’s increased PfM and IA requirements. We started DRILS in July of 2000, delivered our rapid prototype in September of 2000, and used evolutionary development from that point forward. During my six years as chief architect, I saw a lot of transformation in how IT systems are certified and supported. I am glad to say that it is becoming less of a paperwork drill now. However, there are still a lot of steps to be checked off. You still have to do your homework and some paperwork to lay your foundation in order to educate your user community, champions, and portfolio managers on why your system should exist.

Align yourself wherever possible with the goals, objectives, and standards of your agency and the DoD in general. Pathfinders are an excellent avenue to prove your alignment, increase your visibility, and gain acceptance at the headquarters level. You can further demonstrate your capability to integrate in the enterprise by facilitating streamlined sign-on to your application through a portal such as the Air Force Portal or AKO.

Getting integrated into the DoD enterprise is not only a technology challenge but also a challenge of navigating the approval process. However, with the proper preparation, the approval process will be much easier to navigate successfully.

References

1. Lindsey, Capt. Greg, and Kevin Berk. “Serialized Maintenance Data Collection Using DRILS.” CrossTalk Oct. 2003 <www.stsc.hill.af.mil/CrossTalk/2003/10/0310lindsey.pdf>.
2. Berk, Kevin. “Falcon Flex: Turning Maintenance Information into Air Power.” Defense AT&L July-Aug. 2007 <www.dau.mil/pubs/dam/2007_07_08/lebr_ja07.pdf>.
3. Total Quality Systems. “UID/Falcon Flex Transforming Sustainment.” Proc. of the U.S. Air Force UID/AIT Conference 2007 <www.dla.mil/j-6/AIT/Conferences/USAF_UID-AIT_Conference/default.aspx>.
4. U.S. GAO. DoD Business Systems Modernization – Billions Continue to Be Invested With Inadequate Management Oversight and Accountability. GAO-04-615. Washington: GAO, 2004.
5. Congress of the United States of America. Ronald W. Reagan National Defense Authorization Act for Fiscal Year 2005. 108th Congress, 2004 <http://thomas.loc.gov/cgi-bin/query/z?c108:H.R.4200.enr:>.
6. U.S. Air Force. “Technical Order 00-20-2, Maintenance Data Documentation.” Apr. 2007.
7. Merriam-Webster <www.merriam-webster.com/dictionary/pathfinder>.
8. Congress of the United States of America. Clinger-Cohen Act of 1996. 104th Congress, 1996 <www.cio.gov/Documents/it_management_reform_act_Feb_1996.html>.
9. DoD. “DoDD 8115.01, Information Technology Portfolio Management.” Oct. 2005 <www.dtic.mil/whs/directives/corres/pdf/811501p.pdf>.
10. DoD. “DoDD 8000.1, Management of DoD Information Resources and Information Technology.” Feb. 2002 <www.js.pentagon.mil/whs/directives/corres/pdf/800001p.pdf>.
11. United States. “Federal Information Security Management Act (FISMA).” 2002 <www.whitehouse.gov/omb/egov/g-4-act.html>.
12. DoD. “DoDD 8500.2, Information Assurance Implementation.” Feb. 2003 <www.dtic.mil/whs/directives/corres/pdf/850002p.pdf>.

About the Author

Fred Smullin is founder and president of Integratable Technologies, LLC. He has been involved in developing software for DoD customers over the past 18 years, as well as consulting internationally on commercial software projects. Smullin served as the Chief Software Architect of the G200 DRILS from 2000 to 2006 before launching Integratable Technologies, LLC. His passion is researching how to make enterprise integration easier.

Integratable Technologies, LLC
1436 Legend Hills DR, STE 105
Clearfield, UT 84015
Phone: (801) 779-1035
Fax: (801) 779-1057
E-mail: fsmullin@integratabletech.com

COMING EVENTS

March 3-7: SD West 2008 Software Development Conference and Expo West, Santa Clara, CA (http://sdexpo.com)

March 4-5: Warfighter’s Vision 2008, Tampa, FL (www.afei.org)

March 11-12: 2008 Military and Aerospace Electronics Forum, San Diego, CA (http://mtc08.events.pennnet.com/fl//index.cfm)

March 16-18: 2008 Engineering Research Council Summit, Workshop and Forum, Arlington, VA (www.asee.org/conferences/erc/2008/index.cfm)

March 17-20: 2008 SEPG, Tampa, FL (www.sei.cmu.edu/sepg)

March 18-20: Sea-Air-Space 2008, Washington, D.C. (www.sasexpo.org/2008)

March 24-28: International Testing Certification Super Week, Chicago, IL (www.testinginstitute.com)

April 29-May 2: 2008 Systems and Software Technology Conference, Las Vegas, NV (www.sstc-online.org)

COMING EVENTS: Please submit coming events that are of interest to our readers at least 90 days before registration. E-mail announcements to: [email protected].


Development Practices for Small Software Applications©

Capers Jones, Software Productivity Research, LLC

Because large software projects are troublesome and often get out of control, a number of effective development methods have been created to help reduce the problems of large software projects. Two of these methods include the well-known Capability Maturity Model® (CMM®) and the newer CMM IntegrationSM (CMMI®) of the Software Engineering Institute (SEI). When these methods are applied to small projects, it is usually necessary to customize them because, in their original form, they are somewhat cumbersome for small projects, although proven effective for large applications. More recently, alternate methods derived from smaller software projects have become popular under the general name of Agile software development. Since the Agile methods originated for fairly small projects, they can be applied without customization and often return very positive results. It is possible to merge Agile concepts with CMM and CMMI concepts, but projects cited in this article did not do so because such mergers are comparatively rare.

© 2007 by Capers Jones. All rights reserved.
® Capability Maturity Model, CMM, and CMMI are registered in the U.S. Patent and Trademark Office by Carnegie Mellon University.
SM CMM Integration, Team Software Process, Personal Software Process, TSP, and PSP are service marks of Carnegie Mellon University.

Software applications come in a wide range of sizes and types. Each size and type tends to have evolved typical development practices. For example, military projects are usually much more formal and perform many more oversight and control activities than civilian projects. Systems software tends to have more kinds of testing and quality control activities than information systems.

When application sizes are considered, there are very significant differences in the development processes that are most commonly used for large software projects compared to small software projects.

Before discussing size differences, it is best to start by defining what is meant by the terms large and small. Although there is no agreed-to definition for these terms, the author generally regards large applications as being those whose size exceeds 10,000 function points or about 1,000,000 source code statements. The author regards small applications as those whose size is at or below 1,000 function points or about 100,000 source code statements.

It is a significant observation that out of 16 software lawsuits where the author served as an expert witness, 15 of them were for applications in the 10,000 function point size range, or larger. The smallest application noted during litigation was 3,000 function points.

Usually software applications of 1,000 function points or below are fairly trouble-free and, therefore, seldom end up in litigation for outright failure, quality problems, cost overruns, or schedule slips. On the other hand, an alarming percentage of applications larger than 10,000 function points exhibit serious quality problems, outright failure, or massive overruns.

Some readers might think that 1,000 function points or 100,000 source code statements are still rather large. However, there are very few commercial software applications or business applications that are smaller than this size range. Software in the range of 100 function points or 10,000 source code statements usually consists of enhancements to larger applications rather than stand-alone applications. Even the smallest commercial off-the-shelf software packages are in the range of 1,000 function points.

The overall size range of software applications noted by the author runs from a high of about 300,000 function points (30,000,000 source code statements) down to about 0.1 function points (10 source code statements) for a small bug repair.

At the extreme high end are very few massive applications such as enterprise resource planning (ERP) packages and some large defense systems. Operating systems and major application packages such as Microsoft Office are in the range of 100,000 function points or 10,000,000 source code statements. Various components of Microsoft Office such as Excel, Word, PowerPoint, etc., are between 5,000 and 10,000 function points in size.
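The size figures above all use a ratio of roughly 100 source code statements per function point (10,000 function points is paired with 1,000,000 statements, and 0.1 function points with 10 statements). A minimal sketch of that conversion, using the ratio implied by the article's own figures (actual ratios vary by programming language), might look like this:

    SLOC_PER_FUNCTION_POINT = 100  # ratio implied by the article's size examples

    def function_points_to_sloc(function_points):
        """Convert a function point count to approximate source code statements."""
        return function_points * SLOC_PER_FUNCTION_POINT

    def sloc_to_function_points(sloc):
        """Convert source code statements back to approximate function points."""
        return sloc / SLOC_PER_FUNCTION_POINT

    # The article's reference sizes, from a small bug repair up to the largest systems noted.
    for fp in (0.1, 100, 1_000, 10_000, 100_000, 300_000):
        print(f"{fp:>9,} FP  ~ {function_points_to_sloc(fp):>12,.0f} source code statements")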

Origins of the CMM and Agile Development

The SEI was incorporated in 1984 and is located at Carnegie Mellon University in Pittsburgh, Pennsylvania. The SEI was originally funded by the Defense Advanced Research Projects Agency.

Some of the major concerns that led to the creation of the SEI were the very serious and common problems associated with large software projects. One of the early, major, and continuing activities of the SEI was the development of a formal evaluation or assessment schema for evaluating the capabilities of defense contractors, which was termed the CMM. Watts Humphrey’s book on this topic, “Managing the Software Process,” became a bestseller [1].

The initial version of the CMM was published by the SEI in 1987, and continued to grow and evolve for about 10 years. Development of the CMM more or less stopped in 1997 and the emphasis switched to the next generation, or the CMMI. The CMMI was initially published in 2002 and has been updated several times. Associated with the CMMI are Watts Humphrey’s Team Software ProcessSM (TSPSM) and Personal Software ProcessSM (PSPSM).

The SEI assessment approach is now well documented and well covered in software literature. Indeed, Addison Wesley Longman has an entire series devoted to SEI assessments. The SEI assessment data is collected by means of on-site interviews using both a standard questionnaire and also observations and informal data.

Because of the importance of very large systems to the Department of Defense, the SEI assessment approach originally dealt primarily with the software processes and methodologies used by large companies that produced large systems. The original SEI assessment approach was derived from the best practices used by leading corporations such as IBM and ITT, which employ from 5,000 to more than 50,000 software professionals, and which could safely build systems in excess of 1,000,000 lines of code or 10,000 function points.

Based on the patterns of answers to the SEI assessment questions, the final result of the SEI assessment process places the software organization on one of the levels of a five-point maturity scale. The five plateaus of the SEI maturity levels are shown in Table 1.

It is immediately obvious that the distribution of software organizations is skewed toward the low end of the scale. A similar kind of skew would occur if you were to look at the distribution of candidates selected to enter the Olympics for events such as downhill skiing. Most ordinary citizens could not qualify at all. Very few athletes could make it to the Olympic tryouts, even fewer would represent their countries, and only three athletes from around the world will win medals in each event.

As data is collected, it becomes evident that there is quite a bit of overlap among the various SEI maturity levels. For example, in terms of both quality and productivity, the best software projects from Level 1 organizations can be superior to the worst developed by Level 3 organizations, although the statistical averages of Level 3 are far better than those of Level 1.

There is now fairly solid evidence about the CMM from many studies. When organizations move from CMM Level 1 up to Level 2, 3, 4, and 5, their productivity and quality levels tend to improve based on samples at each level.

As of 2007, the newer CMMI has less empirical data than the older CMM, which is not surprising given its more recent publication date. However, the TSP and PSP methods do have enough data to show that they are successful in improving both quality and productivity at the same time.

When the CMM originated in the 1980s, the waterfall method of development was the most common at that time and was implicitly supported by most companies that used the early CMM. However, other methods such as spiral, iterative, etc., were quickly included in the CMM as well. The CMM is neutral as to development methods, but among the author’s clients who adopted the CMM, about 80 percent also used the waterfall method.

What the CMM provided was a solid framework of activities, much better rigor in the areas of quality control and change management, and much better measurement of progress, quality, and productivity than was previously the norm.

The history of the Agile methods is not as clear as the history of the CMM because the Agile methods are somewhat diverse. However, in 2001 the famous Agile Manifesto was published [2]. This provided the essential principles of Agile development. That being said, there are quite a few Agile variations including eXtreme Programming (XP), Crystal Development, Adaptive Software Development, Feature Driven Development, and several others.

Some of the principal beliefs found in the Agile Manifesto include the following:

• Working software is the goal, not documents.
• Working software is the principal measure of success.
• Close and daily contact between developers and clients is necessary.
• Face-to-face conversation is the best form of communication.
• Small, self-organizing teams give the best results.
• Quality is critical, so testing should be early and continuous.

The Agile methods, the CMM, and the CMMI are all equally concerned about three of the same fundamental problems:

1. Software requirements always change.
2. Fixing software bugs is the most expensive software activity in history.
3. High quality leads to high productivity and short schedules.

However, the Agile method and the CMM/CMMI approach draw apart on two other fundamental problems:

1. Paperwork is the second most expensive software activity in history.
2. Without careful measurements, continuous progress is unlikely.

The Agile method takes a strong stand that paper documents in the form of rigorous requirements and specifications are too slow and cumbersome to be effective. In the Agile view, daily meetings with clients are more effective than written specifications. In the Agile view, daily team meetings or Scrum sessions are the best way of tracking progress, as opposed to written status reports. The CMM and CMMI do not fully endorse this view.

The CMM and CMMI take a strong stand that measurements of quality, productivity, schedules, costs, etc., are a necessary adjunct to process improvement and should be done well. In the view of the CMM and CMMI, it is hard to prove that a methodology is a success or not without data that demonstrates effective progress. The Agile method does not fully endorse this view. In fact, one of the notable gaps in the Agile approach is any quantitative quality or productivity data that can prove the success of the methods.

Table 1: Five Levels of the CMM

    SEI Maturity Level    Meaning              Frequency of Occurrence
    1 = Initial           Chaotic              75.0%
    2 = Repeatable        Marginal             15.0%
    3 = Defined           Adequate              8.0%
    4 = Managed           Good to excellent     1.5%
    5 = Optimizing        State of the art      0.5%

Table 2: Comparison of Agile and CMM Results for an Application of 1,000 Function Points

Differences in Development Activities Between the Agile and CMM/CMMI Methods

In many industries, building a large product is not the same as building a small product. Consider the differences in specialization and methods required to build a wooden kayak versus building an 80,000-ton cruise ship. A kayak can be constructed by a single individual using only hand tools, but a large, modern cruise ship requires more than 500 workers including specialists such as pipe fitters, electricians, steel workers, welders, painters, interior decorators, air conditioning specialists, and many others.

Software follows a similar pattern: Building a large system in the 10,000 function point range is more or less equivalent to building other large structures such as ships, office buildings, or bridges. Many kinds of specialists are utilized and the development activities are quite extensive compared to smaller applications.

Because the CMM approach was developed in the 1980s when the waterfall method was common, it is not difficult to identify the major activities that are typically performed. For an application of 1,000 function points (approximately 100,000 source code statements), the following 20 normal activities were noted among the author’s clients who used either the CMM or CMMI:

1. Requirements.
2. Prototyping.
3. Architecture.
4. Project planning and estimating.
5. Initial design.
6. Detailed design.
7. Design inspections.
8. Coding.
9. Reuse acquisition.
10. Code inspections.
11. Change and configuration control.
12. Software quality assurance.
13. Integration.
14. Test plans.
15. Unit testing.
16. New function testing.
17. Regression testing.
18. Integration testing.
19. Acceptance testing.
20. Project management.

Using the CMM and CMMI, the entire application of 1,000 function points would have the initial requirements gathered and analyzed, the specifications written, and various planning documents produced before coding got under way.

By contrast, the Agile method of development would follow a different pattern. Because the Agile goal is to deliver running and usable software to clients as rapidly as possible, the Agile approach would not wait for the entire 1,000 function points to be designed before coding started.

What would be most likely with the Agile methods would be to divide the overall project into four smaller projects, each of about 250 function points in size. In Agile terminology, these smaller segments are termed iterations or sometimes sprints.

However, in order to know what the overall general set of features would be, an Agile project would start with Iteration 0, a general planning and requirements-gathering session. At this session, the users and developers would scope out the likely architecture of the application and then subdivide it into a number of iterations.

Also, at the end of the project, when all of the iterations have been completed, it will be necessary to test the combined iterations at the same time. Therefore, a release phase follows the completion of the various iterations. For the release, some additional documentation may be needed. Also, cost data and quality data need to be consolidated for all of the iterations. A typical Agile development pattern might resemble the following:
• Iteration 0
  1. General overall requirements.
  2. Planning.
  3. Sizing and estimating.
  4. Funding.
• Iterations 1-4
  1. User requirements for each iteration.
  2. Test planning for each iteration.
  3. Test case development for each iteration.
  4. Coding.
  5. Testing.
  6. Scrum sessions.
  7. Iteration documentation.
  8. Iteration cost accumulation.
  9. Iteration quality data.
• Release
  1. Integration of all iterations.
  2. Final testing of all iterations.
  3. Acceptance testing of the application.
  4. Total cost accumulation.
  5. Quality data accumulation.
  6. Final Scrum session.

Successful Hybrid Approaches to Software Development

One of the major trends in the industry since about 1997 has been to couple the most effective portions of various software development methodologies to create hybrid approaches. These hybrid methods are often successful and often achieve higher quality and productivity rates than the original pure methods. As of 2007, some interesting examples of the hybrid approach include, in alphabetical order:
• Agile joined with object-oriented development (OOD).
• Agile joined with Lean Six-Sigma.
• Agile joined with TSP and PSP.
• Agile joined with the CMM and CMMI.
• CMM and CMMI joined with Six-Sigma.
• CMM and CMMI joined with International Organization for Standardization (ISO) standard certification.
• CMM and CMMI joined with Information Technology Infrastructure Library (ITIL).
• CMM and CMMI joined with Quality Function Deployment (QFD).
• CMM and CMMI joined with TSP and PSP.
• XP joined with Lean Six-Sigma.
• ITIL joined with Six-Sigma.
• ISO joined with TickIT.
• OOD joined with service-oriented architecture (SOA).
• Six-Sigma joined with QFD.
• Six-Sigma joined with ITIL.
• Six-Sigma joined with TSP/PSP.
• SOA joined with ITIL.
• TickIT joined with Six-Sigma.
The list above only links pairs of development methods that are joined together. From time to time, three or even four or five development practices have been joined together:
• Agile joined with OOD, joined with Lean Six-Sigma, and joined with ITIL.
• CMM joined with Agile, joined with TSP/PSP, joined with Six-Sigma, and joined with QFD.


These are the most interesting and unique features of the Agile methods: 1) the decomposition of the application into separate iterations; 2) the daily face-to-face contact with one or more user representatives; and 3) the daily Scrum sessions to discuss the backlog of work left to be accomplished and any problems that might slow down progress. Another interesting feature, borrowed from XP, is creating the test cases before the code itself is written.

Comparative Results of Agile and CMM for an Application of 1,000 Function Points

There is much more quantitative data available on the results of projects using the CMM than for projects using the Agile methods. In part, this is because the CMM and CMMI include measurement as a key practice. Also, while Agile projects do estimate and measure, some of the Agile metrics are unique and have no benchmarks available as of 2007. For example, some Agile projects use story points, some use running tested features, some use ideal time, and some measure with use case points. There are no large collections of data using any of these metrics, nor are there reliable conversion rules to standard metrics such as function points.

For this article, an artificial test bed will be used to examine the comparative results of Agile and the CMM. The test bed is based on a real application (an online survey tool), but the size has been arbitrarily set to exactly 1,000 function points or 50,000 Java statements.

A sample of 20 CMM projects examined by the author and his colleagues was condensed into an overall aggregate for the CMM results. For the Agile data, observations and discussions with practitioners on five projects were used to construct the aggregate results. This is not a particularly accurate and reliable way to perform comparative analysis. As it happens, none of the 20 CMM applications used Agile methods, nor did any of the Agile projects use aspects of the CMM or CMMI.

Merging aspects of the Agile and CMM concepts is not difficult. Indeed, some CMM and CMMI applications circa 2007 do use Agile features such as Scrum sessions. However, among the author's clients, about 85 percent of CMM/CMMI users do not use Agile, and about 95 percent of Agile users do not make use of either the CMM or CMMI. For example, at a recent conference of more than 100 state software managers, more than 50 percent of current projects were using Agile methods, but no projects at all were using either the CMM or CMMI.

Though briefly discussed in the sidebar, the topic of hybrid approaches is not well covered in the software engineering literature. Neither is the topic of hybrid approaches well covered in terms of benchmarks and quantitative data, although some hybrid projects are present in the International Software Benchmarking Standards Group data. To simplify the results in this article, the Agile methods were used in pure form, as were the CMM and CMMI methods. Since the pure forms still outnumber hybrid approaches by almost 10-to-1, that seems like a reasonable way to present the data.

The sizes of the samples were not, of course, exactly 1,000 function points. They ranged from about 900 to almost 2,000 function points. Not all of the samples used Java either. Therefore, the following comparison in Table 2 has a significant but unknown margin of error.

What the comparison does provide is side-by-side data points for a standard test bed, with the measurements for both sides expressed in the same units of measure. Hopefully, future researchers from both the Agile and CMM/CMMI camps will be motivated to correct the results shown here, using better methods and data.

Note that the lines of code (LOC) metric is not reliable for economic studies since it penalizes high-level programming languages and cannot be used for cross-language comparisons. However, since many people still use this metric, it is provided here with the caveat that none of the LOC values would be the same for other programming languages such as C++, Smalltalk, Visual Basic, etc.
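To see why this caveat matters, here is a small, purely hypothetical illustration (the LOC-per-function-point ratios and the costs below are invented for the example, not data from the article): a cheaper high-level-language version of the same application can show a higher cost per LOC simply because it needs fewer lines.

    # Hypothetical illustration of the LOC caveat: two implementations of the
    # same 1,000 function point application. Ratios and costs are invented.
    function_points = 1_000
    versions = {
        "low-level language": {"loc_per_fp": 128, "total_cost": 900_000},
        "high-level language": {"loc_per_fp": 50, "total_cost": 600_000},
    }
    for name, v in versions.items():
        loc = function_points * v["loc_per_fp"]
        cost_per_fp = v["total_cost"] / function_points
        cost_per_loc = v["total_cost"] / loc
        print(f"{name}: {cost_per_fp:.0f} per FP, {cost_per_loc:.2f} per LOC")
    # The high-level version is cheaper per function point yet looks more
    # expensive per LOC, which is why cost per LOC misleads across languages.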

The data in Table 2 is expressed in terms of function point metrics and assumes version 4.2 of the counting rules published by the International Function Point Users Group.

The CMM version of the project is assumed to be developed by an organization at Level 3 on the CMM. The Agile version of the project is assumed to be developed by an experienced Agile team. However, the Agile version is assumed not to use either the CMM or CMMI and so has no level assigned. This is a reasonable assumption because very few Agile teams are also CMM certified.



Metric | Agile | CMM Level 3 | Difference
Size in function points | 1,000 | 1,000 | 0
Size in Java code statements | 50,000 | 50,000 | 0
Monthly burdened cost | $7,500 | $7,500 | 0
Work hours per month | 132 | 132 | 0
Project staff | 5 | 7 | 2
Project effort (months) | 66 | 115 | 49
Project effort (hours) | 8,712 | 15,180 | 6,468
Project schedule (months) | 14 | 19 | 5
Project cost | $495,000 | $862,500 | $367,500
Function points per month | 15.15 | 8.67 | -6.46
Work hours per function point | 8.71 | 15.18 | 6.47
LOC per month | 758 | 435 | -323
Function point assignment scope | 200 | 143 | -57
LOC assignment scope | 10,000 | 7,143 | -2,857
Cost per function point | $495 | $863 | $368
Cost per LOC | $9.90 | $17.25 | $7.35
Defect potential | 4,250 | 4,500 | 250
Defect potential per function point | 4.25 | 4.50 | 0.25
Defect removal efficiency | 90% | 95% | 5%
Delivered defects | 425 | 225 | -200
High-severity defects | 128 | 68 | -60
Delivered defects per function point | 0.43 | 0.23 | -0.20
Delivered defects per thousand LOC | 8.50 | 4.50 | -4.00

Table 2: Comparison of Agile and CMM Results for an Application of 1,000 Function Points
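For readers who want to trace how the derived rows of Table 2 follow from the base assumptions, the short sketch below reproduces the arithmetic; values match the table after rounding. It is illustrative only: the staffing, effort, and defect-potential inputs are the aggregate values reported in the table, not the output of any estimating tool.

    # Derives several rows of Table 2 from the stated base assumptions;
    # values match the table after rounding. Illustrative only.
    FUNCTION_POINTS = 1_000
    JAVA_LOC = 50_000
    MONTHLY_COST = 7_500          # burdened cost per staff-month
    HOURS_PER_MONTH = 132

    def derive(staff, effort_months, defect_potential_per_fp, removal_efficiency):
        effort_hours = effort_months * HOURS_PER_MONTH
        cost = effort_months * MONTHLY_COST
        delivered_defects = round(defect_potential_per_fp * FUNCTION_POINTS
                                  * (1 - removal_efficiency))
        return {
            "effort hours": effort_hours,        # e.g. 66 * 132 = 8,712
            "project cost": cost,                # e.g. 66 * 7,500 = 495,000
            "work hours per FP": effort_hours / FUNCTION_POINTS,
            "cost per FP": cost / FUNCTION_POINTS,
            "cost per LOC": cost / JAVA_LOC,
            "FP assignment scope": FUNCTION_POINTS / staff,
            "delivered defects": delivered_defects,
        }

    print("Agile      :", derive(5, 66, 4.25, 0.90))
    print("CMM Level 3:", derive(7, 115, 4.50, 0.95))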


Although one comparison is probably insufficient, observations of many small projects in the 1,000 function point size range indicate that the CMM and CMMI are somewhat more cumbersome than the Agile methods and therefore tend to have somewhat lower productivity rates (see Table 2). That is not to say that the CMM/CMMI methods produce bad or unacceptable results, but only that the Agile methods appear to generate higher productivity levels.

However, if the size plateau for comparison is upped to 10,000 function points or 100,000 function points, the CMM/CMMI methods would pull ahead. At 10,000 function points, the overall staff would be larger than 50, while for 100,000 function points the overall staff would approach 600 people, and some of the Agile principles such as daily team meetings would become difficult and perhaps impossible.
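As a rough check on those staffing figures, dividing application size by the per-person assignment scopes reported in Table 2 (roughly 143 to 200 function points per person) lands in the same range. The snippet below is illustrative only; real staffing depends on many other factors.

    # Rough staffing check using the assignment scopes from Table 2
    # (about 143-200 function points per person). Illustrative only.
    for size_fp in (10_000, 100_000):
        low = size_fp // 200   # larger scope per person -> fewer people
        high = size_fp // 143  # smaller scope per person -> more people
        print(f"{size_fp:,} FP: roughly {low} to {high} people")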

Also, applications in the 10,000 to 100,000 function point size range tend to have thousands or even millions of users. Therefore, it is no longer possible to have direct daily contact with users since their numbers are too large.

When quality is considered, the Agile approach of having frequent daily contacts with users, daily Scrum sessions, and writing the test cases before the code has the effect of lowering defect potentials. The phrase defect potentials refers to the total number of defects that might be encountered in requirements, design, coding, and user documents, as well as bad fixes or secondary defects. (The current U.S. average would be about 5.0 defects per function point for defect potentials.) However, Agile projects usually do not perform formal design and code inspections, so their defect removal efficiency is somewhat below the CMM example.

The defect potentials for the CMM version are better than U.S. averages but not quite equal to the Agile version due to the more detached connections between clients and developers.

However, the CMM and CMMI methods typically do utilize formal design and code inspections. As a rule of thumb, formal inspections are more than 65 percent efficient in finding bugs or defects, which is about twice the efficiency of most forms of testing, many of which only find about 30 percent of the bugs that are actually present.

As of 2007, the current U.S. average for cumulative defect removal efficiency is only about 85 percent, so both the Agile and CMM examples are better than U.S. norms. The CMM method is somewhat higher than the Agile method due to formal inspections, testing specialists, and a more formal quality assurance approach.
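To make the arithmetic behind cumulative defect removal efficiency concrete, the sketch below chains a sequence of removal activities, assuming (as a simplification, not something stated in the article) that each activity finds its typical share of whatever defects remain. The 65 percent and 30 percent figures are the rules of thumb quoted above, not measurements from the projects behind Table 2.

    # Cumulative defect removal efficiency for a sequence of removal steps,
    # assuming each step finds its typical share of the defects still present.
    # Step efficiencies are the article's rules of thumb, not project data.
    def cumulative_efficiency(step_efficiencies):
        remaining = 1.0
        for e in step_efficiencies:
            remaining *= (1.0 - e)   # fraction of defects surviving this step
        return 1.0 - remaining       # overall fraction removed

    # One formal inspection (~65 percent) plus three test stages (~30 percent each)
    print(cumulative_efficiency([0.65, 0.30, 0.30, 0.30]))   # about 0.88
    # Three test stages alone
    print(cumulative_efficiency([0.30, 0.30, 0.30]))         # about 0.66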

As a general rule, methods developed to support large software applications, such as the CMM and CMMI, are cumbersome when applied to small projects. They can be tailored or subset to fit small projects, but out of the box they are cumbersome.

On the other hand, methods developed to support small software applications, such as the Agile methods, become less and less effective as application sizes increase. Here too, they can be tailored or modified to fit larger projects, but out of the box they are not effective.

As of 2007, both the Agile and CMM/CMMI communities are working on merging the more useful features of both sets of methods. Agile and the CMM/CMMI are not intrinsically opposed to one another; they merely started at opposite ends of the size spectrum.

Overall Observations on Software Development

With more than 13,000 projects examined, it can be stated categorically that there is no single method of development that is universally deployed or even universally useful. All software projects need some form of gathering requirements, some form of design, some kind of development or coding, and some forms of defect removal such as reviews, inspections, and testing.

In the author's studies, more than 40 methods for gathering requirements, more than 50 variations in handling software design, more than 700 programming languages, and more than 30 forms of testing were noted among the projects examined. Scores of hybrid methods have also been noted.

Both the Agile methods and the CMM/CMMI methods have added value to the software industry. But care is needed to be sure that the methods selected are in fact tailored to the size range of the applications being constructed.


Additional Reading
1. Caputo, Kim. CMM Implementation Guide. Boston, MA: Addison-Wesley, 1998.
2. Garmus, David, and David Herron. Measuring the Software Process: A Practical Guide to Functional Measurement. Englewood Cliffs, NJ: Prentice Hall, 1995.
3. Jones, Capers. "Defense Software Development in Evolution." CrossTalk Nov. 2002.
4. Jones, Capers. Software Assessments, Benchmarks, and Best Practices. Boston, MA: Addison-Wesley Longman, 2000.
5. Jones, Capers. Estimating Software Costs. New York: McGraw-Hill, 2007.
6. Jones, Capers. Conflict and Litigation Between Software Clients and Developers. Burlington, MA: Software Productivity Research (SPR), Inc., 2007.
7. Kan, Stephen H. Metrics and Models in Software Quality Engineering. 2nd ed. Boston, MA: Addison-Wesley Longman, 2003.

About the Author

Capers Jones is currently the chairman of Capers Jones and Associates, LLC. He is also the founder and former chairman of SPR, where he holds the title of Chief Scientist Emeritus. He is a well-known author and international public speaker, and has authored the books "Patterns of Software Systems Failure and Success," "Applied Software Measurement," "Software Quality: Analysis and Guidelines for Success," "Software Cost Estimation," and "Software Assessments, Benchmarks, and Best Practices." Jones and his colleagues from SPR have collected historical data from more than 600 corporations and more than 30 government organizations. This historical data is a key resource for judging the effectiveness of software process improvement methods. The total volume of projects studied now exceeds 12,000.

Software Productivity Research, LLC
Phone: (877) 570-5459
Fax: (781) 273-5176
E-mail: [email protected], [email protected]


Is CMMI Useful and Usable in Small Settings? One Example

Jacquelyn Langhout, Technical Management Directorate
Sandra Cepeda, Cepeda Systems and Software Analysis, Inc.
Suzanne Garcia, Software Engineering Institute

The Software Engineering Directorate (SED) of the U.S. Army Aviation and Missile Research, Development and Engineering Center (AMRDEC) in Huntsville, Alabama, acquires software-intensive systems and has more than 250 small companies in its supply chain. In order to determine the appropriateness of using Capability Maturity Model Integration (CMMI®) as supplier requirements, members of AMRDEC SED teamed with the Software Engineering Institute (SEI) to perform a technical feasibility study in 2003-2004. This article presents the motivation, the processes used, and the major results of the CMMI for Small Business pilot from the perspective of the team that worked on the pilot.

(SCAMPI is a service mark of Carnegie Mellon University.)

We often hear that CMMI wasn't built for small companies so it will not work for them, or some variant of this sentiment. Many people find the CMMI book/technical report so intimidating that they hesitate even to think about using it. Although it is true that CMMI was not explicitly built for small companies, it is also true that it was not explicitly built for large companies [1]. The experience we obtained from the CMMI for Small Business pilot indicates that CMMI, when applied in a way that responds to the business realities of a small business, can provide small companies with utility.

Small Pilot Company Profiles

Two small companies from Huntsville, Alabama, were selected to participate in the pilot:
• Analytical Sciences, Incorporated (ASI) specializes in management and technical services with a focus on systems engineering/program management, information technology, engineering and scientific services, and professional and organizational development.
• Cirrus Technologies, Incorporated (CTI) specializes in manufacturing and support services with a focus on logistics, engineering, manufacturing, test and evaluation, information technology, security, and intelligence.
At the time of the pilot, each company had around 200 employees. The projects selected for the pilot ranged in size from a one-person project to a 22-person project. CMMI v1.1 SE/SW was used as the reference model for the project.

Key Challenges in Process Improvement for Small Business

We saw several challenges during the adoption pilot in Huntsville. Some were challenges that we had hypothesized; some were new insights. Although the pilot was not designed to address all of these challenges, we list them here to underscore that there is a diverse set of challenges for CMMI adoption in a small setting:
• Affordability of process improvement is a major challenge.
• Small businesses need to realize payoff quickly.
• Small businesses do not have staff dedicated solely to process improvement implementation: Customer requirements take priority and can cause delays.
• There is minimal structure to leverage from in a small business.
• The customer rules. Many small organizations adopt/adapt their business practices directly from their customers or prime contractor.
• If a quality system is either not already in place or is not well-functioning, process definition efforts are much more challenging.
• CMMI is generally perceived as intimidating, both in size and scope.

Motivation for the Pilot

The AMRDEC SED is one of three Life Cycle Software Engineering Centers in the Army. Established in 1984, the SED is a recognized leader in supporting the acquisition, research, development, and sustainment of some of the nation's most sophisticated weapon systems. The mission of the SED is to provide mission critical computer resource expertise to support weapon systems over their life cycle. This mission is carried out by a staff of approximately 900 government and contractor employees housed in the Army's only facility designed specifically for tactical battlefield automated systems support.

Like many federal organizations, the SED relies heavily upon a contract workforce for the fulfillment of its mission. The two primary SED contract vehicles consist of many companies categorized as small businesses. Currently, more than 75 percent of the companies contracted for engineering services with the SED are small businesses. Since these companies are increasingly involved in the development of significant components for software-intensive systems, their usage of reliable engineering and management practices has become increasingly critical to the delivery of quality products for the Department of Defense (DoD) warfighter.

Pilot Process Overview

The CMMI for Small Business pilot started in July 2003 and culminated in May 2004 with a Standard CMMI Appraisal Method for Process Improvement v1.1 (SCAMPI) Class A appraisal of each of the two pilot companies. The overall process is summarized in Figure 1. Gaps between the organizations' internal processes and CMMI were identified by





engaging in a collaborative session between the pilot team consultants and the practitioners from the pilot companies that was similar to a SCAMPI C appraisal. Based on the results of this analysis, the pilot companies developed and implemented an action plan and updated existing processes to close the gaps found. Where necessary, the pilot companies also developed new processes. Though we initially did not intend to perform SCAMPI A appraisals, the progress made by both companies was such that in January of 2004 we defined appraisal scopes in conjunction with the pilot companies, and in May 2004 we performed SCAMPI v1.1 A appraisals using the continuous representation of CMMI-SE/SW v1.1 at both sites [2]. Both companies achieved their target level profiles, as follows:
• ASI: Capability Level 2 for project planning, requirements management, and measurement and analysis, and Capability Level 3 for organization process focus and organizational training.
• CTI: Capability Level 1 for project planning, requirements management, and project monitoring and control (given some of the other business challenges that CTI was facing at the time of the pilot, establishing Level 1 processes in these areas was a significant achievement).

Lessons Learned From the Pilot

There are several competencies in process improvement that provide a useful framework for looking at lessons learned from the pilot study. Four of these are included here as a way to organize lessons learned [3]:
• Establishing and sustaining sponsorship.
• Developing infrastructure/defining processes.
• Deploying new processes into the intended use environment.
• Managing an appraisal life cycle.
We have included an additional category of lessons learned in this section: lessons about the CMMI model itself. Those readers who are experienced in process improvement consulting in a variety of settings may recognize our primary competencies as categories that also apply to larger organizations. However, the particular lessons that have been included here are those that we believe are either unique to the small settings environment or are particularly important for a small company to be successful in their improvement efforts.

Establishing and Sustaining Sponsorship
Obtaining and sustaining the executive sponsorship necessary to make applying resources to process improvement activities feasible.
• Lesson 1: Focus CMMI implementation in areas where the connection between the model's content and the Chief Executive Officer's (CEO) business goals is clearest.
  In a small company, sponsorship often means getting the attention of the owner and/or CEO of the company. In this setting, the focus of the CEO is often on a combination of cash flow management and development of the growth of the company. This implies that any process improvement efforts that are presented must be aligned with the particular financial environment and growth goals of the company.
• Lesson 2: Even if you do not have strong quantitative results right away, make sure that the senior management gets periodic progress reports that include the qualitative benefits of the improvement effort.
• Lesson 3: Ensure that senior management understands how to interpret appraisal results, both in terms of what they are likely to mean in terms of performance and how they can be appropriately used in marketing contexts.

Developing Infrastructure/Defining Processes
Providing enabling infrastructure to make definition and use of new processes effective.
Examples of activities that fit in this category include the following:
  ° Establishing/managing a process asset library.
  ° Establishing/managing a measurement repository.
  ° Establishing/maintaining standards, approaches, and accepted methods for writing process guidance.
  ° Establishing/managing the organization's curriculum for process improvement.
  ° Establishing points of contact or specific groups (e.g., an engineering process group [EPG]) for various aspects of the improvement.
• Lesson 4: Even though a formal EPG may be infeasible for small companies, some focal point is particularly needed to coordinate infrastructure development and sustainment.
• Lesson 5: When a well-functioning quality management system is already in place (e.g., based on International Organization for Standardization [ISO] 9001), take advantage of it! The existence of a well-functioning ISO 9001-based quality management system provided a bootstrap for process guidance standards and several other elements of process improvement infrastructure. On the other hand, if there had been no quality system already in place, some time would have been needed to establish and set up procedures for using some kind of mechanism for storing, controlling, and distributing process assets created as part of the improvement effort.

• Lesson 6: The tools and practices of the accounting system have a great influence on what is considered doable in terms of collecting and using measurements. A small company typically does not have the resources available to create a metrics collection system parallel to its mainstream accounting system, so, at least at the beginning, what is considered feasible in terms of measurement is constrained by what can be collected and aggregated by the tools in use.

[Figure 1 depicts the pilot process as a sequence of steps: Site Kickoff Meeting, Gap Analysis Session, Action Plan Implementation, Execute New Processes, Appraisal Preparation, and Appraise Pilot Projects, with close interaction between the pilot companies and the consultants throughout.]

Figure 1: Pilot Process Overview



Deploying New Processes Into the Intended Use Environment
Ensuring that the new CMMI-informed processes are available to all relevant users and that their successful adoption is associated with appropriate training and job aids. This is where much of what we traditionally call organization change management occurs.
• Lesson 7: Simple CMMI-based improvements can have a significant impact in small organizations.
  In one case, just adding meeting minutes for the weekly meeting and publicizing them to the customer and project participants (not more than five people total) contributed to more efficient monitoring of the project and improved communication between the customer and the project team. It sounds simple, and it is: The model provided an incentive to try something so there would be records of decisions/status progress. However, the effect was much greater than the project participants anticipated, both in terms of scope of effect and magnitude; the change not only provided an effective tool for monitoring but it also resulted in improved communication with the customer, which greatly improved the performance of the project as a whole.
  Seeing unanticipated benefits from small changes was a great motivator for continuing on the path of improvement and being willing, a little later in the process, to try larger changes. In small companies, the effects of small changes can often be seen much more quickly, and the dispersal of knowledge throughout the company about the effects of a change is also faster.

Managing an Appraisal Life Cycle
Selecting a method of measuring progress against a model (i.e., appraisal method) and then planning and executing the tasks associated with the selected method.
• Lesson 8: Use a focused, collaborative appraisal method (e.g., SCAMPI B or C) for the initial gap analysis. Great benefit is realized by using this session as an opportunity to interpret the model and gain a better understanding of how it applies to the organization.
• Lesson 9: Ensure someone in the organization has a good understanding of Appraisal Requirements for CMMI Class A, B, and C appraisal methods and set expectations [4]. This greatly increases the potential for achieving the appraisal objectives defined by the appraisal sponsor.
• Lesson 10: Collect evidence that will be useful in the appraisal as you go, using automation support as much as possible. Interact with the lead appraiser during evidence collection and mapping to CMMI practices to ensure that a complete, well-organized set of evidence is available for the appraisal. This does not need to be days and days of billable interaction; it may just take the form of e-mailing templates for evidence collection to get an idea of how they fit with the lead appraiser's expectations.
  Although this is one of the lessons that is also equally applicable in a larger setting, the effects if this is NOT done are much greater in a small setting in terms of the percentage of staff time that has to be used to rework material that has been prepared for the appraisal.
• Lesson 11: Introduce generic practices once specific practices are clearly understood but prior to the definition and documentation of processes. Misinterpretation of generic practices is a major cause of appraisal failures. This is an area where investing in a small amount of external consulting could pay big benefits. In the case of the pilot projects, we held a generic practices workshop to help the pilot participants get a better understanding of the linkages between generic practices and the process areas they were working with.
• Lesson 12: Quick looks (e.g., SCAMPI B and SCAMPI C) significantly improve the chances of achieving the objectives of a SCAMPI A.

CMMI Model
• Lesson 13: Overall, we saw that judicious use of the elements of CMMI that relate to the business context provided a set of useful practices from which small businesses can benefit, though not always in predicted ways.
• Lesson 14: Using the continuous representation of the model allowed the pilots to focus on improvements that they perceived as having the highest payoff for the company.
• Lesson 15: Changing the practices in the model is not necessary in most cases; finding alternative practices is often relevant. In addition, work products generated as a result of practice implementation rarely match one-to-one to what is suggested in the model.
• Lesson 16: Smallness was not as much of an issue for model interpretation as the focus of the business. Although both organizations had a more traditional product development project included in their pilot, they also had more pure service delivery contexts ("give me a team of N people who can do X for 25 hours per month for the next six months") that they wanted to explore because service delivery is the heart of their business. Sometimes those services are delivered in the context of a project, but they often are not. The model was more difficult to interpret in areas of the pilot involved in service delivery than in the small product development projects. The SEI is involved in an effort led by Northrop Grumman to develop a CMMI for Services (SVC) constellation that may prove more useful in this context. Information on CMMI-SVC can be found on the CMMI Web site at <www.sei.cmu.edu/cmmi>.

A Toolkit to Help You Start Your Own CMMI-Based Improvement Effort

As a major product of the pilots, the team produced a Web-based toolkit that provides details on the processes and assets used in the pilot. (The draft of the toolkit can be found at <www.sei.cmu.edu/cmmi/publications/toolkit/>. It is a draft that was not fully completed due to budget constraints. It may get incorporated into the Implementing Process Improvement in Small Settings [IPSS] Field Guide, in which case it would be updated.) In addition to process descriptions, it provides copies of the actual presentations, templates, and other documents used to support the pilot. It should be treated as an anecdotal set of assets that might be useful in supporting a model-based improvement effort, rather than a canonical set that defines what should be used. Having said that, we believe that the toolkit can help people working on improvement in small settings do the following:


• Focus their improvement efforts.
• Figure out how and where to get started.
• Tie their improvements to business goals.
• Educate their staff in areas where they may need to improve their knowledge and skills.
• Realize payoffs (mostly qualitative) early in the improvement effort.
• Improve their ability to prepare for appraisals.
The feedback to date that we have received on the toolkit has been very positive and fairly broad in terms of global access (people from Argentina, Israel, the United Kingdom, Mexico, and Chile, as well as the U.S. and Canada, have accessed the toolkit).

In thinking about using the toolkit, we have a few recommendations for those who are currently working in small settings and are planning to use it to support their improvement effort:
• Think of this as one resource to help you, but not the only one. Every month there are new publications related to CMMI; some of them are likely to offer different insights than the toolkit, and they may be valuable to you.
• Be sure to read the What's Missing section of the toolkit to see if any of the things we talk about in that section apply to you. If they do, then you know you will need resources beyond what we used to get you started and be successful.
• For those of you who are in the DoD supply chain, think about getting mentoring from the larger companies that work with you and have ongoing improvement efforts; they should have a vested interest in your success.
• Keep up with the assets in the CMMI adoption area (<www.sei.cmu.edu/cmmi/adoption>); that is where you will see emerging work on CMMI in small settings in particular and other resources that may be of value to you.
• Explore at a reasonable pace. Unless you have some business investment riding on achievement of some particular status related to CMMI, do not try to do too much at once until you have established what benefits you can accomplish in your own environment.

Next Steps

SED's Plans for Follow-On Activities
The CMMI small business pilot has been one of the most beneficial endeavors of the SED/SEI strategic partnership. We are pleased that the AMRDEC SED-sponsored pilot provided the stimulus for the establishment of the IPSS project at the SEI. One of the early events of this project was an International Research Workshop in this topic area that was held at the SEI in October 2005 and resulted in an SEI technical report summarizing the workshop and containing the papers submitted to the workshop. This report is available for download in the publications section of the SEI Web site [5].

As the SED/SEI partnership continues, we will start to gain insight into the use of some other SEI technologies within the SED setting. These include the insertion of Personal Software Process/Team Software Process technology in an Army pilot program to provide the acquisition organization with greater insight into development metrics. Additionally, the SED/SEI partnership serves an integral role in providing acquisition process improvement support to many of our local Army program managers.

SEI's Plans for Supporting CMMI for Small Settings
The pilot project in Huntsville, Alabama, emphasized to the SEI the need for appropriate guidance materials for using CMMI in small settings. In response, the SEI has chartered the IPSS project within the International Process Research Consortium initiative. Seed funding resulted in the International Research Workshop mentioned earlier, and initial sponsors are supporting the prototyping of an IPSS Field Guide that reflects many of the lessons cited here. Contact Caroline Graettinger, the IPSS project manager, for details at <[email protected]>.

Conclusion

We hope you will find this information beneficial as you embark on your own improvement journey and that you will become a member of the burgeoning community of practice for CMMI in small settings. Stay tuned to ongoing SEI work in small settings at <www.sei.cmu.edu/iprc/ipss.html>. This endeavor is discussed more on page 27.

References
1. Chrissis, Mary Beth, Mike Konrad, and Sandy Shrum. CMMI: Guidelines for Process Integration and Product Improvement. Boston, MA: Addison-Wesley, 2003 <www.sei.cmu.edu/cmmi/publications/cmmi-book.html>.
2. Members of the Assessment Method Integrated Team. Standard CMMI Appraisal Method for Process Improvement, Ver. 1.1: Method Definition Document. CMU/SEI-2001-HB-001. Pittsburgh, PA: SEI CMU, 2001 <www.sei.cmu.edu/publications/documents/01.reports/01hb001.html>.
3. Garcia, Suzanne, and Richard Turner. CMMI Survival Guide: Just Enough Process Improvement. Boston, MA: Addison-Wesley, 2006.
4. CMMI Product Team. Appraisal Requirements for CMMI, Version 1.1 (ARC, V1.1). CMU/SEI-2001-TR-034. Pittsburgh, PA: SEI CMU, 2001 <www.sei.cmu.edu/publications/documents/01.reports/01tr034.html>.
5. Garcia, Suzanne, Caroline Graettinger, and Keith Kost. Proc. of the First International Research Workshop for Process Improvement in Small Settings. CMU/SEI-2006-SR-01. Pittsburgh, PA: SEI CMU, 2006.

Acknowledgements

Many people contributed significant resources to this pilot. The CMMI in Small Settings Toolkit Repository from AMRDEC SED Pilot Sites Web site, located at <www.sei.cmu.edu/ttp/publications/toolkit>, contains an acknowledgments table that we hope covers most of the people to whom we owe gratitude. We would, however, particularly like to acknowledge the ASI Team and the CTI Team. Without their dedication, this pilot would not have been possible, let alone successful. We are also grateful to Gene Miluk, of the SEI, and Mary Jo Staley, of CSC, who were consultants for the pilot and now have moved on to other endeavors. Their ideas and hard work during the pilot made possible much of the learning reflected here.



About the Authors

Suzanne (SuZ) Garcia is a senior member of the technical staff at the SEI CMU, working in the Integrating SW-Intensive Systems group. Her current research is focused on synthesizing effective practices from research and industry into effective techniques for use by the software and systems engineering communities working in a system of systems context. Garcia worked in the Technology Transition Practices group, with a particular focus on the technology adoption issues related to small settings. Her early SEI career focused on authoring, managing, and reviewing CMMs, followed by three years as the deployments manager for Aimware, Incorporated's U.S. customers. Garcia also spent 12 years in multiple improvement-related roles at Lockheed Missile and Space Co. She has a bachelor's degree and a master's degree in systems management.

SEI CMU
Pittsburgh, PA 15213-3890
Phone: (412) 268-9143
E-mail: [email protected]

Sandra Cepeda, President and CEO of Cepeda Systems and Software Analysis, Inc., is a CMMI consultant, an SEI-authorized lead appraiser for SCAMPI, an SEI-authorized CMMI Instructor, and an SEI Visiting Scientist. Her company provides engineering services to the Army's Aviation and Missile Research, Development and Engineering Center, SED. Cepeda's 21 years of experience in complex systems development has spanned all aspects of the life cycle, with a particular focus on Verification and Validation (V&V) and Process Improvement. She was a member of the CMMI 1.1, CMMI 1.2, and SCAMPI 1.2 author teams, and was one of the lead consultants for the CMMI for Small Business pilot in Huntsville. Cepeda has bachelor's and master's degrees in computer engineering from Auburn University.

AMRDEC SED
AMSRD-AMR-BA-SQ
Hackberry RD
BLDG 6263
Redstone Arsenal, AL 35898
Phone: (256) 876-0317
E-mail: [email protected]

Jacquelyn (Jackie) Langhout is currently the deputy director of the Technical Management Directorate at the AMRDEC at Redstone Arsenal, AL. During the time of the pilot, Langhout served as the lead of the Engineering Process Group at AMRDEC's SED. In that role, Langhout was the focal point for all the SED-SEI strategic partnership efforts and oversaw the SED's process improvement program. Her past assignments include leading software V&V projects, software development and maintenance projects, and providing software support to program offices. Langhout began her career with the Army in 1986 and is a member of the Army Acquisition Corps. She has a bachelor of science in mathematics from Samford University, a bachelor's degree from Auburn University, and a master's degree from the University of Alabama in Huntsville.

AMRDEC
AMSRD-AMR-TM
BLDG 5400 RM S142
Redstone Arsenal, AL 35898
Phone: (256) 876-4182
E-mail: [email protected]

CALL FOR ARTICLES

If your experience or research has produced information that could be useful to others, CrossTalk can get the word out. We are specifically looking for articles on software-related topics to supplement upcoming theme issues. Below is the submittal schedule for three areas of emphasis we are looking for:

20th Anniversary Issue
August 2008
Submission Deadline: March 14, 2008

Application Security
September 2008
Submission Deadline: April 18, 2008

Development of Safety Critical Systems
October 2008
Submission Deadline: May 16, 2008

Please follow the Author Guidelines for CrossTalk, available on the Internet at <www.stsc.hill.af.mil/crosstalk>. We accept article submissions on software-related topics at any time, along with Letters to the Editor and BackTalk. We also provide a link to each monthly theme, giving greater detail on the types of articles we're looking for at <www.stsc.hill.af.mil/crosstalk/theme.html>.



Why Do I Need All That Process? I'm Only a Small Project©

Mark Brodnik, Robyn Plouse, and Terry Leip
Intel Corporation

At Intel's Information Technology (IT) department, we developed extensive processes for our projects. While the large projects get the glory, the majority of our projects are less than six months long, have small teams, limited scope, and low risk. We found that we have a variety of project sizes but a single set of processes originally built for larger projects. So how did we fix that issue?

TM Intel, the Intel logo, Intel. leap ahead., and Intel. Leap ahead. logo are trademarks or registered trademarks of Intel Corporation or its subsidiaries in the United States and other countries.
* Other names and brands may be claimed as the property of others.
© 2007 Intel Corporation. All rights reserved.
This article is for informational purposes only. INTEL MAKES NO WARRANTIES, EXPRESS OR IMPLIED, IN THIS ARTICLE.

The Intel IT department, like many other organizations, was seeing issues with completing software projects on time and satisfying our customers. Using the Software Engineering Institute's Capability Maturity Model Integration (CMMI) as a basis to tackle those issues, management challenged us with the goal of being at Maturity Level 3 in 18 months. While we acknowledged that this goal was overly ambitious, we still took a shot at accomplishing the objective. A small team using the CMMI as their guide built large and formal processes and mandated their use. The inevitable result was that this approach was not well received or implemented by the project teams, especially those at the smaller end of the scale. The IT organization also realized that there was no business justification for a CMMI benchmark. To address this issue, we launched a project to streamline the processes; we took a different look at the CMMI, listened to the stakeholders, and focused on our business objectives. The end result was a set of processes 70 percent shorter and much less prescriptive.

Accelerated Process Improvement (API)

Based on stakeholder feedback, audit results, and process coach input, we identified a number of the work processes that were complex and difficult to use. The organization gathered 35 representatives, including key stakeholders, engineers, process coaches, and auditors, for an intense three-and-a-half-day face-to-face meeting. In preparation for the session, we mapped business and model requirements to our existing processes in order to determine what portions were eligible for simplification. In addition, we created a priority list of feedback from audits, coaching, and pending process improvement requests to determine top pain points and focus areas. Since we had limited face-to-face time to complete the work, we were willing to accept less-than-perfect results and less adherence to the model to streamline the business process.

A key component of the successful collaboration was sequestering the team at an off-site facility to maximize focus and support a collaborative environment. The agenda for this event was driven by the stakeholders rather than the process engineers. The stakeholders drove the vast majority of the changes with a primary focus on accomplishing business results rather than textbook model compliance. The team reviewed each work process against the business requirements and the CMMI model to identify portions of the process that could be considered for deletion or simplification. The group was broken down into three sub-teams, each addressing a different process area. Daily report-out and synchronization sessions helped ensure consistency between the teams. Our CMMI model expert floated between the teams to answer model-related questions.

One of the themes we noticed was that the initial process development team had implemented many of the sub-practices in the CMMI model believing they were a required model component. For example, based on sub-practices from the configuration management process area, we required each document to have unique identifier labels both internal to the document and in the file name, a convention that was unnecessarily complex and added little value for our typical 8-to-12-person project teams. We also defined and mandated specific elicitation techniques to meet the intent of the sub-practices in the requirements development process area; these techniques were overkill for our small project teams, which worked closely with their customers and stakeholders. Finally, we eliminated the requirement for a separate formal commitment of resources, which had come about through an incorrect interpretation of the sub-practices in the project monitoring and control process area, by combining it with one of the key milestone decisions.

Figure 1 shows the old processes for scope, schedule, and resource management. These separate processes totaled 29 pages and 17 process steps. The new process was able to combine scope, schedule, and resource management into one process document that was five pages long with eight process steps. This was accomplished by simplifying the wording, combining or eliminating process steps, and removing instructional text that had been included in the process documents as a substitute for skills expertise.

Overall, we achieved a 70 percent reduction in document length, with one process shrinking to just 17 percent of its original size. Details of other process and template reductions can be seen in Table 1. The original and combined processes can be accessed via the online version of this article.

When we evaluated the outcome of the effort, one of our project managers said the following:

    The API face-to-face session was great for analyzing the weak spots (in the documentation and in the processes), as well as acknowledging the flaws of the previous release. The teams addressed the issues with great teamwork that allowed for development of some creative and tangible action plans.

In the end, we reduced the workload of our processes and improved documentation so that it would more effectively deliver key information. The result of process simplification was the ability of smaller projects, originally deemed out of scope,



to adopt the processes. Our number of eligible projects went from 36 percent to 50 percent. Our process coaches could now support an average of 18 projects per coach, up from 14 projects per coach before the simplification. In addition, training course delivery was streamlined to half of its original duration.

What About Very Small Projects?

As senior management started seeing the benefits of standard processes, they made the decision to require all projects greater than eight weeks in duration to adopt the processes. As the adoption rates for these projects came closer to 100 percent, we began to look for ways to help the projects originally deemed too small for our processes. These very small projects, less than eight weeks in duration, were considered low risk and therefore were not required to follow any formal processes or standards, although some did have their own self-imposed methods of running their projects. A result of this was that management did not have sufficient visibility into these projects, and some of the larger projects would attempt to chunk their work into less than eight weeks so they could avoid the full process and external visibility into their work.

In our discussions with project managers, we also realized that the eight-week rule was too limiting, since many projects were only slightly longer (8 to 12 weeks) but had just two or three fractional resources. These project managers felt that the overhead of the processes was still too large compared to the amount of overall effort of the project, and they wanted to also take advantage of a smaller set of processes. They argued that they spent most of their time filling out forms or compliance checklists rather than focusing on the task at hand. We also discovered through audits that project managers in this class of projects would often run their projects as usual and then go back after the fact and create the required process collateral. Clearly, there was a need to provide processes that addressed very small projects.

Proof of Concept (PoC)

Before we deployed a new set of processes across all of IT, we decided the best approach was to work with a limited set of these very small projects through a PoC in one or two IT groups. We would then analyze the data to determine if an IT-wide deployment was warranted or if we needed additional pilots. Finally, we planned to take a thorough look at our entire process library to implement a more robust project characterization approach that right-sizes the processes and tools used by our project teams.

The first area we looked at was how our process deliverables were organized. Our projects' process deliverables are defined at two main levels. The first level is what a project needs to review at a milestone decision meeting with management (e.g., requirements, design, and schedule). The second level is deliverables that are typically produced as a result of project work (e.g., various supporting plans, internal compliance scorecards, review and approval of records, change records).

Document | Old Size | New Size
High-Level Requirements Process | 9 pages, 7 steps | 4 pages, 5 steps
Detailed Requirements Process | 11 pages, 6 steps | 4 pages, 5 steps
High-Level Requirements Template | 10 pages | 2 pages
Detailed Requirements Template | 14 pages | 6 pages
Scope/Schedule/Resource Process | 29 pages, 17 steps | 5 pages, 8 steps
Project Plan Plus Supporting Documents | 4 documents | 1 spreadsheet
Baseline Change Control Process | 6 pages, 13 steps | 5 pages, 9 steps

Table 1: Old Versus New Document Sizes

[Figure 1, not reproduced here, contrasts the original flows for the scope, schedule, and resource processes with the combined and simplified scope/schedule/resource process.]

Figure 1: Example Simplification and Reduction


In order to address the issue of excessive process overhead for small projects, the approach to the PoC was to determine the major process deliverables that a project should produce and require only those deliverables. To speed up the PoC, we started with the first-level deliverables and decided to leave the creation of second-level process deliverables up to the discretion of the project manager. Ultimately, the goal was to have project managers focus more on the product deliverables and less on the process deliverables.

The next item we looked at was combining the templates for the various process deliverables into a single, simplified template that contains the basic information the projects are required to complete. This allowed decision makers to review this document at milestone decisions rather than requiring the project manager to duplicate information in a separate, formal presentation.

The last major challenge we wanted to work through in the PoC was the cutoff point between these very small projects and our larger projects. In the end, we decided that a more pragmatic approach was to look at several attributes for the cutoff, such as team size (one to three team members), overall effort (around 500 hours), and project risk (no financial systems or software exchange impact), to provide a more balanced set of classifications.
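A minimal sketch of how such a cutoff might be encoded is shown below. The thresholds are the ones just described; the function and attribute names are illustrative and do not reflect Intel's actual tooling.

    # Illustrative encoding of the very-small-project cutoff described above.
    # Thresholds come from the article; names are invented for the example.
    def classify_project(team_size, effort_hours, touches_financial_systems,
                         has_software_exchange_impact):
        """Return 'very small' if the project qualifies for the lightweight
        process set, otherwise 'standard'."""
        low_risk = not (touches_financial_systems or has_software_exchange_impact)
        if team_size <= 3 and effort_hours <= 500 and low_risk:
            return "very small"
        return "standard"

    print(classify_project(team_size=2, effort_hours=400,
                           touches_financial_systems=False,
                           has_software_exchange_impact=False))   # very small
    print(classify_project(team_size=6, effort_hours=2_000,
                           touches_financial_systems=True,
                           has_software_exchange_impact=False))   # standard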

Results

While the model is not our primary focus, we are still measuring ourselves against its requirements to make sure we have not missed a best practice. Through a series of Standard CMMI Appraisal Method for Process Improvement (SCAMPI) Class C, B, and A appraisals performed over the past three years, we are seeing increased compliance with the model. We have been measuring progress toward Capability Level 3 in Configuration Management, Project Planning, Project Monitoring and Control, Requirements Management, and Requirements Development, and we are also beginning to measure compliance for Measurement and Analysis, Process and Product Quality Assurance, and Verification and Validation.

From a business perspective, we are seeing positive results in shortening the overall duration of our projects and improving our ability to meet our committed delivery dates. This is done by paying particular attention to the planned versus actual time in each individual phase.

In addition, our quality assurance audit team discovered that process compliance had improved as a result of the simplification. Since the processes had changed so dramatically, it was not possible to do an apples-to-apples comparison of the audit results prior to simplification, but feedback from both auditors and project teams indicated that compliance was better and easier to achieve.

Summary

In our process engineering journey, we learned the following things:
• First and foremost, we need to get our stakeholders involved in every step of the development process. This ensures quicker acceptance by project teams and uncovers usability problems earlier in the process.
• Second, it is essential to have a business focus rather than a model focus. It is easy to lose sight of business objectives when trying to comply with CMMI requirements.
• Third, we should start by building processes as small as possible and add to them only as needed. We learned that when large processes are built first, the true cost extends beyond development to the significant effort required to support and maintain them.
• Finally, time is the most precious commodity in process improvement. Getting a quick win that the organization would accept is more important than perfection.
In the end, it is all about being both efficient and effective.

About the Authors

Robyn Plouse, PMP, is a SCAMPI lead appraiser and a quality auditor at Intel's IT group, where she also leads the CMMI Community of Practice. She has 10 years of project management and quality experience with Intel and nine years of experience as a systems engineer at Lockheed Martin. She holds a bachelor of science degree in aerospace engineering from San Diego State University and a master's degree in engineering management from Santa Clara University.

Intel Corporation
M/S RR5-505
1600 Rio Rancho BLVD S.E.
Rio Rancho, NM 87124
Phone: (505) 893-6319
E-mail: [email protected]

Mark Brodnik, Project Management Professional (PMP), is a project manager at Intel's IT group, where he specializes in process development and process improvement. He has 15 years of IT software applications development and management experience and holds a bachelor of science degree in management information systems from the University of Arizona.

Intel Corporation
M/S C11-123
6505 W Chandler BLVD
Chandler, AZ 85226
Phone: (480) 552-7028
E-mail: [email protected]

Terry Leip is the lead quality assurance auditor at Intel's IT group. He has more than 20 years of software development and quality experience with Intel and Lockheed Martin and holds a bachelor of science degree in biology and computer science from Grand Canyon University.

Intel Corporation
M/S C11-123
6505 W Chandler BLVD
Chandler, AZ 85226
Phone: (480) 552-2079
E-mail: [email protected]


Small Project Survival Among the CMMI Level 5 Big Processes

Alan C. Jost, Raytheon

In the past several years, many engineering firms have stepped up and achieved Capability Maturity Model Integration (CMMI) Level 5 [1]. Currently, 13.9 percent of the 2,140 appraised organizations have achieved CMMI Level 5 using the Standard CMMI Appraisal Method for Process Improvement (SCAMPI). To achieve this maturity level, the organization must implement a rigorous process definition to address the CMMI model through Maturity Level 5. Once defined, the organizational process must be implemented throughout the organization, even by small projects. I will examine how, in my business unit, we have successfully achieved CMMI Level 5 across five geographically dispersed business units and how small projects have survived within our significantly large process definition.

The Raytheon Network Centric Systems (RNCS) has five major engineering centers across the country. An interesting footnote is that these five major engineering sites had five completely different process definitions because, before they were part of a common business unit, they were actually competing companies. The Northeast location in Marlborough, Massachusetts, where I work, was part of the original Raytheon. The St. Petersburg, Florida, location was originally part of E-Systems. The Fullerton, California, location was part of Hughes Defense. The Fort Wayne, Indiana, location was also part of Hughes, but a different internal organization. The McKinney, Texas, location was part of Texas Instruments. Each had its own organizational process definition, and in the past each had conducted separate Capability Maturity Model evaluations and CMMI appraisals ranging from maturity levels 3 to 5 and for a variety of discipline combinations: software or software-systems.

Raytheon has defined and implemented a company-wide Integrated Product Development System (IPDS) that provides a program-level process for all Raytheon programs. IPDS provides an extensive process definition describing what should be done throughout a program; the organizations managing programs had to define processes on how IPDS was going to be implemented in their programs. These organizational processes were the ones the RNCS organizations used in their Software Capability Evaluations (SCEs) or SCAMPIs and, of course, they were all different. Then the enlightened senior management of RNCS made a crucial business decision that all five sites were going to have a common engineering implementation of IPDS, with the ultimate goal of having the capability to move work across the five sites. The five sites endeavored over a couple of years to define, deploy, and implement the RNCS Common Process Architecture (CPA). The effort ultimately led to the five major engineering centers successfully conducting our SCAMPI, which resulted in a Maturity Level 5 rating reported to the Software Engineering Institute (SEI) as Raytheon Network Centric Systems (Engineering – systems engineering, software, hardware [SE/SW/HW]) projects with engineering development in the scope of the project [2].

We included hardware amplifications in order to include our hardware engineering discipline in the appraisal. In addition – and key to our success – we included our internal program engineering discipline. The key to our successful SCAMPI was the definition and deployment of the organizational CPA across the five RNCS sites. Let us examine this CPA organizational process definition.

CPA
The design of the CPA followed the typical life cycle that our programs follow. We initially started creating the CPA with a requirements definition phase where we used our Raytheon IPDS, the CMMI model, and International Organization for Standardization (ISO) 9000 to provide the requirements for the CPA. As we defined our CPA, the identified requirements were entered in our requirements database and then were flowed down to organizational behavior activities. I am not going into the details of the overall CPA process, which is not the focus of this article. However, to make a very long story short, the main artifacts of the CPA definition were a series of CPA Work Instructions (WIs).


Work Instruction: Detailed Planning
Number: U0141CRP    Revision: B (Supersedes A)    Date: 01/22/07

Introduction: The purpose of the Detailed Planning WI is to direct a program's detailed engineering planning activities based on the initial planning outputs and the scope of the awarded contract as part of preparing for the program's Gate 5. During the detailed planning activities, those work products from the initial planning activities are reviewed and updated or expanded as necessary. Any planning work products that may not have been produced as a result of the initial planning activities, but are determined to be required after contract negotiations, will need to be created as part of the detailed planning activities.

The complete set of detailed planning activities is defined by this WI and the supporting Plan Generation Work Instructions for the various program plans. Refer to Appendix A for an overview of the plans hierarchy. A program can use the NCS Engineering process to tailor the engineering plans identified in Appendix A based on the needs of the program. The program plans should be put under configuration control per the NCS Work Product Management Work Instruction.

Input(s): Contract Documents; Engineering Cost Estimates; IPDS Tailoring; NCS Engineering Tailoring; Program Organization Structure; Initial Engineering Program Plans; Lessons Learned Repository; Process Assets Library; Work Breakdown Structure (WBS); Engineering Budget.

Process Summary: ID, Steps at a Glance, Responsible Role (see Figure 2).

Figure 1: Introduction to WI (Note 1)


There are 88 WIs, to be exact. Each of these has associated with it enablers to assist in process implementation, such as software tools, templates for required documents, process checklists, and other enablers to be used by all programs at the five engineering sites. Naturally, the CPA organizational process definition is substantial in detail and volume, but one of the major cost advantages is that while it consists of a large amount of documentation, it pales in comparison to having five different sites, each with its own process definition, set of tools, templates, and enablers. It was common and had to be used by all programs. The CPA has to be robust enough to support programs with engineering staffs of more than 100 engineers, such as the next-generation destroyer for the Navy or the replacement of all the Air Traffic Control systems in the United States. Well, the CPA is robust enough not only to support these substantial programs being implemented across the five engineering sites, but also to support CMMI Maturity Level 5 processes.

What about the programs that do not have a marching army of engineers with budgets in the millions and billions that are executed over multiple years? How do these small programs deal with the substantial organizational process that was formally appraised at CMMI Level 5 and is now required by all RNCS programs? It can be wrapped up in one word: tailoring. Tailoring is a key organizational process that is used by large, medium, and small programs. All programs can eliminate WIs and can accept, reject, or modify the WI requirements, as well as modify the contents of the plans as needed by each program's contracts; it is the extent and depth of the tailoring that allows the small programs to survive. However, before getting into tailoring, it is important to understand the general structure of the CPA WIs.

CPA WI Structure and Tailoring
Every WI has the same format structure (see Figures 1 through 4; each one will be detailed in this section).

Figure 1 is the first part of the CPA WI that provides an introduction to the WI process, the WI name, and configuration management information for the WI. It also provides the inputs necessary for the process. This particular WI is the main WI used by the typical large program to perform the detailed planning of the program during the program start-up phase that occurs right after contract award. The Detailed Planning WI defines all the steps that the program management team accomplishes in the approximately 45 days after the contract is signed. In actuality, some of the Detailed Planning WI steps require updates to documents and other artifacts previously drafted during the Initial Planning WI steps that were accomplished during the proposal stage of the project. From the Initial Planning WI there are several outputs that are also inputs to the Detailed Planning WI, for example, the Initial Engineering Program Plans and the Work Breakdown Structure (WBS). Typically, several of the Initial Engineering Program Plans (i.e., Software Development Plan [SDP], System Engineering Plan [SEP]) are drafted during the proposal stage; in the beginning of the program, after contract award during program start-up, they are reviewed and finalized. Likewise, the WBS is generated during the proposal time and may have to be adjusted depending on the results of contract negotiations. A portion of the Detailed Planning WI is shown in Figure 2.

In Figure 2, you see the first three requirements, or Steps at a Glance, of the 33 steps in the detailed planning process. Interestingly, these Steps at a Glance were flowed down as requirements from IPDS, CMMI, and ISO 9001 into the detailed planning behavior process (Figure 3).

Figure 3 shows the expected outputs of the detailed planning process as implemented by the CPA Detailed Planning WI, which include multiple engineering plans and the WBS. These are artifacts produced or updated as the detailed planning process is implemented. Most of the plans generated from this work instruction are updates to those drafted in the initial planning and proposal phase. Each of the plans is enabled through the use of templates with embedded instructions on what information should be included in the plans (see Figure 4).
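Taken together, Figures 1 through 3 describe a consistent record structure for every WI: identification, inputs, Steps at a Glance (each with a requirement, guidance, and a responsible role), and outputs. A minimal sketch of that structure follows; the class and field names are invented for illustration and are not taken from the CPA or the iPlan tool.

```python
# Minimal sketch of the WI record structure described in Figures 1 through 3:
# identification, inputs, Steps at a Glance (requirement + guidance + role),
# and outputs. Class and field names are invented for illustration only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Step:
    step_id: int
    name: str                     # e.g., "Update Program's Life Cycle Model"
    requirement: str              # the CPA process requirement for this step
    guidance: List[str]           # how-to guidance items
    responsible_roles: List[str]

@dataclass
class WorkInstruction:
    number: str                   # e.g., "U0141CRP"
    title: str                    # e.g., "Detailed Planning"
    revision: str
    inputs: List[str] = field(default_factory=list)
    steps: List[Step] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)

detailed_planning = WorkInstruction(
    number="U0141CRP", title="Detailed Planning", revision="B",
    inputs=["Contract Documents", "Engineering Cost Estimates"],
    steps=[Step(1, "Define Planning Strategy for Detailed Planning",
                requirement="Define the planning strategy.",
                guidance=["Review the initial planning outputs."],
                responsible_roles=["Program Engineer", "Program Planning Team"])],
    outputs=["Integrated Master Plan", "WBS"],
)
print(len(detailed_planning.steps), "of 33 steps shown")
```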

Process Summary (first three of the 33 Steps at a Glance):
ID  Steps at a Glance                                Responsible Role
1   Define Planning Strategy for Detailed Planning   Program Engineer; Program Planning Team
2   Update the SEP, HDP, SDP, and QPP                Program Engineer; Program Planning Team
3   Update Program's Life Cycle Model                Program Planning Team
Note: SEP – System Engineering Plan, HDP – Hardware Development Plan, SDP – Software Development Plan, and QPP – Quality Program Plan.

Figure 2: Detailed Planning Steps at a Glance (Note 2)

Output(s): Earned Value Management System (EVMS) baseline; Engineering Cost Estimates; Engineering Decision Analysis and Resolution Plan; Engineering Measurement, Analysis and Improvement Plan; Engineering Training Plan; Work Project Management Plan (WPMP); Gate Review Plan; Hardware Development Plan; Integrated Master Plan; Integrated Master Schedule; Integration, Verification and Validation Plan; IPDS Tailoring; NCS CPA Tailoring; Planning Strategy; Program Organization Structure; Program's Security Plan; Quality Program Plan; Risk and Opportunity Management Plan; Software Development Plan; Systems Engineering Plan; WBS.

Figure 3: Outputs From the Detailed Planning Process (Note 3)


Figure 4 shows one of the 33 Steps at a Glance in the Detailed Planning WI. It shows how each step is expanded into the CPA process requirement under the Requirement heading, along with its associated Guidance on how to implement the Step at a Glance CPA requirement. Naturally, there is a WI for tailoring the WIs, which in itself can also be tailored (see Figure 5). Along with the Tailoring the CPA Process WI, the organization has provided a Guidance enabler on tailoring that includes guidance for tailoring small programs as well.

Figure 5 shows one of the main steps in the tailoring process – this is actually step 8 of the 12 steps (see Step 8 in Figure 6).

As you can see, most of the tailoring WI describes how the tailoring process is conducted (Steps 1-7 and 9-11 in Figure 6), how the tailoring decisions are stored in iPlan (Step 8), and how the tailoring is approved by the stakeholders and finally approved by the ultimate stakeholder (Step 12), the Engineering Process Group (EPG). So, a program cannot go wild and tailor out important portions of the CPA. Tailoring is a key concept in the CMMI model, it is a critical task for programs using our CPA, and it is critical to survival for small programs using the CPA. The model recognizes that many factors impact the process implementation on a particular program, for example:
• Program size.
• Program complexity.
• Contractual requirements.
• Customer deliverable requirements and format requirements.
• Specific measurements required by the organization and the customer.
As far as our CPA tailoring goes, each of the WI requirements can be accepted (i.e., accepted as written), rejected (i.e., deleted from the work instruction), or modified (i.e., rewritten, such as an SDP that will be written to customer format versus CPA format). The Guidance for each of the CPA requirements can be adjusted for the specifics of the program and contractual requirements, the output documents can be rejected or modified to meet the contractual requirements, and they can be adjusted for the program size. All tailoring is captured in the iPlan tool as modified WIs for the specific program. The iPlan tool provides a Program Tailoring Workspace – a virtual library for the program's approved process definition. The iPlan tool captures all the tailoring done to each of the CPA WIs, the EPG-approved justification for all the tailoring, and the actual modified WIs for reference by program engineers during implementation. I must emphasize again that all tailoring is approved by the EPG before the program's tailored process is implemented. For larger programs with extended schedules, the tailoring of the CPA process can be done incrementally by program phase, so you do not have to tailor the system test process if that process is not needed for several years. This process flexibility is expected in the CMMI and is an integral part of the RNCS CPA tailoring process. It is especially important for the small programs that have to survive among the big program processes, which leads us to small project tailoring.
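Figure 5 also spells out how iPlan rolls the individual WIR decisions up to a utilization status for each WI. The sketch below captures only that rollup rule; the function and data names are invented, and this is not the iPlan tool itself.

```python
# Sketch of the WI utilization rollup described in Figure 5: a WI is "Accept"
# when every WIR is accepted, "Reject" when every WIR is rejected, and
# "Modified" otherwise. Function and data names are invented for illustration.
from typing import List

def wi_utilization(wir_decisions: List[str]) -> str:
    """Each entry is one WIR decision: 'Accepted', 'Rejected', or 'Modified'."""
    if all(d == "Accepted" for d in wir_decisions):
        return "Accept"
    if all(d == "Rejected" for d in wir_decisions):
        return "Reject"
    return "Modified"  # one or more (but not all) WIRs rejected or rewritten

# Example: a small program rejects two WIRs and rewrites one in a single WI
print(wi_utilization(["Accepted", "Rejected", "Rejected", "Modified"]))  # Modified
```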

Small Project Tailoring
One of the first steps, of course, is to understand what it means to be a small project. In the beginning of the CPA, it was more a concept of what a small program consisted of, i.e., less than a year in duration, a small budget, a small team, etc. Small programs are now formally defined as those with budgets between $500,000 and $4 million, less than a year in duration, and/or a team of eight or fewer engineers from all disciplines. We now have a definition of a small program, which is helpful when we enter the tailoring process. The tailoring process is formalized during the start of the program. Now, one of the major structural aspects of Raytheon's IPDS, reinforced in the RNCS CPA, is the program gating process. Formal gates are reviewed at specific engineering phases: program start-up (gate 5), system requirements review (gate 6), preliminary design (gate 7), detailed design (gate 8), formal testing (gate 9), etc.

Step 3: Update Program's Life Cycle Model
Requirement: Update and document the development life-cycle model that will be used.
Guidance:
1. Review and update as necessary the program's life cycle phases on which to scope the detailed planning effort. Refer to the Life Cycle Models in the Raytheon Process Assets Library (RayPAL) for descriptions of candidate models. See the References section for further information on accessing these models in RayPAL.
2. Document the life cycle phases in the applicable program plans such as the SEP, SDP, or Hardware Development Plan (HDP).
3. Update the systems engineering capabilities expected at the end of each applicable life cycle phase in the SEP.
4. Update the software engineering capabilities expected at the end of each applicable life cycle phase in the SDP.
5. Update the hardware engineering capabilities expected at the end of each applicable life cycle phase in the HDP.

Figure 4: Step 3 of Detailed Planning Work Instruction (Note 4)

Step 8: Make Tailoring Decisions
Requirement: Tailor the CPA Work Instruction Requirements (WIRs) for each WI.
Guidance:
1. For each work instruction provided by the iPlan tool, the tailoring team evaluates the work instruction requirements to determine the program tailoring decisions. WI Requirement decisions are: (a) Accepted, (b) Rejected, (c) Modified. The Tailoring Decisions and Work Instruction tab in the Tailoring Workspace of the iPlan tool provides visibility into the WIRs for each WI, allows the program to select or deselect specific WIRs, and provides a utilization summary for each WI. The Library Utilization section of the Work Instruction tab is used to record the program's tailoring decisions with regard to a specific set of WIRs. Work Instruction Utilization Acceptance is automatically determined by iPlan based on the following: (a) Accept – all WIRs accepted; (b) Modified – one or more (but not all) WIRs rejected; (c) Reject – all WIRs rejected.
2. This process is repeated for all work instructions provided for this tailoring workspace. Note that the tailoring workspace may be different for each discipline and for the program level (multi-discipline).
3. The tailoring team consults with the EPG representative or iPlan SME to evaluate the effect on higher requirements when WIRs have not been accepted. Although tailoring is not to be considered something negative, and programs should do what is smart for their business, the organization has an interest in understanding how and why programs are deviating from the organizational standards. Keeping with this notion, when a program tailors or rejects WIRs, the Notes field for that set of WIRs must be filled in to address why the program is deviating from the organizational standard.

Figure 5: Main Step in Tailoring Work Instruction (Note 5)


As a matter of fact, we have 11 formal gates: four before contract award and seven after contract award. The important one for small project tailoring is our gate 5, known as program start-up. Every program, right after contract award, has 45 days to complete the program start-up process and review the overall program approach with the various senior management levels associated with the execution of the program. Key to this start-up process is the tailoring of the CPA WIs to describe how each WI is going to be implemented on the program. As previously stated, in the proposal phase several of the engineering plans may have been written in draft form or written as part of the proposal. After contract award, during the program start-up phase, the various engineering plans are either going to be updated from the drafts written in the proposal phase or written in the start-up phase. The engineering plans consist of the following:
• SEP.
• SDP.
• HDP.
• Stakeholder Involvement Plan (SIP).
• Measurement Analysis and Improvement Plan (MAIP).
• Integration, Verification and Validation Plan (IVVP).
• Quality Program Plan (QPP).
• WPMP and Work Product List (WPL).
• Several others.

As you can imagine, this can be an intensive period for the managers involved with the program. The small program tailoring of the CPA WIs is a heavy-duty effort. Just think of trying to pare down 88 WIs in order to describe how you are going to implement the program using these tailored CPA WIs. The first group of small programs pioneered their way through this tailoring process, and their tailoring justifications were reused as the various plans were adjusted and reused. Well, it did not take long to determine that an organizational process improvement was necessary to assist these small programs through the program start-up and tailoring processes: enter the Small Program Variant (SPV) Planning WI.

SPV Planning
The RNCS CPA Engineering Councils collaborated and generated a new WI, called the SPV Planning WI, to be used by small programs to do their program planning. It essentially provides pre-tailoring of the CPA WIs as a starting point. One of the major steps in the SPV planning was the creation of the Small Program Engineering Plan (SPEP) template. The SPEP template combined the major portions of the SEP, SDP, HDP, and IVVP into the single SPEP. Furthermore, additional planning activities associated with large programs were reduced for the small programs. For example, the amount of measurement data is reduced, such as staffing, since it is easy to keep track of three or four people. Typically, small programs deal with only one or maybe two of the engineering disciplines, so specific discipline WIs can be eliminated. For example, a software-only job can eliminate all the WIs associated with Hardware Preliminary Design, Hardware Detailed Design, Hardware Testing, etc. The details of stakeholder involvement can be reduced on small teams. Details such as formal weekly team meetings can be eliminated, especially when the team is two to three people all working in the same cubicle. Formal reviews are reduced to three- to four-page review packages versus the 35- to 40-page detailed, senior manager review packages.

Small programs still have to deal with 78 WIs and the tailoring of those, but with the creation of the SPEP a significant workload was removed from the programs. There are, however, in the SPV Planning WI some other engineering plans, even for the small program, that have to be created and implemented, such as the Risk and Opportunity Management Plan, the MAIP, the WPMP, the WPL, and the SIP. On my small programs, I have even been able to convince the EPG that some of these required standalones could be down-scoped and incorporated into my program's single SPEP, thus reducing the coordination and sign-off cycle of multiple, individual plans; it was all done using just my one SPEP.
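Conceptually, the SPV pre-tailoring acts as a filter over the WI library: discipline-specific WIs drop out when the discipline is not on contract, and several standalone plans fold into the single SPEP. The sketch below illustrates only that idea; the WI names, the discipline tags, and the folding list are invented examples rather than the actual CPA content.

```python
# Conceptual sketch of SPV pre-tailoring: discipline-specific WIs are dropped
# when the discipline is not on the program, and several standalone plans are
# folded into the single SPEP. WI names, discipline tags, and the folding list
# are invented examples, not the actual CPA library.

ALL_WIS = {
    "Detailed Planning": None,                    # multi-discipline, always kept
    "Software Preliminary Design": "software",
    "Software Detailed Design": "software",
    "Hardware Preliminary Design": "hardware",
    "Hardware Detailed Design": "hardware",
    "Hardware Testing": "hardware",
    "Systems Requirements Analysis": "systems",
}
FOLDED_INTO_SPEP = ["SEP", "SDP", "HDP", "IVVP"]  # plans combined into the SPEP

def applicable_wis(disciplines):
    """Keep multi-discipline WIs plus those matching the program's disciplines."""
    return [wi for wi, disc in ALL_WIS.items() if disc is None or disc in disciplines]

print(applicable_wis({"software"}))   # hardware WIs eliminated for a software-only job
print("Plans combined into the SPEP:", FOLDED_INTO_SPEP)
```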

Other allowable tailoring is the combination of engineering phases. Typically, small programs are associated with extending the functionality of product lines where the preliminary design of the functionality is already known. So, small programs can tailor the conduct of two of the engineering gate reviews into one: the preliminary design and detailed design phases and the associated gates 7 and 8 (Preliminary Design Review [PDR] and Critical Design Review [CDR]). Even more typically, the system requirements are usually small changes to the product line, and the System Functional Requirements Review (gate 6) can also be rolled into the combined PDR/CDR. This is all dependent upon the tailoring and permission given by the EPG during the EPG's tailoring approval meeting.

Implementation of Small Programs
With the SPV Planning WI, how does the implementation of small programs differ from the standard programs? To start, the number of WIs to tailor is reduced to 78 WIs for the SPV, down from the 88 used in standard programs. The combined SPEP document combines several of the required plans and averages around 45 to 60 pages.

ID  Steps at a Glance                 Responsible Role
1   Establish Tailoring Team          Program Engineer
2   Use CPA Tailoring Guidelines      Tailoring Team
3   Establish Tailoring Workspace     Tailoring SME
4   Execute Tailoring Kickoff         Program Engineer
5   Select Program Characteristics    Tailoring Team
6   Identify Stakeholders             Tailoring Team
7   Document Tailoring Stakeholders   Tailoring Team
8   Make Tailoring Decisions          Tailoring Team
9   Document Planning Strategy        Tailoring Team
10  Capture Tailoring Workspace       Program Engineer
11  Schedule Stakeholder Review       Program Engineer
12  Obtain Stakeholder Approval       Program Engineer; Tailoring Stakeholders

Figure 6: Steps at a Glance of the Tailoring Process (Note 6)



The volume of the individual plans needed for standard programs collectively can be in the hundreds of pages. Measurements are key concepts in high maturity organizations. Regardless of the size of the program (standard, small, or micro), it needs to collect the typical EVMS metrics needed to analyze the Schedule Performance Index (SPI) and Cost Performance Index (CPI); a worked sketch of these ratios follows the Tailoring Guidelines excerpt below. The difference, however, is that the number of cost accounts tracked by small programs is around 30 to 40, whereas in standard programs the number of cost accounts can be in the hundreds. Naturally, on small programs the number of data points collected is quite limited. Nevertheless, there are other measurements recommended in the small program tailoring guidelines, in addition to CPI/SPI, to manage these small programs. From the Tailoring Guidelines:

The metrics that a [small] program collects is based on the characteristics of the program and customer requirements. The following is a recommended list of metrics that a small program should collect unless there is a valid reason to tailor out, such as:
• CPI/SPI.
• Defect Containment [which are collected through peer review/inspection data and system trouble reporting data].
• Requirements volatility.
• Any (i.e., hardware, software, systems) applicable engineering productivity measures. (Note 7)
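For reference, the EVMS ratios named above are simple quotients, and defect containment is commonly expressed as the fraction of defects caught before release. The sketch below works the arithmetic with invented numbers; none of the figures come from the programs discussed in this article.

```python
# Standard EVMS ratios referenced above (CPI = EV / AC, SPI = EV / PV) plus a
# common form of the defect containment measure. All numbers are invented
# purely to illustrate the arithmetic; none come from the programs discussed.

def cpi(earned_value, actual_cost):
    return earned_value / actual_cost      # > 1.0 means under budget

def spi(earned_value, planned_value):
    return earned_value / planned_value    # > 1.0 means ahead of schedule

def defect_containment(found_before_release, total_defects):
    """Fraction of defects caught by peer reviews/inspections before release."""
    return found_before_release / total_defects

ev, ac, pv = 180_000.0, 200_000.0, 190_000.0
print(f"CPI = {cpi(ev, ac):.2f}")                                 # 0.90
print(f"SPI = {spi(ev, pv):.2f}")                                 # 0.95
print(f"Defect containment = {defect_containment(47, 50):.0%}")   # 94%
```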

The small program task managers indicate in their MAIP what measurements they use to manage their programs. The MAIP, which identifies the program's measurements, requires approval by the EPG. Other adjustments are allowed for small programs during the tailoring process, such as the following:
• Even though they start with 78 potential WIs, since disciplines such as hardware can be tailored out if there is no hardware development, the number of applicable WIs can get down to about 20, and those can be substantially tailored even further.
• Monthly review packages (35-40 slides) are reduced to four square charts (three to four slides).
• Process Support Team (PST) meetings, where metrics analysis and process compliance checks are performed, are reduced both in team size and in the time to conduct the PST meetings.
• The 10 required plans can be combined with the SPEP, the implementation of the various discipline plans is not as detailed, and the associated enablers and templates are also reduced in size.
All the tailoring, even though much reduced, still requires a significant effort by the small program's management team. Therefore, the next step is for the EPGs and the senior level discipline councils to provide even more pre-approved tailoring for each of the WIs and the WIs' associated enablers. Going one step further, the follow-on pre-tailoring activity will address micro-programs. These micro-programs are even smaller than the small programs I just discussed. The micro-programs typically have work effort content between 1,000 and 8,000 staff hours. Currently, there are a number of WIs being piloted by the RNCS organization that target these micro-programs. Yes, I know what you are thinking – what about the projects under 1,000 hours ... small projects among big trees; ever think about where toothpicks come from?

Summary
If we used the CMMI through Maturity Level 5 itself as a process description, it would be well over 600 pages. So when you establish your standard processes and actually write the organizational process description, you can see that all the process-oriented documentation and implementation of these processes can become an insurmountable issue for small programs (e.g., small budget, small team, short schedule). The CMMI recognizes that tailoring is key to the successful implementation of the organizational processes for the variety of programs that organizations have in their business. It is up to the organization to provide the guidance, methodology, and expectations for the small programs that are being executed in the same process environment as the big trees. An organization has to have an organizationally approved tailoring method for the small programs in order for them to survive and be successful while still supporting the essence of the organizational processes. Tailoring is a life-saver for our small programs, and it also supports our overall CMMI Maturity Level 5 rating.

References
1. SEI. "Process Maturity Profile, CMMI v1.1, SCAMPI v1.1 Class A Appraisal Results: 2007 Mid-Year Update." Sept. 2007.
2. SEI, Carnegie Mellon University. Published Appraisal Ratings. June 2007 <http://sas.sei.cmu.edu/pars/pars.aspx>.

Notes
1. RNCS Common Process Architecture Detailed Planning Work Instruction, #U0141CRP, dated 01/22/2007, pg. 1.
2. RNCS Common Process Architecture Detailed Planning Work Instruction, #U0141CRP, dated 01/22/2007, pg. 1.
3. RNCS Common Process Architecture Detailed Planning Work Instruction, #U0141CRP, dated 01/22/2007, pg. 2.
4. RNCS Common Process Architecture Detailed Planning Work Instruction, #U0141CRP, dated 01/22/2007, pg. 3.
5. RNCS Common Process Architecture Tailoring the CPA Process Work Instruction, #U0141CON, dated 01/12/2007, pg. 6.
6. RNCS Common Process Architecture Tailoring the CPA Process Work Instruction, #U0141CON, dated 01/12/2007, pg. 1.
7. CPA Tailoring Guidelines, Enabler, U0141H10, rev. D, dated 10/05/2007, pg. 11.

About the Author

Alan C. Jost (Lt. Col., U.S. Air Force, Ret.) is a senior software program manager in the Raytheon Northeast Software Engineering Center (SWEC), where he works multiple tasks, including the Software Engineering Program Group Executive Committee for SWEC's CMM/CMMI Level 5 sustainment, and as a process engineer on the AutoTrac III Air Traffic Control Product Line and DD(X) ExComms Software Cross Product Team. Jost joined Raytheon after 20 years in the Air Force. He has served in a variety of line management, process engineering, and software task management roles in SWEC.

Raytheon
MS 3-1-3914
1001 Boston Post RD
Marlborough, MA 01752
Phone: (508) 490-4282
Fax: (508) 490-1366
E-mail: alan_c_jost@raytheon.com


Open Forum


Field Guide to Provide Step-by-Step Examples for Improving Processes in Small Settings

Caroline Graettinger, Suzanne Garcia, and William Peterson, Software Engineering Institute
Christian Carmody, University of Pittsburgh Medical Center
M. Lynn Penn, Lockheed Martin Corporation

To help organizations in small settings pursue process improvement, the Software Engineering Institute (SEI) is inviting contributors to help develop a field guide with how-to guidance, examples, templates, checklists, and other information.

If you work for a small business, participate on a small team, belong to a small business unit, or work on small projects, then you work in a small setting and are likely familiar with the challenges of process improvement in these contexts.

Consider the following example from a small business: The Chief Executive Officer complains that she does not have enough cycles or resources to try out even one of several process improvement concepts that her various customers are asking for. She recognizes the risks of inaction, but the cost seems prohibitive, and she has never used consultants before. Her employees are all busy, and many believe that it's always worked fine the way it is. She would like to have repeatable, predictable work processes that produce quality products and services and enable her company to stand out from the crowd. She knows she needs some kind of process improvement activity. But is her company ready? Is she ready? How can they become ready? What can they do themselves, and how can they be better consumers of process improvement products and services?

The SEI has heard stories like this from small settings around the world and began to explore this arena in 2003 and 2004 with a pilot study in Huntsville, Alabama. The study resulted in new knowledge and ideas for how to accelerate implementation of one improvement methodology – Capability Maturity Model Integration (CMMI) – in small businesses [1, 2]. This was followed by an insightful workshop in October 2005 involving researchers from around the world [3].

Based on this work and recommendations from the International Process Research Consortium, the SEI launched the Improving Processes in Small Settings (IPSS) project, in collaboration with University of Pittsburgh Medical Center (UPMC) and Lockheed Martin Corporation (LMCO). Why would UPMC and LMCO – whose employees number in the tens or hundreds of thousands – be interested in a project for small settings? Because, like many large organizations, they are amalgams of many small projects and business units, with myriad small business partners and suppliers.

The first IPSS project is the Field Guide for Improving Processes in Small Settings. The guide is not constructed like CMMI or any other process improvement model or framework; it is a collection of how-to guidance for process improvement in small settings, independent of the process model or standard used. We intend it to help fast-track the improvement effort and convey the scope of effort and skills involved at each step so that the small-setting practitioner can be a smarter consumer of process improvement products and services or be better at doing it themselves, whichever they choose.

The information in the field guide is organized under six competencies: (1) building and sustaining sponsorship and ownership; (2) developing and measuring realistic goals; (3) developing and sustaining a process improvement infrastructure; (4) defining and describing processes; (5) developing new or improved processes; and (6) determining improvement progress. Each competency comprises a set of activities that describe what to do to achieve that competency, and each activity comprises a set of tasks that explain how to do each activity.

Our plan for populating the field guide includes collecting real-world experiences from experts across the process community who can provide knowledge, techniques, examples, checklists, scripts, and other artifacts to help others succeed in small settings. The guidance will include step-by-step tasks for various situations and constraints of the small setting.

We welcome the involvement of small settings experts, citizens, and their stakeholders to help accelerate the development of the field guide. There are currently three ways to participate:
• Become a project affiliate and work directly on the guide at various stages.
• Become a project sponsor, which enables organizations to influence the priority order of the guide's development and content for their particular needs.
• Complete the brief survey we have created to collect information at <www.sei.cmu.edu/iprc/ipss.html>.
For more information on the field guide and the IPSS project, please visit <www.sei.cmu.edu/iprc/ipss.html>.

References
1. Chrissis, M., M. Konrad, and S. Shrum. "CMMI: Guidelines for Process Integration and Product Improvement v1.2." Boston: Addison-Wesley, 2006.
2. Garcia, S. "Highlights from Piloting CMMI With Two Small Companies." Proc. of the First International Researcher's Workshop on Process Improvement in Small Settings. Pittsburgh: SEI, Carnegie Mellon University (CMU), 2006 <www.sei.cmu.edu/publications/documents/06.reports/06sr001.html>.
3. Garcia, S., Caroline Graettinger, and Keith Cost. "Proc. of the First International Researcher's Workshop on Process Improvement in Small Settings." Pittsburgh: SEI, CMU, 2006 <www.sei.cmu.edu/publications/documents/06.reports/06sr001.html>.


About the Authors

The authors are members of the IPSS project at the SEI and were part of the International Process Research Consortium that, from 2004-2007, brought together leaders from the international process community to explore strategic research directions in software and systems process. Caroline Graettinger, Suzanne Garcia, and William Peterson are senior members of the SEI CMU technical staff. Christian Carmody is Director of Process and Performance Improvement for UPMC. M. Lynn Penn is Director of Process Management at LMCO Integrated Systems and Global Services.

Carnegie Mellon Software Engineering Institute
Phone: (412) 268-6109



WEB SITES

Proceedings of the First International Research Workshop for Process Improvement in Small Settings, 2005
www.sei.cmu.edu/publications/documents/06.reports/06sr001.html
The first International Research Workshop for Process Improvement in Small Settings was held October 19-20, 2005, at the Software Engineering Institute in Pittsburgh, Pennsylvania. Attendees from Australia, Canada, Chile, China, Germany, Ireland, India, Japan, Malaysia, Mexico, Spain, and the United States discussed the challenges of process improvement in small and medium-size enterprises, small organizations within large companies, and small projects. The presentations addressed starting and sustaining process improvement, qualitative and quantitative studies, and using Capability Maturity Model Integration (CMMI), Agile, Modelo de Procesos para la Industria de Software, International Organization for Standardization, Quality Function Deployment, and Team Software Process in small settings. The workshop also had working groups that discussed issues unique to small settings, such as regional support centers and process improvement on a shoestring. This report includes the papers from this workshop and presents conclusions and next steps for process improvement in small settings. This report also contains the workshop breakout session results.

Extreme Programming
www.extremeprogramming.org
Extreme Programming (XP) is a deliberate and disciplined approach to software development. About eight years old, it has already been proven at many companies of all different sizes and industries worldwide. This web site gives an overall view of XP and presents numerous resources for learning more about the programming method.

Software Technology Support Center
www.stsc.hill.af.mil
In 1987, the U.S. Air Force selected Ogden Air Logistics Center (OO-ALC), Hill Air Force Base, Utah, to establish and operate its Software Technology Support Center (STSC). It was chartered to be the command focus for proactive application of software technology in weapon, command and control, intelligence, and mission-critical systems. The STSC provides hands-on assistance in adopting effective technologies for software-intensive systems. We help organizations identify, evaluate, and adopt technologies that improve software product quality, production efficiency, and predictability. We help others buy and build software and systems better. We use the term technology in its broadest sense to include processes, methods, techniques, and tools that enhance human capability. Our focus is on field-proven technologies that will benefit the Department of Defense mission.

Software Engineering Institute's Capability Maturity Model Integration Web Site
www.sei.cmu.edu/cmmi
The CMMI is a process improvement approach that provides organizations with the essential elements of effective processes. It can be used to guide process improvement across a project, a division, or an entire organization. CMMI helps integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising current processes. This page points you to places where you can find more information about CMMI and describes the worldwide adoption and benefits of CMMI.

Practical Software and Systems Measurement
www.psmsc.com
Practical Software and Systems Measurement (PSM) was developed to meet today's software and system technical and management challenges. It is an information-driven measurement process that addresses the unique technical and business goals of an organization. The guidance in PSM represents the best practices used by measurement professionals within the software and system acquisition and engineering communities.



You're Invited: CrossTalk's 20th Anniversary

CrossTalk, The Journal of Defense Software Engineering, is celebrating its 20th anniversary. From its inception, CrossTalk's goal has been to inform and educate readers on software engineering processes, policies, and other technologies. We love hearing from our readers and friends on how we are doing. As a free journal, these comments are the lifeblood of our existence: They keep our staff focused and our sponsors motivated and pleased. If you find reading CrossTalk saves you time and money, helps improve your processes, has helped save your project, or makes your life easier, we want to know about it. The comments we receive will be considered for inclusion in our anniversary issue this coming August.

Please send your feedback to CrossTalk's Publisher, Beth Starrett, at [email protected].


BACKTALK


Small Boats Among the Big Ships

You've probably heard of the U.S. Navy's famous Patrol Torpedo (PT) boats of World War II. Originally armed with four machine guns and four torpedoes, they were pound-for-pound the Navy's most heavily armed vessels. The 80-foot wooden warships served throughout the Pacific, in the Mediterranean, and even in the English Channel. Their most famous actions include evacuating General Douglas MacArthur from the Philippines and the exploits of John F. Kennedy, whose PT109 was cut in half by a Japanese destroyer. By the end of World War II, the PT boats had racked up an impressive list of accomplishments [1].

Why bring up PT boats in CrossTalk? Sure, the editors like stories framed with little-known bits of military history, but they're an excellent analogy for this issue's Small Projects, Big Issues theme.

At the war's inception, the PT boat retained the basic mission of all battleship-age torpedo boats: Use stealth and speed to sink a capital ship using torpedoes. Initially, the enclosed waters of the Philippine and Solomon Islands provided opportunities to continue their historical mission. But opportunities for surface combat would soon wane. Aircraft and radar made it much more difficult for the PTs to get close enough to use their powerful stings.

Despite this, the PTs found themselves even more in demand. Scouting, interdicting enemy barge traffic, performing reconnaissance, rescuing downed flyers, giving close shore convoy support, gathering intelligence, supporting ground operations, and many other missions came the PTs' way. They soon bristled with new weapons: auto-cannon, mortars, and rockets. Some PTs abandoned their torpedoes entirely for more guns! Occasionally, though, they were still able to slam a torpedo into a major warship, as they did during the Battle of Leyte Gulf – the largest naval battle in modern history [2].

Small projects can learn important lessons from the small boats. First, the PTs were able to change their basic mission of hunting capital surface ships by adapting to fill an important niche in overall naval strategy. Originally designed for a role in the big-ship navy, the PT boats found that circumstances dictated a very different one. They adapted well to their new role due in large part to the compactness of the boat and its crew. All teams must be able to adapt, but change comes more easily to smaller entities. This built-in flexibility allows small teams to operate with less stringent operating processes, often much less.

More importantly, consider how a PT boat differed from other U.S. Navy warships. Table 1 compares a PT boat to a heavy cruiser. Both are warships, but the differences are striking. Crew training cannot be over-emphasized. Losing even one man on a small crew could be devastating if another sailor couldn't take over. So every PT sailor had to be able to do almost any job on the boat, just like a small-project software developer needs to be able to do any job on a project: requirements, design, coding, information and technology, documentation, configuration management, and even leadership.

For leadership, consider the PT boat skipper. He knows each sailor in his crew personally. His chain of command: one executive officer. Before a mission, he can muster the entire crew for a direct, verbal briefing. He knows the complete capabilities of the boat: weapons, engines, communications, and performance. From the cockpit, he can issue orders to any member of the crew verbally or via hand signal. He's confident his cross-trained crew can step in should a shipmate be disabled. The short, focused PT boat missions relieve him of the more mundane aspects of captaincy. In battle, he operates alone or with other PT boats, all with the same modus operandi. Naval doctrine of the day was developed for the larger ships, but the realities of the PT's missions made it clear that they needed different methods, different tactics – different processes – to achieve results. Aided by their compact nature, the PT crews readily established their own successful processes.

If your project calls for a PT boat, put one in place and run it accordingly. Using processes and procedures established for the normal behavior expected of large-scale projects can easily be counterproductive – or worse. Don't operate a PT boat like a heavy cruiser or expect it to act like one. If you do, you'll be sunk.

—Dan Knauer
TLA
A Three-Letter Acronym Corporation

References
1. Keating, Bern. "The Mosquito Fleet." Scholastic Book Services, USA: Dec. 1971.
2. "Battle of Leyte Gulf." Wikipedia <http://en.wikipedia.org/wiki/Battle_of_Leyte_Gulf>.

Additional Reading
The Internet provides several excellent sources of information on the PT boats, including:
1. The Historical Naval Ships Organization <www.hnsa.org>.
2. PT Boats Inc. <www.ptboats.org>.
3. John Drain's PT Boat Site <www.pt-boat.com>.


Aspect                        PT Boat                                                  Heavy Cruiser
Composition                   Wood                                                     Steel
Engines                       Three Packard V-12 motors                                Boiler-driven turbines
Fuel                          Aviation gasoline                                        Oil
Armament ratio                One weapon per man                                       One weapon per 20 men
Transport to operating area   Carried aboard ship                                      Arrived under own power
Mission duration              Nightly patrols returning to same base                   Multi-week cruises returning to varying ports
Crew training                 Cross-trained in two disciplines; familiar with all      Single assignment plus battle station
                              tasks on boat

Table 1: PT Boat Versus Navy Warship

