TRANSCRIPT
So You Are Conducting a Technology Readiness Assessment? What to Know
2 August 2017
Prof. Jeff Craver
Agenda items
• Challenges
• Statutory Requirement
• MDAPs
• TMRR Phase
• Development RFP Release Decision Point (DRFPRDP)
• Independent Review Panel (IRP)
• Technology Readiness Assessment and TRLs
• Critical Technologies (CT) – Selection and Assessment
• Demonstrated in a Relevant Environment
• Approval by the Component Acquisition Executive (CAE). Reviewed by ASD(R&E) and DASD (DT&E)
GAO: Reasons Programs Fail (or Experience Significant Cost and Schedule Impacts)
1. Poor requirements management
2. Immature Technologies
Higher total MDAP RDT&E funding growth since original baselines
Performance of the Defense Acquisition System: 2016 Annual Report, U.S. Department of Defense
WHY?
Program Development with Immature Technologies
Of the remaining 36 current programs, only 4 reported that all their critical technologies were matured to best practice standards when they began development. Another 11 programs reported having all critical technologies nearing maturity prior to system development. The remaining 21 programs reported having one or more immature technologies at the start of development, a critical point in the acquisition process.
GAO Technology Readiness Assessment Guide
Example: Gerald R. Ford Class Nuclear Aircraft Carrier (CVN 78)
Maintaining design stability depends on currently immature technologies fitting within the space, weight, cooling, and power reservations allotted them within the ship. Construction to date has been impeded by critical technology system delays, material shortages, and engineering challenges. Costs have grown by over 22 percent.
Source: GAO, Defense Acquisitions: Assessments of Selected Weapon Programs, March 2014
GAO Technology Readiness Assessment Guide
Technology Maturation as a Cost Driver
Product development and associated technologies | TRL at program initiation | Cost growth | Schedule delay
Comanche helicopter: engine, rotor, FLIR, helmet-mounted display, avionics | 5, 5, 3, 3, 3 | 101 percent | 120 percent
Ford Jaguar automobile: adaptive cruise control, voice-activated controls | 8, 8 | None | None
GAO Technology Readiness Assessment Guide
DODI 5000.02 Jan 7, 2015 as amended Feb 2, 2017
TRA shall be conducted:
• On Critical Technologies (CT) demonstrated in a relevant
environment
• In the TMRR Phase to reduce technical risks associated
with CTs*
• As a statutory requirement on all MDAPs prior to the
DRFPRDP
• By an independent panel appointed by the PM, with the assessment reviewed by ASD(R&E) and DASD (DT&E)
* Prior to MS C if the program enters the lifecycle at EMD
Source: DODI 5000.02, Enclosure 1 (Acquisition Program Categories and Compliance Requirements), Table 2, Milestone and Phase Information Requirements, Change 2, 02/02/2017
DODI 5000.02, January 7, 2015, Incorporating Change 2, Effective February 2, 2017
5. PROCEDURES
d. Acquisition Process Decision Points and Phase Content
(4) TMRR Phase
(b) Phase Description
3. There are a number of ways to structure this phase which should be tailored to reduce the specific risks associated with the product being acquired. Technology Readiness Levels, described in the Technology Readiness Assessment (TRA) Guidance (Reference (f)), should be used to benchmark technology risk during this phase; however, these indices are rough benchmarks, and not conclusive about the degree of risk mitigation needed prior to development. Deeper analysis of the actual risks associated with the preferred design and any recommended risk mitigation must be conducted and provided to the MDA.
REFERENCES
(f) Assistant Secretary of Defense for Research and Engineering Guide, "Technology Readiness Assessment (TRA) Guidance," April 2011, as amended (https://acc.dau.mil/CommunityBrowser.aspx?id=18545)
Everywhere TRA appears in DODI 5000.02 Jan 7, 2015 as amended Feb 2, 2017
MDAPs and non-MDAPs Decision Authority
• MDAP – ACAT I program or as determined by the MDA
– ACAT ID – DAE or as delegated
– ACAT IC – Head of the DoD Component or CAE
– ACAT IAM – DAE
– ACAT IAC – Head of the DoD Component or CAE
• MDAs for non-ACAT I programs should consider requiring TRAs for those programs when technological risk is present.
– ACAT II – CAE
– ACAT III – Designated by the CAE
– ACAT IV – Navy and Marine Corps only
DODI 5000.02, Enclosure 1
TRAs for the ASD(R&E) are not required
for Major Automated Information System
(MAIS) programs, non-MDAPs, or MDAP
Milestone C (MS C) decisions, except for
MDAPs entering the acquisition system
at MS C.
TMRR Phase
PURPOSE: To reduce technology, engineering, integration,
and life cycle cost risk to the point that a decision to contract
for EMD can be made with confidence in successful
program execution for development, production, and
sustainment.
DODI 5000.02 Jan 7, 2015 as amended Feb 2, 2017
Development RFP Release Decision
The Development RFP Release Decision is extremely important.
− The program will either successfully lead to a fielded capability or fail, based on the soundness of the capability requirements, the affordability of the program, and the executability of the acquisition strategy.
− Authorizes the release of RFPs for EMD and often for Low-Rate Initial Production (LRIP) options. (For MDAPs and major systems, the MDA will determine the preliminary LRIP quantity at this decision point.)
− Last point at which significant changes (e.g., to requirements) can be made without a major disruption.
− Timing of the PDR relative to the Development RFP Release Decision is at Component discretion.
Source: Figure 7, DoDI 5000.02, Change 2 of Feb 2, 2017
Phased Acquisition Cycle with Decision Points
[Figure: phased acquisition cycle with decision points. Annotations: "Technology Maturity Assessment (Knowledge Building TRAs)"; "For programs that do not have a MS B"]
GAO Technology Readiness Assessment Guide
Independent Review Panel (IRP)
Panel Membership Dependent on Technologies Involved
• Independent and therefore not a member of the program IPT or any specialized interest group
• Recognized expert with proven experience (PM concurrence)
• Senior SME with authority to speak for the Department/Command concerning the respective technology
• No personal stake in program success or failure
• Has current and appropriate clearance level
• Will proactively work to de-conflict schedules as program plans change
– Maintain a consistent IRP throughout the duration of the TRA
Copeland NAVAIR TRA_TMA Process Training Brief Apr 2016
Technology Readiness Assessment (TRA): A systematic metrics-based
process that assesses the maturity of, and the risk associated with,
critical technologies.
• Preliminary assessment required for Major Defense Acquisition
Programs (MDAPs) in support of Development RFP Release
Decision Point.
Technology Readiness Levels (TRLs):
• Use: Adopted by DoD as a method of estimating technology maturity during
the acquisition process. The TRL scale is measured from 1 to 9 with 1
being the least mature technology and 9 being the most mature.
• Caution: The TRL scale is a management tool. A stated TRL does not by itself mean that the rating is accurate, or even that the technology has gone through any official measurement process. (The nine-level scale is sketched below.)
Technology Readiness Assessment
For more information see DAU
Continuous Learning Module CLE 021,
Technology Readiness Assessments
DODI 5000.02, Table 2
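To make the scale concrete, here is a minimal illustrative sketch in Python. The lookup table and helper function are assumptions for this brief, not an official DoD tool; the level descriptions paraphrase the standard DoD TRL definitions.

```python
# Illustrative only: the nine-level DoD TRL scale as a lookup table.
# Descriptions paraphrase the standard DoD definitions.
TRL_DEFINITIONS = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental critical function proof of concept",
    4: "Component and/or breadboard validation in a laboratory environment",
    5: "Component and/or breadboard validation in a relevant environment",
    6: "System/subsystem model or prototype demonstration in a relevant environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

def describe_trl(level: int) -> str:
    """Return the definition for a TRL, enforcing the 1-9 range."""
    if level not in TRL_DEFINITIONS:
        raise ValueError(f"TRL must be an integer from 1 to 9, got {level}")
    return TRL_DEFINITIONS[level]

# Example: the MS B benchmark discussed throughout this brief.
print(describe_trl(6))  # System/subsystem ... demonstration in a relevant environment
```

As the caution above notes, a number from this table is shorthand only; it carries no evidence about how the rating was reached.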
What is a TRA?
• A TRA is an evaluation of the maturity of critical elements of a product’s technologies, often called
critical technologies. It is a normal outgrowth of the system engineering process and relies on data
generated during the course of technology or system development.
• The TRA frequently uses a maturity scale—technology readiness levels (TRLs)—that are ordered
according to the characteristics of the demonstration or testing environment under which a given
technology was tested at defined points in time.
• Technology Readiness Levels (TRLs) can serve as a helpful knowledge-based standard
and shorthand for evaluating technology maturity, but they must be supplemented with
expert professional judgment. (Technology Readiness Assessment (TRA) Guidance April 2011)
GAO Technology Readiness Assessment Guide
Why Are TRA’s Important?
• High quality evidence-based TRAs provide managers and governance bodies with important
information for making technical and resource allocation decisions on whether a technology or
system is sufficiently mature to move past a decision point to the next acquisition phase, needs
additional work, or should be discontinued or reconsidered in favor of more promising technology.
• The TRA results—in the form of a TRA report—also serve as input to other program management
decisions to estimate cost, schedule, and risk. Importantly, TRAs provide a common language and
framework or reference point to facilitate dialogue supported by well-defined metrics and methods
across organizational disciplines, departments, and business functions.
GAO Technology Readiness Assessment Guide
Technology Readiness Assessments (TRA) vs
Technology Maturity Assessment (TMA)
• TRA is a formal requirement for MS B, specific to Critical Technologies, conducted by an independent review team outside the influence of the PM
• TMA is a periodic assessment of the ongoing maturity of the technology, conducted by and at the discretion of the PM, in preparation for an upcoming TRA and as knowledge points across the program management and risk management spectrum
A technology is considered "critical" if it poses a significant risk to the success of the program, especially relating to KPPs and KSAs.
Copeland NAVAIR TRA_TMA Process Training Brief Apr 2016
New and Novel?
GAO Technology Readiness Assessment Guide 2016
The Guide has two purposes:
(1) describe generally accepted best practices for conducting effective evaluations of technology developed for systems or acquisition programs, and
(2) provide program managers, technology developers, and governance bodies with the tools they need to more effectively mature technology, determine its readiness, and manage and mitigate risk.
Four Characteristics of a High Quality TRA
• Credible - Assessment design, execution, and reporting activity reflects
understanding of requirements, critical technologies, relevant or
operational environments; assessment team has right knowledge and
expertise
• Objective - Assessment is based on objective, relevant and trustworthy
data, analysis, and information; free from internal and external
organizational bias or influence
• Reliable - Uses disciplined processes that facilitate repeatability,
consistency, and regularity
• Useful – Stakeholders understand information; it has sufficient detail
and is timely and can be acted upon
GAO Technology Readiness Assessment Guide
Six Steps to Develop a High Quality TRA
1. Design TRA Strategy
2. Define Purpose, Develop Plan, and Assemble Team
3. Select Critical Technologies
4. Evaluate Critical Technologies
5. Prepare and Submit the TRA Report
6. Use TRA Results and Develop a Technology Maturation Plan
(A simple illustrative encoding of these steps appears after this slide.)
MDAPs - The plan for conducting a TRA is
provided to the ASD(R&E) by the PM upon
approval by the Component Acquisition
Executive (CAE). DODI 5000.02
GAO Technology Readiness Assessment Guide
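Purely as an illustrative aid (the enum and helper below are assumed names, not part of the GAO guide), the six steps can be encoded as an ordered pipeline so a team can track where an assessment stands:

```python
from enum import IntEnum

class TRAStep(IntEnum):
    """GAO's six steps for a high quality TRA, in order (illustrative encoding)."""
    DESIGN_TRA_STRATEGY = 1
    DEFINE_PURPOSE_PLAN_AND_TEAM = 2
    SELECT_CRITICAL_TECHNOLOGIES = 3
    EVALUATE_CRITICAL_TECHNOLOGIES = 4
    PREPARE_AND_SUBMIT_REPORT = 5
    USE_RESULTS_AND_DEVELOP_TMP = 6

def next_step(current: TRAStep) -> "TRAStep | None":
    """Return the step that follows, or None after the maturation-plan step."""
    if current == TRAStep.USE_RESULTS_AND_DEVELOP_TMP:
        return None
    return TRAStep(current + 1)

# Example: after selecting CTs, the next step is to evaluate them.
print(next_step(TRAStep.SELECT_CRITICAL_TECHNOLOGIES).name)
# EVALUATE_CRITICAL_TECHNOLOGIES
```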
Best Practice - Selecting Critical Technologies
• Critical technologies should be rigorously and objectively identified and documented to ensure
the evaluation is objective and reliable, and the information is useful.
• There are 4 steps that should help organizations ensure that the process for selecting critical technologies is reliable.
Step 3: Select Critical Technologies
GAO Technology Readiness Assessment Guide
BEST PRACTICE CHECKLIST: SELECTING CRITICAL TECHNOLOGIES
• A rigorous, objective, reliable, and documented approach, based on the WBS or other key program documents, was used to initially identify critical technology candidates.
• The intended operational environment was considered, including potential adverse interactions with systems with which the technology being developed must interface.
• A relevant environment was derived for each critical technology from those aspects of the operational environment that are determined to be a risk for the successful operation of that technology.
• Critical technologies were initially selected following a reliable process that is disciplined and repeatable with defined criteria, using increasingly platform- or program-specific questions and requirements.
• Critical technologies were defined at a level that is testable, which could include the software needed to demonstrate their functionality.
• The assessment team documented the reasons why technologies were selected as critical, including reasons why other technologies were not selected.
• The number of critical technologies chosen for assessment was not arbitrary but was based on solid analysis using the WBS, process flows, or other technical documentation.
• When significant program changes occurred, critical technologies were reassessed, possibly causing some to be added or removed from the list of critical technologies.
• Subject matter experts with appropriate and diverse knowledge selected and reviewed the critical technologies.
GAO Technology Readiness Assessment Guide
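As a hedged sketch (the item wording is condensed by me, and the function is an assumption, not from the GAO guide), the checklist can be held as data so any unmet best practice is surfaced explicitly:

```python
# Illustrative only: GAO's CT-selection best practices, condensed to short labels.
SELECTION_CHECKLIST = [
    "WBS-based, documented identification of CT candidates",
    "Intended operational environment and interfaces considered",
    "Relevant environment derived for each CT from at-risk aspects",
    "Disciplined, repeatable selection process with defined criteria",
    "CTs defined at a testable level (including software)",
    "Selection and non-selection rationale documented",
    "Number of CTs grounded in WBS/process-flow analysis",
    "CTs reassessed after significant program changes",
    "Diverse SMEs selected and reviewed the CTs",
]

def unmet_items(review: dict) -> list:
    """Return checklist items not affirmatively satisfied in a review."""
    return [item for item in SELECTION_CHECKLIST if not review.get(item, False)]

# Example: a review that never documented why technologies were excluded.
review = {item: True for item in SELECTION_CHECKLIST}
review["Selection and non-selection rationale documented"] = False
print(unmet_items(review))  # flags the missing documentation
```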
Critical Technology (CT)
• The adjective "critical" has several applications and contexts when referencing technology:
– Mission Critical Technology List
– Critical Program Information
– Critical for Mission Success
– Critical Safety Items
• The term Critical Technology is used to uniquely identify immature technologies that introduce significant risks to the program.
A technology is considered "critical" if:
(1) the system being acquired depends on this technology element to meet operational requirements (within cost and schedule limits), and
(2) the technology element or its application is either new or novel or in an area that poses major technological risk during detailed design or demonstration (note: this second criterion is no longer in the definition).
Copeland NAVAIR TRA_TMA Process Training Brief Apr 2016
Criteria for Determining CT
• Impact to program success
– Has the technology been modified?
– Has the technology been repackaged such that a new and more stressful relevant environment is realized?
– Is the technology expected to operate in an environment and/or achieve a performance expectation beyond its original design intention or demonstrated capability?
Example set of criteria – the list is to be developed and agreed to by the MDA, PM, and Independent Review Panel. A code sketch of this screening follows the citation below.
Copeland NAVAIR TRA_TMA Process Training Brief Apr 2016
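Here is a minimal sketch of that screening, assuming the two-part definition above plus the example questions; the Technology class and is_critical function are hypothetical names used only for illustration.

```python
from dataclasses import dataclass

@dataclass
class Technology:
    """A candidate technology element (illustrative fields only)."""
    name: str
    needed_to_meet_requirements: bool    # definition part (1): the system depends on it
    modified: bool                       # screening: has the technology been modified?
    repackaged_more_stressful_env: bool  # screening: repackaged into a harsher relevant environment?
    beyond_design_intent: bool           # screening: used beyond original design intent or demonstrated capability?

def is_critical(tech: Technology) -> bool:
    """Flag a CT candidate: the system must depend on the technology,
    and at least one screening question must indicate new technical risk."""
    new_risk = (tech.modified
                or tech.repackaged_more_stressful_env
                or tech.beyond_design_intent)
    return tech.needed_to_meet_requirements and new_risk

# Example: a proven sensor repackaged for a more stressful environment.
sensor = Technology("EO/IR sensor", True, False, True, False)
print(is_critical(sensor))  # True -> candidate for the IRP's assessment list
```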
Best Practice - Selecting CTs
Challenges in Selecting Critical Technologies
• Program officials sometimes disregard critical technologies when they have
longstanding history, knowledge, or familiarity with them. This is problematic when these
technology elements are reapplied to a different program or operational environment,
particularly when being used in a novel way.
• While the process to collect evidence for identifying critical technologies can be straightforward, the determination of what constitutes a critical technology is highly subjective, requiring knowledge, experience, and due professional care. For example, judgments need to be made about what a technology is, what makes a technology critical, and at what level. Correctly identifying and selecting critical technologies can prevent wasting valuable resources—funds, capital acquisitions, and schedule—later in the acquisition program.
GAO Technology Readiness Assessment Guide
Best Practice - Evaluating Critical Technologies
• There are 4 steps that can help organizations ensure that an evaluation is objective and
reliable by applying a disciplined and repeatable process. These steps can be tailored to
accommodate organizational structures, processes, and policies.
GAO Technology Readiness Assessment Guide
CT Assessment Criteria (Readiness)
• Whether technologies have been demonstrated in a relevant environment and whether risk has been reduced, or can be reduced, to an acceptable level for inclusion in an EMD program
Is that it?
• DAU S&T CoP – Decision Point Quicklook Assessment Instruments – tailored for a TRA by the IRP/PM
https://www.dau.mil/cop/stm/Pages/Topics/Best%20Practices%20Lessons%20Learned%20and%20Tools.aspx
"If someone told you this technology was TRL 6, would that be enough to convince you that the risk was adequately mitigated? I hope not."
– Frank Kendall, "The Trouble with TRLs," Defense AT&L, September–October 2013
Limitations of TRA and TRLs
GAO Technology Readiness Assessment Guide
Demonstrated in a Relevant Environment
• Adequate demonstration in a relevant environment (TRL 6) is one benchmark that is evaluated, but it is not the only consideration, nor necessarily dispositive
• Demonstration: An element of verification that involves the actual operation of an item to provide evidence that the required functions were accomplished under specific scenarios. The items may be instrumented and performance monitored. (MIL-STD-961E)
• Relevant Environment – Defined by whom? Customer, user, requirements community, DOT&E?
– Before the assessment process begins, the SME team must ensure a sufficient understanding of the requirements, identified capabilities, system and software architectures, CONOPS, and/or the concept of employment to define the relevant environments. The SME team must also ensure that its understanding of design details is sufficient to evaluate how the technologies will function and interface. (DOD Technology Readiness Assessment (TRA) Guidance, April 2011)
Required by WSARA (2009)
Reviewed by ASD(R&E) and DASD (DT&E)
• The plan for conducting a TRA is provided to the ASD(R&E) (FY 18 –
TBD) by the PM upon approval by the Component Acquisition Executive
(CAE).
• A TRA is required by Department of Defense Instruction (DoDI) 5000.02
for MDAPs at MS B (or at a subsequent Milestone if there is no MS B).
It is also conducted whenever otherwise required by the MDA.
• TRAs may have to be performed on all the competitors’ proposals in a
competitive source selection.
DOD Technology Readiness Assessment (TRA) Guidance April 2011
Conclusions
• Required for MDAPs (ACAT I or as designated) –
recommended for ACAT II – III
• TRA should be a tailored process using best practices
• TMAs should be conducted regularly in preparation for the TRA
• Relevant environments must be defined by the customer for a TRL 6 rating to be valid
• Independent review team members could come from almost anywhere, provided they are not under the influence of the PM
References
• DODI 5000.02, Jan 7, 2015, as amended Feb 2, 2017
• DOD Technology Readiness Assessment (TRA) Guidance, April 2011
• GAO Best Practices: Technology Readiness Assessment Guide
• Copeland, NAVAIR TRA_TMA Process Training Brief, Apr 2016
• MIL-STD-961E
• Performance of the Defense Acquisition System: 2016 Annual Report, U.S. Department of Defense
• Defense Acquisitions: Assessments of Selected Weapon Programs, March 2014
GAO Points of Contact
• Mr. John Ortiz – Project Manager, IT – [email protected]
• GAO Products – www.gao.gov