ADDP 00.4

EXECUTIVE SERIES

OPERATIONAL EVALUATION

Australian Defence Doctrine Publication 00.4—Operational Evaluation is issued for use by the Australian Defence Force and is effective forthwith.

A.G. HOUSTON, AO, AFC
Air Chief Marshal
Chief of the Defence Force

Australian Defence Headquarters
Canberra ACT 2600

01 August 2007



© Commonwealth of Australia 2007

This work is copyright. Apart from any use as permitted under the Copyright Act 1968, no part may be reproduced by any process without prior written permission from the Australian Government Department of Defence.

Announcement statement—may be announced to the public.

Secondary release—may be released to the public.

All Defence information, whether classified or not, is protected from unauthorised disclosure under the Crimes Act 1914. Defence information may only be released in accordance with the Defence Security Manual and/or Defence Instruction (General) OPS 13–4—Release of Classified Defence Information to Other Countries, as appropriate.

ADDP 00.4
First edition 2007

Sponsor Vice Chief of the Defence Force

Russell Offices
CANBERRA ACT 2600

Developer Commandant
Australian Defence Force Warfare Centre
WILLIAMTOWN NSW 2314

Publisher Director Defence Publishing Service
Department of Defence
CANBERRA ACT 2600

Defence Publishing Service
DPS: October 2007


FOREWORD

1. Australian Defence Doctrine Publications (ADDP) and Australian Defence Force Publications (ADFP) are authorised joint doctrine for the guidance of Australian Defence Force (ADF) operations. ADDP are pitched at the philosophical and high-application level, and ADFP at the application and procedural level. Policy, as represented by Defence Instructions, is prescriptive and has legal standing. Doctrine is not policy and does not have legal standing; however, it provides authoritative and proven guidance, which can be adapted to suit each unique situation.

2. ADDP–D—Foundations of Australian Military Doctrine identifies that a knowledge edge is one of the ADF’s key warfighting concepts. ADDP–D also states that the ADF needs to be able to fight smart to compensate for its small size. The three elements of the knowledge edge—information, knowledge and decision making—contribute to the development of a qualitative edge. One of the key elements that underpins the development of a qualitative edge is operational evaluation (OE). OE is an essential means by which the ADF can efficiently, effectively and economically transform what is known, in the form of information and intellectual assets, into enduring value, producing returns for the overall enhancement of capability and preparedness.

3. The effectiveness of OE as a change management and improvement tool is dependent on the transfer of knowledge and experience gained from the full range of Defence activities, including operations, international engagement activities, collective training, individual training, trials and experiments. Preparation for all ADF activities should include the systematic review of previously recorded lessons to ensure that knowledge and experience are applied, and the inefficiency of repeating past mistakes is avoided.

4. The aim of this publication is to describe the nature and scope of OE of ADF activities and how OE outcomes can be used to enhance capability and preparedness. This publication provides a philosophical level reference and framework for the conduct of OE and is written for use by commanders, joint planning staffs, OE staffs and all personnel involved in the conduct of OE. The publication is applicable to all levels of command.


AMENDMENTS

Proposals for amendment of ADDP 00.4 may be initiated in either of the following ways:

• By Minute to:

Director Doctrine and Training
Australian Defence Force Warfare Centre
RAAF Base
WILLIAMTOWN NSW 2314

• By directly entering comment into the Joint Doctrine Development Environment (JDDE) found on the Australian Defence Force Warfare Centre (ADFWC) Defence Restricted Network (DRN) website (see http://intranet.defence.gov.au/VCDFweb/sites/adfwc/). Select JDDE on the ADFWC homepage and open either the ADDP or ADFP block as required. Open the relevant publication and utilise the ‘Add Comment’ function button of the summary page for each publication.

Note

The second option is provided to encourage feedback from the wider ADF, as well as use of the JDDE in general.

DOCTRINE PUBLICATION HIERARCHY

The hierarchy of ADDP and ADFP and the latest electronic version of all ADDP and ADFP are available on JDDE found on the ADFWC DRN website located at http://intranet.defence.gov.au/VCDFweb/sites/adfwc/.

This publication is current as at August 2007.

This publication will be periodically reviewed and amended. The latest version of this publication is available on the ADFWC DRN website http://intranet.defence.gov.au/VCDFweb/sites/adfwc/.


CONTENTS

Foreword
Amendments
Contents
List of Figures

Paragraph

CHAPTER 1 INTRODUCTION TO OPERATIONAL EVALUATION

Introduction 1.1
Background 1.2
Evaluation theory and practice 1.5
Description of operational evaluation 1.10
Requirement for operational evaluation 1.18
Operational evaluation knowledge 1.21
Capability and preparedness enhancement 1.26
Operational evaluation environment 1.34
Related activities 1.38

Annex:
A. Evaluation forms and approaches

CHAPTER 2 RESPONSIBILITIES

Introduction 2.1
Responsibilities 2.4
Operational evaluation organisations and tools 2.11

CHAPTER 3 OPERATIONAL EVALUATION FRAMEWORK

Introduction 3.1
Standards 3.2
Operational evaluation model 3.8
Leveraging knowledge capital 3.17
Operational evaluation culture 3.18

CHAPTER 4 IDENTIFYING THE REQUIREMENT

Introduction 4.1
Requirements 4.2
Initiating operational evaluation requirements 4.8

Annex:
A. Operational evaluation initiating instruction—outline format


CHAPTER 5 PLANNING

Introduction 5.1
Planning 5.4
Planning process 5.6
Collection plan 5.10

Annexes:
A. Operational evaluation planning check list
B. Operational evaluation plan outline

CHAPTER 6 CONDUCT

Introduction 6.1
Operational evaluation team 6.3
Conduct activities 6.6
Operational evaluation briefing 6.8
Data collection 6.9
Preliminary analysis and reports 6.11
Out-briefing 6.12
Detailed analysis 6.13
Final reporting 6.17

CHAPTER 7 IMPLEMENTATION, MONITORING AND REVIEWING

Introduction 7.1
Temporal aspects 7.5
Implementation 7.6
Monitoring 7.8
Reviewing 7.10

CHAPTER 8 OPERATIONAL EVALUATION KNOWLEDGE DOMAIN

Introduction 8.1
Building knowledge 8.4
Operational evaluation knowledge components 8.5
Characteristics 8.6

CHAPTER 9 META–EVALUATION

Introduction 9.1
Forms and purpose 9.2

Annex:
A. Meta–evaluation guide


CHAPTER 10 AUSTRALIAN DEFENCE FORCE OPERATIONAL EVALUATION SYSTEM

Introduction 10.1
System description 10.2
Australian Defence Force Activity Analysis Database System overview 10.5
Australian Defence Force Activity Analysis Database System management and training 10.9

Annex:
A. Australian Defence Force Activity Analysis Database System user guide

Glossary

Acronyms and Abbreviations


LIST OF FIGURES

Figure Title

3–1 Operational evaluation model

4–1 Operational evaluation model—identifying the requirement

5–1 Operational evaluation model—planning
5–2 The operational evaluation planning process—joint military appreciation process modified

6–1 Operational evaluation model—conduct

7–1 Operational evaluation model—implement, monitor and review

7–2 Monitoring operational evaluation—two stages


ADDP 00.4 Chapter 1

CHAPTER 1

INTRODUCTION TO OPERATIONAL EVALUATION

Introduction 1.1

1.1 The purpose of this chapter is to introduce and describe OE in the ADF. OE is an integral element of the ADF’s capability and preparedness enhancement systems and processes, and is one of the key levers by which the ADF achieves a knowledge edge.1 OE is applicable to operations, sustainment and all activities that prepare the ADF to undertake operations.

Background 1.2

1.2 Learning from the successes and mistakes of the past (your own and of others) to improve performance next time is an intuitive human activity. This form of learning is the basic motivation for evaluation. As individuals, we evaluate, both formally and informally, a range of matters that occur in our everyday lives as part of our decision making. In doing so we intuitively follow basic steps that involve assembling information, identifying our criteria for making assessments, measuring performance against our criteria, making judgements and taking action.

Executive summary

• Operational evaluation (OE) involves the conduct of assessments to identify lessons, gather and use knowledge and take actions to enhance capability and preparedness.

• OE enables individuals, groups and the Australian Defence Force (ADF) as a whole to learn from experience in a systematic manner.

• Preparedness management systems and capability development processes are dependent on OE for contemporary information on the performance of the force in being (FIB).

• The attainment of a knowledge edge is a significant outcome of a robust OE regime.

• OE is an integral component of the planning and conduct of all ADF activities.

1 Gaining and exploiting a knowledge edge is one of the key concepts influencing the way the ADF conducts operations and campaigns. Knowledge edge is discussed in Australian Defence Doctrine Publication (ADDP)–D—Foundations of Australian Military Doctrine, chapter 6.


1.3 Not surprisingly, these basic steps are reflected in the more formal evaluation theories and practices applied to a range of civilian and military circumstances. Evaluation is a widely used management tool that enables managers to find out how their organisations are performing, whether objectives are being met and business plans are being achieved, and how improvements might be made. In the military context, evaluation is much more than a management tool, as it assists commanders to:

• mitigate risks associated with military activities that may have lethal consequences;

• optimise the performance of valuable and scarce resources (through the adoption of superior tactics, techniques and procedures);

• maintain an intellectual advantage over adversaries;

• make better decisions faster;

• develop superior military capabilities; and

• optimise the preparedness of the FIB.

1.4 Evaluation has become a well established discipline that has been described internationally with sets of standards and principles. There is a significant body of academic work that has examined various approaches to evaluation, methods and techniques. This academic work assists our understanding of evaluation, but does not limit the ADF in terms of the approaches, methods and techniques that it uses. The ADF selects the appropriate approach, method and technique to suit the military circumstances.

LESSON FROM ADF OPERATIONS IN THE MIDDLE EAST IN 2003

Training. Superior training and skilful use of modern weapons were essential components in the ADF’s successful engagement of Iraqi forces.

Defence will continue to invest in best practice training, and in equipping soldiers with the weapons and resources required for success.


Evaluation theory and practice 1.5

1.5 OE in the ADF is based on and adapted from civilian evaluation theory and practice. Common objects for evaluation include policies, programs, products, and individuals.

1.6 Logic of evaluation. Evaluation, a process of making a judgement about the value or worth of something, is underpinned by a logic that is described as follows:

• Establishing the criteria of worth—what tasks must the subject of evaluation2 do well?

• Constructing standards—how well should the subject perform?

• Measuring performance and comparing standards—how well did the subject perform?

• Synthesising and integrating evidence into a judgement of merit or worth—what is the worth of the subject?

• Applying the knowledge gained—what actions should the subject and other like subjects take, and what other relevance and use does the knowledge gained have?
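The steps above can be sketched, purely as an illustrative aid and not as doctrine, as a minimal routine that measures a subject's performance against constructed standards and synthesises the evidence into a judgement. All names and figures here are hypothetical.

```python
# Illustrative sketch of the logic of evaluation (hypothetical, not doctrinal).
# Standards capture how well the subject of evaluation should perform each
# task; measurements capture how well it actually performed.

def evaluate(standards: dict[str, float], measurements: dict[str, float]) -> dict:
    """Compare measured performance with standards and synthesise a judgement."""
    findings = {}
    for task, standard in standards.items():
        measured = measurements.get(task, 0.0)
        findings[task] = {
            "standard": standard,
            "measured": measured,
            "met": measured >= standard,   # did performance meet the standard?
        }
    # Synthesise and integrate the evidence into a judgement of merit or worth.
    met = sum(1 for f in findings.values() if f["met"])
    judgement = "effective" if met == len(findings) else "needs improvement"
    return {"findings": findings, "judgement": judgement}

result = evaluate(
    standards={"communications": 0.8, "logistics": 0.7},
    measurements={"communications": 0.9, "logistics": 0.6},
)
print(result["judgement"])  # logistics falls short, so: needs improvement
```

The final step of the logic, applying the knowledge gained, sits outside such a routine: it is the action taken on the judgement, not the judgement itself.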

1.7 Evaluation activity based on making judgements of worth is a subset of a broader endeavour, which is evaluation as the production of knowledge based on systematic enquiry to assist decision making. This explanation recognises that evaluation involves more than designing and conducting empirical studies. The work of ADF OE evaluators extends to additional tasks, including:

• review and analysis of previous findings;

• negotiating an evaluation plan with stakeholders, and identifying key issues for examination;

• implementing an evaluation plan including collecting and analysing evidence to produce findings;

• disseminating findings to enhance understanding of the subject or allow judgements or decisions to be made and implemented; and

• assisting with the implementation of findings.

2 The subject of evaluation may be processes, policies, equipment, employment groups and so on.


1.8 Forms and approaches. Historically, evaluation models took as their point of entry the terms formative evaluation (evaluation of a program in progress, where the focus is on process and outputs) and summative evaluation (evaluation conducted after the event, where the focus is on outcomes). Whilst these terms make a useful distinction in terms of the timing of an evaluation, they fail to adequately define the forms and approaches that might be used and, more importantly, the reason for conducting evaluation. More recent evaluation models recognise that there are many approaches, methods and techniques available to evaluators in a range of combinations. These combinations have been distilled into five evaluation forms linked to major approaches that are consistent with each form. See annex A for a summary. These forms are as follows:

• Proactive. Proactive evaluation takes place before a program or activity has been designed. This form places evaluators in the position of adviser, conducting research such as a needs assessment, review of best practice or review of prior experience, to provide commanders with advice on options.

• Clarificative. Clarification evaluation reviews the logic of a program that has commenced. This form looks at the intended outcomes and identifies elements that need to be modified to achieve those outcomes.

• Interactive. In interactive evaluation the emphasis is on evaluators working with those performing the task to review and improve aspects such as management arrangements, processes and procedures. This form is conducted when there is a clear need for the commander and staff to be in control of the evaluation.

• Monitoring. This form of evaluation monitors the performance of an established program to determine if it is on track, and may involve regular monitoring of the program and the use of tools such as performance indicators.

• Impact. Impact evaluation assesses the effectiveness of a program that has concluded or one that is ongoing and has reached a significant milestone or the end of a cycle. This form may focus on outcomes and/or review the arrangements for establishing the program.

1.9 These forms and the associated approaches are all applicable to OE. Identifying the appropriate forms and approaches when initiating and planning an OE task helps initiating authorities and evaluators recognise the context and purpose of the OE activity. See chapter 4—‘Identifying the requirement’ and chapter 5—‘Planning’ for more details.


Description of operational evaluation 1.10

1.10 OE is based on the principal academic definition of evaluation, in that it establishes the merit of something, but it also includes taking action to benefit from that knowledge. OE includes:

• identifying OE requirements based on analysis of capability and preparedness,

• conducting systematic assessments of the OE subjects,

• identifying and learning lessons,

• gaining and using OE knowledge, and

• taking actions to enhance capability and preparedness.

1.11 OE offers the ability to think through, in a logical and ordered manner, the dilemmas of the battlespace, the value placed on them, and their resolution. OE is conducive to an objective, analytical approach and the efficient and effective employment of resources to enhance ADF capability and preparedness. Strategic plans establish OE priorities in accordance with capability and preparedness enhancement requirements and provide guidance to commanders initiating specific OE requirements.

1.12 Objective and/or qualified subjective assessments establish the effectiveness and efficiency of OE subjects and the ADF, and provide recommendations to sustain or improve performance. OE requires the use of qualitative and quantitative methods and professional military judgement to reach conclusions about military activities. For OE assessments and outcomes to have validity the ADF needs to conduct OE in accordance with accepted standards and principles, policy and doctrine.

1.13 The identification and learning of lessons, and the taking of necessary remedial action, is central to OE. Academically, the identification and learning of lessons often signifies the end of the evaluation process. However, OE requires the identification of action(s) to sustain the lesson from a given situation. Lessons are by definition learned, although the extent of immediate learning may be limited, and in a large organisation like the ADF a lesson may not be fully recognised as being learned until action is taken to incorporate the lesson into military capability (for example via changes to doctrine, training or other capability aspects). Access to knowledge capital is pivotal in learning the lesson and sustaining the knowledge gained. The lesson may then be reused by applying its principles to a similar situation, thereby either preventing a relearning of the lesson or resolving an issue.


1.14 OE is conducted at the strategic, operational and tactical levels of command across single-Service, joint, multinational and multi-agency arenas. The OE principles and processes can be applied to any level or scale of activity, and should not be constrained by time, location or situation.

1.15 There are other forms of evaluation conducted within Defence that are similar in some respects to OE, such as:

• portfolio evaluation (see paragraph 1.16);

• evaluation of individual training, as a phase of the Defence Training Model (see Australian Defence Force Publication 7.0.2—The Defence Training Model); and

• Defence test and evaluation, which is focused on capital equipment, see Defence Instruction (General) (DI(G)) OPS 43–1—Defence Test and Evaluation Policy.

1.16 OE complements but is separate to the performance management and monitoring responsibilities that Defence managers and commanders have under the Financial Management Accountability Act 1997. The Act charges Defence with managing its performance and ensuring continuous performance monitoring is complemented by periodic evaluation activities. The Inspector-General of Defence has been directed to conduct portfolio level evaluations that address the appropriateness, effectiveness and efficiency of key Defence functions and activities. For additional information on portfolio evaluation refer to DI(G) ADMIN 16–1—Portfolio evaluation and the Defence Evaluation Manual.

1.17 All Defence evaluation activities contribute to the growth of knowledge that informs capability and preparedness and will often be complementary. The sharing of evaluation outputs from all evaluation activities via a Defence evaluation network will accelerate the growth, leveraging and application of knowledge.

Requirement for operational evaluation 1.18

1.18 The requirement for the ADF to conduct OE stems from the ADF’s need to maintain warfighting superiority, regardless of its relative size. ADDP 3.0—Operations describes six joint functions which group related capabilities and activities to help commanders integrate, synchronise and direct joint operations. The joint functions are command and control, intelligence, offensive action, movement and manoeuvre, force protection, and sustainment. These joint functions provide an important framework for quantifying the OE outcomes and identifying, aggregating and analysing OE knowledge that is consistent with capability and preparedness management processes. OE is an essential element of the command and control joint function which provides commanders and staff with a mechanism to effectively and efficiently transform what is known, in the form of information and intellectual assets, into enduring value and improved operational effects. It is through OE that the ADF gains and exploits a knowledge edge in its three integrated elements—information, knowledge and decision making—and consequently maximises its combat power whilst containing casualties, materiel losses and the costs of operations.

1.19 Success in attainment of the knowledge edge will be reflected in the effectiveness and rapidity with which the ADF evaluates its performance on operations, and disseminates and acts upon operational lessons even when under extreme pressure. The effective acquisition, growth and use of knowledge capital is dependent on OE of the full range of ADF activities including theatre-specific force preparation, deployment, operations, sustainment and redeployment, and all activities associated with raising, training and sustaining the ADF. Effective OE activities provide commanders with a tool to acquire a thorough knowledge of and confidence in the ability of their forces to apply optimum combat power.

1.20 In summary, OE provides knowledge for operational commanders and those responsible for generating the force of today and the future. Through effective OE, the ADF acquires the knowledge to optimise the preparedness of the FIB and enhance the capability of the ADF.

While the number of Australian personnel involved in operations was small in proportion to the overall coalition force, our highly trained and well equipped forces contributed significantly to the success of the mission. And we are continuing to provide much needed support as the nation rebuilds. The ADF personnel involved in operations in the Middle East performed their roles with dedication, effectiveness and compassion. Their success on the ground was due in no small part to the ongoing efforts and preparation of military planners and the civilians who support them. It is a credit to the ADF that it was able to make such a contribution while undertaking numerous other deployments, including Operation CITADEL (UN Peace Keeping in East Timor), Operation RELEX (protecting Australia’s northern borders) and Operation SLIPPER (war against terrorism). Whilst the stabilisation and recovery process in Iraq continues, the lessons we have learned during the war, like the lessons we have learned in other operations, will help to enhance our capabilities, protect our servicemen and women and develop our forces for future operations.

Senator the Honourable Robert Hill, Minister for Defence 2003


Operational evaluation knowledge 1.21

1.21 The ADF gains its knowledge edge through a range of practices that are conducted to identify, create, represent and distribute knowledge for reuse, awareness and learning across the organisation. Collectively these practices are termed knowledge management, which is the explicit control and management of knowledge aimed at achieving the organisation’s objectives; knowing what we know, capturing and organising our knowledge and using it to produce returns. Knowledge management incorporates those actions which support the collaboration and integration of knowledge, and deals with both tacit and explicit knowledge.3 Knowledge management is an emerging Defence capability that is continually evolving.4

1.22 The Defence knowledge domain5 can be thought of as a complex conglomeration of knowledge nodes or domains that will, under a mature knowledge management regime, be capable of seamlessly distributing and sharing knowledge in order to better achieve Defence objectives. Effective knowledge management brings together under one set of practices various knowledge domains including knowledge capital, organisational learning approaches, enabling practices and enabling technologies. Knowledge management and organisational learning are closely related and largely interdependent yet separate disciplines; knowledge management has a greater focus on the management of knowledge capital and the flow of knowledge, whereas organisational learning is concerned with intellectual development.

1.23 OE provides a conduit between Defences knowledge nodes or domains, and is itself a domain and is closely aligned to operations, capability development and preparedness management knowledge. The OE knowledge domain functions to support current activities and to enable analysts to synthesise knowledge over time to support longer term requirements such as capability development. The OE knowledge domain has the following dimensions:

• knowledge capital obtained from specific activities, including evaluation reports, lessons and recommendations;

3 Tacit knowledge is commonly recognised as that which an individual knows subconsciously or internally, whereas explicit knowledge is that which is consciously known, in the public domain, and readily communicated to others.

4 Knowledge management as a discipline is the subject of ongoing academic debate. Australian Government guidance on knowledge management is contained in the Interim Standard on Knowledge Management, 2003, www.standards.com.au and Knowledge Management Better Practice Checklist, produced by the Australian Government Information Management Office, which is available at www.agimo.gov.au/checklists.

5 The Defence knowledge domain is a concept. Defence knowledge management consists of a number of disparate practices that are coordinated to a limited extent.


• leveraged knowledge, which is disseminated, assessed against standards, and subjected to gap analysis; and

• knowledge implemented to enhance capability and improve preparedness.

1.24 It is a characteristic of OE knowledge that the full relevance of knowledge obtained from an activity may not be immediately apparent, or apparent to its primary audience, or may not be apparent until aggregated with other knowledge. One knowledge theory recognises at least two forms of learning, known as single and double loop learning. Single loop learning involves identifying the difference between expected and actual outcomes and modifying performance accordingly. Double loop learning involves questioning the fundamentals of the activity, such as values, assumptions and policies. This theory is useful to explain how OE knowledge may have short, intermediate and long-term applicability when subjected to iterative analysis and learning processes.
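The distinction between the two learning loops can be sketched as follows. This is an illustrative example only, not part of the doctrine; the function names and the numeric correction factors are hypothetical.

```python
# Hypothetical sketch of single- and double-loop learning.
# Single loop: adjust the action to close the gap between expected and
# actual outcomes. Double loop: question the expectation itself when
# outcomes persistently miss it.

def single_loop(expected: float, actual: float, action: float) -> float:
    """Modify the action to reduce the gap between expected and actual outcomes."""
    gap = expected - actual
    return action + 0.5 * gap  # simple proportional correction (illustrative)

def double_loop(expected: float, actual: float, tolerance: float = 0.2) -> float:
    """Revise the expectation when the gap exceeds a tolerance."""
    if abs(expected - actual) > tolerance:
        # Question the governing assumption rather than just the performance.
        return (expected + actual) / 2
    return expected

print(single_loop(expected=1.0, actual=0.6, action=2.0))  # adjusted action: 2.2
print(double_loop(expected=1.0, actual=0.6))              # revised expectation: 0.8
```

The point of the sketch is the asymmetry: single loop learning leaves the goal fixed and corrects behaviour, whereas double loop learning treats the goal itself as open to revision.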

1.25 The OE knowledge environment systematically brings together people, processes, experience and technology to facilitate the exchange of operationally relevant information. OE knowledge is managed to provide for collection, recording, analysis, dissemination, use and archival in an open and unrestrained but not unwieldy knowledge management environment. Evaluation knowledge needs to be accessible before, during and after operational activities. For further discussion of the OE knowledge domain, see chapter 8—‘Operational evaluation knowledge domain’.

Capability and preparedness enhancement 1.26

1.26 Capability. Military capability is the power to achieve or influence operational effects, and is a combination of the force structure and preparedness of the FIB. Preparedness is further elaborated as a combination of readiness and sustainability—the ability to undertake military operations6. A prime function of OE is to enhance capability by informing improvements to force structure and assisting with the management of the preparedness of the FIB. OE may also provide valuable input to the capability development process, by contributing to the assessment of the performance of the current force and that expected of a planned force, including the identification and analysis of capability gaps.

6 For more detail see ADDP 00.2—Preparedness and Mobilisation (Provisional).

1.27 The eight fundamental inputs to capability (FIC) are those tangible inputs considered fundamental to the development and delivery of military capability. OE conducted in terms of FIC provides a comprehensive approach to the enhancement of capability by focusing attention not only on the individual inputs, but also on the combination and integration of FIC in order to achieve operational outcomes. FIC are categorised and broadly defined in terms of OE as follows:

• Personnel. The focus will be on individual training and the competencies required by individuals to perform the functions of their positions. OE will not focus on the performance of individuals, but will be able to provide feedback on employment groups.

• Organisation. OE can assist commanders to identify the correct force structure to achieve the military effects sought, and to ensure that the command and control is appropriate.

• Collective training. The evaluation of collective training is one of the major roles of OE across combined, joint and single-Service programs. OE will assist commanders to validate their collective training regime against preparedness requirements. The OE focus will be on the performance of collective competencies against predetermined standards.

• Major systems. OE can monitor the performance of major systems over time, and particularly their adaptation to new operational theatres and circumstances. OE will provide an important record of performance to inform the major systems’ capability development process.

• Supplies. OE can assess the adequacy of the range of supplies available (eg quantities, serviceability, and suitability) and the efficiency and effectiveness of the supply system.

• Facilities. Facilities are a lesser focus for OE; nevertheless, matters such as the suitability of training areas will be of interest.

• Support. OE can assess aspects of the national support base capabilities within Australia and deployed.

• Command and management. Command and management underpins Defence’s operating and management environments and is therefore a category of particular OE focus. An important role for OE is the validation of written guidance such as instructions, directions, doctrine and tactical level procedures. OE is a command and control function.
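Reporting against the FIC amounts to filing each observation under one of the eight categories above so that gaps in any single input, or in their integration, become visible. A minimal sketch of such a categorised record follows; the data structure and function names are illustrative assumptions, not part of ADFAADS or any real Defence system.

```python
# Hypothetical sketch: filing OE observations against the eight fundamental
# inputs to capability (FIC) named in the doctrine. The store is a plain
# dict keyed by FIC category; this is an illustration, not a real schema.

FIC = (
    "Personnel", "Organisation", "Collective training", "Major systems",
    "Supplies", "Facilities", "Support", "Command and management",
)

def record_observation(store, fic_category, observation):
    """File an OE observation under its FIC category."""
    if fic_category not in FIC:
        raise ValueError(f"unknown FIC category: {fic_category!r}")
    store.setdefault(fic_category, []).append(observation)
    return store

observations = {}
record_observation(observations, "Supplies", "ration scales inadequate")
record_observation(observations, "Supplies", "spares range too narrow")
```

Keying records by category in this way is what lets later analysis aggregate findings per input as well as across inputs.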


1.28 Force structure. Force structure is concerned with generating the FIB required to achieve desired operational effects, and is principally concerned with personnel, equipment, facilities and military doctrine. In the medium to long-term, changes to force structure are generated by the capability development process. OE knowledge needs to be managed and aggregated over time in such a way that it can be applied by the capability development process to adjust force structure as required.

1.29 Preparedness. Preparedness is a measure of the state of the FIB to undertake military operations to achieve the required effects, and includes the components of readiness and sustainability. The maintenance of a large number of force elements (FE) to meet a wide range of potential contingencies is neither practical nor an effective use of limited resources. Consequently FE are assigned various levels of capability7 against their operational preparedness objectives (OPO) on the understanding that they will be able to achieve their operational level of capability (OLOC) within a nominated work up period—their readiness notice (RN).

7 The three levels of capability are: OLOC; directed level of capability (DLOC); and minimum level of capability. For details see ADDP 00.2, chapter 1.

LESSON FROM ADF OPERATIONS IN THE MIDDLE EAST IN 2003

Interpersonal networks. Liaison officers placed in United States headquarters contributed significantly to the success of the planning process. Many had trained or served on exchange postings in the United States. Their effectiveness demonstrated the importance of personal relationships for developing and maintaining the levels of trust and interoperability necessary for effective coalition operations.

Defence will review personnel exchange postings to ensure they provide the best opportunities to support relationships with our partners and allies.


1.30 The preparedness management system (PMS) provides the mechanism to ensure that the FIB maintains appropriate levels of preparedness, including appropriate support arrangements. The PMS involves four phases:

• development of the preparedness requirement through OPO,

• implementation of the preparedness plan,

• assessing and reporting preparedness achieved against OPO, and

• reviewing the preparedness plan and current guidance.

1.31 OE operates as a system within the PMS, particularly in support of PMS implementation, assessment and reporting activities. It is during implementation that FE will undertake training activities to maintain their DLOC. The role of OE is to assess performance during these training activities and report using the OPO and FIC as a reporting structure. Commanders may wish to allocate OE priorities to new preparedness tasks, OPO performance that has not been assessed for some time, or short RN tasks. The primary tasks of OE in support of preparedness are:

• validating preparedness arrangements and in particular the determination of RN,

• providing commanders with an objective means of monitoring preparedness particularly during collective training,

• validating OLOC plans and confirming the achievement of OLOC when ordered, and

• validating the achievement of OLOC based on performance on operations.

LESSON FROM ADF OPERATIONS IN THE MIDDLE EAST IN 2003

Naval gunfire support. The effectiveness and utility of naval gunfire support was confirmed. It provided accurate and timely support to land forces.

Naval gunfire support remains an important and valuable capability, and must remain part of Navy’s operational training and doctrine.


1.32 The Chief of the Defence Force Preparedness Directive provides direction and designates responsibilities for preparedness. Service Chiefs and Chief of Joint Operations (CJOPS) are responsible for implementing preparedness, and preparedness requirements are specified in the Joint Operations Command Operational Preparedness Requirement. Preparedness activities are programmed and managed through the Program of Major Service Activities (PMSA). The PMSA is the program used by CJOPS and the Service Chiefs to manage Defence’s major and/or significant activities to achieve and maintain directed levels of preparedness, meet international engagement objectives, pursue Australia’s military strategy and achieve force development objectives. The PMSA coordinates operations, collective training and other activities such as Defence international engagement commitments. It is through execution of the PMSA that the ADF achieves its directed levels of preparedness. PMSA details are contained within the ADF Activity Management System (ADFAMS), a Lotus Notes database on the Defence Secret Network.

1.33 OE tasks in support of preparedness are planned on the basis of activity information recorded on ADFAMS, principally through the ADF Activity Analysis Database System (ADFAADS). OE is reliant on ADFAMS for details on each activity such as OPO, tasks, conditions, standards and performance measures8. In effect, ADFAADS operates as the OE component within ADFAMS and as such supports the achievement of preparedness. The PMSA includes combined activities and activities involving other government departments, and it is through the sharing of OE knowledge gained from these activities that OE contributes to interoperability with allies and supports the development of a whole of government approach to operations.

Operational evaluation environment 1.34

1.34 The OE environment includes joint and single-Service FE and all Defence elements that support operations and preparedness. Operations and collective training activities are the most important sources of OE knowledge. Other activities, such as international engagement, trials and experimentation are also subjected to OE. The evaluation principles and procedures discussed in this publication are applicable to all activities and may be adapted as required. The conduct of OE in the joint arena is supported by and connected to the lessons captured and coordinated by single-Services and other Defence groups.

1.35 Operations provide special opportunities to gather knowledge because they are not constrained, structured or scripted in the manner of most collective training activities. OE of operations is planned as part of the normal operational planning process, and OE should commence as early as possible in the life of an operation. Operations may need to be supported by a rapid learning loop that identifies, analyses and resolves issues rapidly.

8 Tasks and associated conditions, standards and performance measures are contained within Australian Joint Essential Tasks (ASJETS).

1.36 Collective training activities are conducted primarily to enable FE to achieve DLOC preparedness objectives. The prime role of OE in support of collective training is to confirm the achievement of DLOC.

1.37 Other ADF activities such as international engagement, trials and experimentation are not primarily preparedness activities. However, OE provides a structure to assess and analyse these activities in accordance with OE priorities.

Related activities 1.38

1.38 Planning processes. OE knowledge provides planning staffs with a valuable resource to assist with their initial planning steps, including joint intelligence preparation of the battlefield and mission analysis. For example, intelligence staffs interrogate OE knowledge for information on the environment and previous similar situations. Planning staffs conducting mission analysis refer to OE knowledge to assist with their review of the situation (based on accounts of previous experience) and the identification of critical facts and assumptions. Planning staffs continue to refer to OE knowledge as planning continues.

1.39 Operational analysis. Operational analysis (OA) is the application of scientific and related methodologies to assist commanders to make operational decisions. Operational analysts are usually provided by the Defence Science and Technology Organisation. OA and OE are similar and complementary activities, with some key differences. OE differs from OA in that it uses qualitative as well as quantitative methods and professional military judgement to reach its conclusions, and is conducted by all members of the ADF; OA uses predominantly scientific methods and therefore normally requires the involvement of scientists. OA and OE are mutually supporting activities. For example, OE may identify a task that is suitable for OA, and OA may produce results that should be disseminated, implemented and monitored within the OE process.

1.40 Network Centric Warfare. Network Centric Warfare (NCW)9 is a simple concept that involves linking engagement systems through networks and sharing information between FE. NCW is based on the idea that information is only useful if it allows people to act more effectively. OE has a key role to play in ensuring that information in the form of OE knowledge (in particular lessons from current and previous experience) has been captured, analysed and implemented, and remains accessible to members of the ADF deployed on operations via the NCW information domain. A high priority task during OE is to rapidly identify, analyse and publish lessons so that they can be shared with other deployed FE and the wider ADF via an appropriately linked network.

9 Refer to ADDP–D.3.1—Enabling Future Joint Warfighting Network Centric Warfare.

1.41 Effects based approach. The concept of an effects based approach10 involves applying military power so that national objectives are achieved in the most effective way. It considers the warfighting capabilities of the ADF as a suite of tools that can be applied to crisis situations to achieve effects that accord with the whole of government approach. OE knowledge is accessed at the strategic and operational levels to inform effects based assessments and analysis of crisis situations. For example, OE knowledge assists planners to assess the effects achievable by force elements in a range of environments.

Annex:
A. Evaluation forms and approaches

10 The effects based approach is detailed in ADDP 3.0.

LESSON FROM ADF OPERATIONS IN THE MIDDLE EAST IN 2003

Networking and connectivity. Networked military operations contributed to Coalition success with shared information, intelligence and situational knowledge identified as crucial success factors.

Defence will continue developing its NCW capacity through training, doctrine, equipment acquisition and capability enhancements.


Annex A to ADDP 00.4 Chapter 1

EVALUATION FORMS AND APPROACHES1

Form: Proactive (evaluation for development)

• Purpose: To provide information to assist decisions about a proposed program, policy, procedure, structure etc.
• Timing: Before development (front end/diagnostic).
• Typical issues: Is there a need for the program/policy? What is best practice in this area? What does the research literature tell us about it?
• Key approaches: needs assessment; review of research; review of best practice/creation of benchmarks.
• Assembly of evidence: Review of documents, site visits and focus groups.

Form: Clarificative (design evaluation)

• Purpose: To clarify the structure, operation or delivery of a program, policy, set of procedures, organisation etc.
• Timing: During implementation or operation (formative).
• Typical issues: What is the rationale? What are the intended outcomes and how are they to be achieved? Is it plausible and feasible? What needs to be modified to maximise intended outcomes?
• Key approaches: program logic/theory development; feasibility study; accreditation.
• Assembly of evidence: Document analysis, interviews and observation.

Form: Interactive (process evaluation)

• Purpose: To provide information about implementation, operation or delivery with a view to improvement.
• Timing: During implementation or operation (formative).
• Typical issues: How is it going? Is it operating according to plan? How could it be changed to make it more effective?
• Key approaches: responsive evaluation; action research; quality review; developmental evaluation; empowerment evaluation.
• Assembly of evidence: On-site studies.

Form: Monitoring (program management evaluation)

• Purpose: To assess processes and outcomes with a view to finetuning and accounting for resource usage.
• Timing: During implementation or operation (formative).
• Typical issues: Are defined benchmarks being reached? How is implementation going on different sites? Are costs rising or falling? How is implementation going now compared with the past? How can the program be finetuned to make it more effective?
• Key approaches: component analysis; performance assessment; systems analysis.
• Assembly of evidence: Site visits, access to information systems, use of performance indicators.

Form: Impact

• Purpose: To assess the impact or outcomes of a program.
• Timing: After or at a mature stage of implementation or operation (summative/impact).
• Typical issues: Has it been implemented as planned? Have its goals and objectives been achieved? Have the needs of stakeholders been met? What are the unintended outcomes? Has it been cost effective?
• Key approaches: objectives based evaluation; process outcome studies; needs based evaluation; goal free evaluation; performance audit.
• Assembly of evidence: Use of predetermined research criteria, observation records, quantitative and qualitative data.

1 Adapted from Program Evaluation: Forms and Approaches, 3rd Edition, John M. Owen, 2006, published by Allen & Unwin.


ADDP 00.4 Chapter 2

CHAPTER 2

RESPONSIBILITIES 2

Introduction 2.1

2.1 The Defence management framework is based on cascading accountability for outputs that contribute to the Defence mission—the defence of Australia and its national interests. To achieve this mission Defence develops military capabilities that provide Government with the option to apply military power when required. Military capability provides the power to achieve certain operational effects. Defence capability encompasses force structure and preparedness, and is managed through a series of documents that include Australia’s Military Strategy, the Chief of the Defence Force Preparedness Directive and the Joint Operations Command Operational Preparedness Requirement (JOCOPR). Commanders and managers at all levels are required to plan, apply, measure, monitor and evaluate those elements of capability for which they are responsible. Ultimately, it is command accountability and responsibility for capability outputs that drives the requirement for OE.

Executive summary

• Responsibilities for operational evaluation (OE) stem from the accountability that all commanders have for the capability outputs of their command.

• All commanders are responsible for evaluating the performance of force elements (FE) under their command during all activities in which they participate.

• OE is initiated at the highest level of command associated with an activity.

• OE planning is conducted by the headquarters planning the activity and by an OE team if established.

• The conduct of OE is usually the responsibility of the officer who commands the area in which the activity is occurring.

• Responsibilities for OE implementation, monitoring and review are widespread and require oversight at the highest levels of command.


2.2 Specific responsibilities for conducting OE are contained in the JOCOPR. The JOCOPR includes the requirements for FE to be prepared to perform a role and achieve an operational outcome that meets the requirements of various military response options. In this context, the focus for OE is not only on the ability of FE to conduct combat operations but also encompasses the ability of the force to be sustained, an aspect of capability that is often overlooked during collective training. Notwithstanding particular OE activities, commanders are expected to exercise their professional military judgement in relation to assessing the preparedness of FE under their command.

2.3 Commanders are required to conduct OE in accordance with the ADF’s evaluation system that requires all joint activities to be evaluated and reported via the Australian Defence Force Activity Analysis Database System (ADFAADS). ADF operational evaluation policy is contained in Defence Instruction (General) OPS 41–1—Australian Defence Force Activity Analysis Database System. The system addresses routine internal OE requirements, conducted by activity participants and OE teams established by the local commander. Additional requirements that involve external OE teams are directed by higher commanders as required.

Responsibilities 2.4

2.4 The Chief of Joint Operations (CJOPS) is responsible for OE policy and for issuing guidance to commanders on specific OE requirements. This guidance is normally provided in the JOCOPR. CJOPS is assisted by the Commandant and OE staff at the Australian Defence Force Warfare Centre (ADFWC).

2.5 Responsibilities for OE are specified for each phase in the OE model—identifying requirements, planning, conduct, and implementation, monitoring and reviewing. See chapter 3—‘Operational evaluation framework’ for a detailed description of the OE model. The officer initiating the requirement for OE will usually delegate responsibility for the other phases, but retain overall responsibility for all phases.

LESSON FROM ADF OPERATIONS IN THE MIDDLE EAST IN 2003

Unmanned aerial vehicles. Unmanned aerial vehicles (UAVs) were force multipliers through all phases of combat operations.

The Australian Government Department of Defence intends to move quickly to develop and integrate UAV capabilities into Australian Defence Force (ADF) doctrine, planning and operations.

2.6 Initiating requirements. Officers responsible for initiating OE requirements are responsible for issuing an initiating instruction (see chapter 4—‘Identifying the requirement’ for details). Responsibilities are as follows:

• Operations. CJOPS is responsible for authorising OE activities conducted within an operational theatre. CJOPS may also initiate the OE requirement, or delegate that responsibility to the Deputy Chief of Joint Operations (DCJOPS) or another commander.

• Collective training.1 The officer scheduling the exercise (OSE) is responsible for initiating OE requirements.

• Other activities. For all other activities the officers scheduling those activities2 are responsible for initiating OE as required.

2.7 Planning. The initiating instruction will nominate the officer or agency responsible for planning OE (see chapter 5—‘Planning’ for details). For internal OE, planning is done by the lead planning headquarters, during the course of planning the activity. When an OE team is to be established, the OE team leader is assigned responsibility for planning by the officer initiating the requirement.

2.8 Conduct. The initiating instruction will nominate the officer responsible for the conduct of OE (see chapter 6—‘Conduct’ for details). Responsibilities are as follows:

• Operations. Responsibility for conduct of internal OE resides with commanders of each FE assigned to and supporting operations. Responsibility for OE conducted by external OE teams may be delegated to the theatre commander, joint task force/group commander or national commander. Technical responsibility for conduct will remain with OE team leaders when appointed. Perceptions of objectivity and the requirement for collaboration in theatre need to be taken into account when selecting the officer responsible for conduct.

1 See Australian Defence Force Publication 7.0.3—Exercise Planning and Conduct for further information on the responsibilities of the OSE.

2 Details of scheduling officers for collective training and other activities are recorded on activity pages on the Australian Defence Force Activity Management System.


• Collective training. Responsibility for conduct of internal OE resides with commanders of each FE assigned to and supporting an exercise. Responsibility for OE conducted by OE teams may be delegated to the officer conducting the exercise or the exercise director. Technical responsibility for conduct will remain with the OE team leader.

• Other activities. For all other activities the officer responsible for conducting the activity is responsible for OE conduct.

2.9 Implement, monitor and review. The responsibilities for this phase of the OE model are more widespread than those for the other phases (see chapter 7—‘Implementation, monitoring and reviewing’ for details).

• Implementation. The overall responsibility for implementation of specific OE recommendations resides with the officer who initiated the OE requirement. This responsibility may be delegated to the officer responsible for conduct of the OE or other subordinate commander(s). Action organisations and officers responsible for specific OE recommendations are nominated in the recommendation, summarised in the activity report and recorded on the ADFAADS issue resolution form. See chapter 10—‘Australian Defence Force operational evaluation system’ for details of ADFAADS. Commanders are responsible for ensuring that actions assigned to their organisation have been completed. Longer term implementation responsibilities are less well defined. In general terms, officers responsible for doctrine, training, preparedness and other aspects of capability are responsible for implementing the results of their analysis of OE knowledge. Higher level managers are responsible for ensuring that OE knowledge is fully exploited.

• Monitoring. Monitoring performance and the impact of changes arising from OE recommendations is the responsibility of commanders at all levels. Commanders may wish to task OE teams to assist with this monitoring task.

• Reviewing. Officers responsible for Defence programs are responsible for reviewing their programs and the impact of OE on their programs.
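The implementation mechanics described above reduce to a simple tracking pattern: each recommendation names an action organisation, and the responsible commander confirms completion. The sketch below illustrates that pattern only; the field names are hypothetical and do not represent the real ADFAADS issue resolution form.

```python
# Hypothetical sketch of issue-resolution tracking of the kind para 2.9
# describes. Each OE recommendation carries a nominated action organisation
# and a completion flag. Illustrative only — not the ADFAADS schema.

from dataclasses import dataclass

@dataclass
class IssueResolution:
    recommendation: str
    action_org: str
    completed: bool = False

    def close(self):
        """The action organisation confirms the assigned action is done."""
        self.completed = True

issue = IssueResolution(
    recommendation="Validate readiness notice for short-notice tasks",
    action_org="Headquarters Joint Operations Command",
)
issue.close()
```

Tracking each recommendation as an explicit record is what allows higher level managers to verify, per the doctrine, that OE knowledge has actually been exploited rather than merely reported.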

2.10 Operational evaluation training. DCJOPS is responsible for analysing the requirement for OE training and evaluating OE training. The Commandant ADFWC is responsible for OE training design, development and conduct.


Operational evaluation organisations and tools 2.11

2.11 In the absence of extensive OE resources, commanders are heavily reliant on internal evaluation and their own professional military judgement to meet their OE responsibilities. Some of the organisations and tools available to support commanders are as follows:

• Australian Defence Force Activity Analysis Database System. ADFAADS is a semi-automated system that supports all phases of the OE model. ADFAADS enables the ADF to record, analyse and disseminate OE data within a structured database environment (see chapter 10).

• Australian Defence Force Warfare Centre evaluation staff. The ADFWC evaluation staff facilitates OE throughout the ADF by supporting internal evaluation activities, providing OE teams, assisting with implementation of approved OE recommendations, providing specialist OE advice and standard operating procedures, and managing ADFAADS.

With the redeployment of forces to Australia, the work of evaluating the performance of Defence, reconstituting forces, and returning our forces to operational readiness began. The high level of ADF commitments—some 3800 ADF personnel deployed on 10 operations around the world in July 2003—meant that returning forces had to be restored to operational readiness as soon as practicable. In a tribute to its professionalism, the ADF managed to reconstitute its returned forces quickly while responding to new challenges and additional operational commitments. One such additional commitment began on 24 July 2003, when an ADF contingent of about 1500 personnel, ships and aircraft, together with Australian Federal Police and Australian Protective Service personnel, provided regional assistance in a mission to the Solomon Islands in Operation HELPEM FREN. In preparation for future eventualities, a whole-of-Defence review was undertaken so lessons could be learned from our involvement and from the experiences of our Coalition partners.

The War in Iraq, ADF operations in the Middle East in 2003

• Maritime Operational Analysis Centre. The Maritime Operational Analysis Centre (MOAC) mission is to improve the operational effectiveness of the Royal Australian Navy’s (RAN) fleet-in-being by measuring current capability, to provide objective evidence upon which to base informed decisions at many levels of Navy capability. Deficiencies revealed during the measurement of capability can lead to tactics improvements, identification of training shortfalls and/or proposals for future equipment fixes. To achieve its mission, MOAC applies independent and objective operational analysis to:

– support RAN tactical development;

– undertake critical issues investigation, including operational analysis assistance for operations;

– support planning and analysis of operations and exercises; and

– measure and define capability, which can be used for capability development of the fleet-in-being and enhanced fleet.

• Centre for Army Lessons. The mission of the Centre for Army Lessons (CAL) is to collect, analyse, store and disseminate Army lessons in order to enhance war fighting capability. CAL collects and analyses lessons in order to identify current capability gaps, support learning and doctrine development and stimulate the production of future land warfighting concepts. CAL’s tasks include:

– facilitating learning from lessons;

– undertaking qualitative analysis;

– collating Army lessons and related information; and

– publishing a range of lessons-orientated publications.

• Headquarters Air Command evaluation staff. Headquarters Air Command (HQAC) capability evaluation staff is responsible for providing Air Commander Australia with advice on OE, developing and maintaining OE orders and instructions, managing ADFAADS on behalf of HQAC and providing ADFAADS training services to the Royal Australian Air Force.


CHAPTER 3

OPERATIONAL EVALUATION FRAMEWORK

Introduction

3.1 OE has been derived from well established civilian evaluation theory and practice, as described in chapter 1—‘Introduction to operational evaluation’. However, civilian evaluation arrangements do not entirely suit OE circumstances in the Australian Defence Force (ADF), and therefore a modified framework has been adopted. The OE framework includes a set of standards and a process model. This framework is intended to codify OE without being overly prescriptive or demanding slavish adherence from evaluators. Leveraging1 OE knowledge capital and functioning within a collaborative environment are also particular characteristics of OE discussed in this chapter.

Standards

3.2 Commonly accepted civilian evaluation standards are grouped within four areas—accuracy, feasibility, propriety and utility, and ADF OE standards are a modified version of these standards. The major variation is in relation to propriety or ethical standards. The ADF has a code of ethics that applies to the entire organisation, and the requirement for ethical conduct is an inherent expectation in all ADF activity. The ADF OE standards are grouped under the following headings:

• accuracy,

• feasibility,

• accountability, and

• utility.

Executive summary

• Operational evaluation (OE) standards address accuracy, feasibility, accountability and utility issues. Standards ensure that OE activity is sound and defensible, and provide a framework for reviewing OE practice.

• The OE model provides a transparent system for the conduct of OE activities and ongoing monitoring and review of actions taken.

• OE knowledge is leveraged to inform capability and preparedness enhancement.

• An open, collaborative environment best enhances the learning obtained from OE.

1 Leveraging refers to the advantage that can be gained by exploiting knowledge capital.

3.3 Accuracy. Accuracy standards determine whether OE has produced a truthful outcome, whether the information provided is valid and credible, and whether conclusions and recommendations are logical and reflect only the evidence obtained. Evidence is obtained via impartial, verifiable, transparent, and endorsed methods. Transparent OE has a higher chance of being accurate because stakeholders will be able to scrutinise the plan, monitor its implementation and query any aspects that appear to lack accuracy. Accuracy standards address the following issues:

• Is the OE task defined clearly and accurately?

• Does the OE plan accurately reflect the task?

• Are the sources of information comprehensive and therefore more likely to lead to accurate conclusions?

• Are the methods of gathering information appropriate such that they will lead to reliable outcomes?

• Is information gathered being progressively and systematically reviewed to verify accuracy and completeness?

• Have conclusions been explicitly explained and related back to evidence obtained?

• Does reporting reflect only the evaluation findings and is it devoid of bias?

3.4 Feasibility. Feasibility standards relate to the balance between the OE outcome and the evaluation effort required to achieve the outcome. OE must be adequately resourced to achieve expected outcomes, and limited OE resources need to receive tailored tasking. Under-resourced OE will produce incomplete outcomes that may lack credibility. The onus is on evaluators to declare their limitations rather than feed unrealistic expectations. Feasibility standards address the following issues:

• Can the OE be conducted without undue disruption to the subject of OE?

• Are all relevant OE subject groups going to be accessible and cooperative?

• Are appropriately trained and experienced evaluators available to be assigned to the OE task?


• Can the outcomes sought be achieved with the OE resources available?

• Do the OE resources required for the task represent value for money?

• Is the OE task of appropriate priority?

• Are the evaluators competent to perform the task?

• Are the reporting mechanisms and milestones going to provide timely information?

3.5 Accountability. The conduct of OE is a command responsibility and therefore commanders are accountable for all aspects of OE. Evaluators provide a service to commanders and do not conduct OE on their own behalf or to meet their own objectives. Command responsibility is assigned to all phases of OE. Accountability standards address the following issues:

• Has the plan been endorsed by the commissioning commander?

• Has the OE been commissioned by a commander with appropriate authority and accountability for the conduct and outcomes?

• Is the OE conduct being monitored by a team commander and is the commissioning commander being progressively back briefed?

• Have appropriate commanders been involved in decision making related to OE recommendations?

• Have appropriate commanders and staff been identified to manage implementation of outcomes?

3.6 Utility. Utility standards are concerned with ensuring that OE is informative, timely and influential. OE uses scarce resources and therefore OE tasking needs to be prioritised to target subjects where a higher return is likely. Timeliness is critical where ongoing operations are concerned. The utility of OE may not always be immediately apparent, so a degree of judgement may need to be applied in some cases where utility is questioned. Utility and feasibility are related standards in that OE with reduced feasibility is likely to be of reduced utility. Utility standards address the following issues:

• Have all appropriate stakeholders been engaged in planning and specified their needs?

• Is the logic used to arrive at findings sound?

• Is the OE likely to provide useful lessons?


• Are reports clear, comprehensive and timely?

• Have the wider implications of the findings been identified?

3.7 Using standards. The standards can be used as a guide or checklist by commanders and evaluators to ensure their practices are sound and defensible. OE outcomes are not always welcomed, and the provision of accurate yet controversial information can lead to dispute between evaluators, OE subjects and stakeholders, so the defensibility of OE is important. The standards can also be used as a framework for reviewing OE practice, either as a formal quality assurance check or informally as a self-evaluation activity conducted by evaluators. This evaluation of an OE task is known as meta-evaluation, and is discussed in more detail in chapter 9—‘Meta-evaluation’.

Operational evaluation model

3.8 Despite the variations in forms and approaches (see chapter 1), OE will generally follow a standard process, as represented in the model in figure 3–1. The model illustrates a closed loop system that uses the products of previous iterations to inform future iterations. The phases are sequential. Identifying requirements, planning and conduct will usually be linked and limited in time, whereas the implement, monitor and review phase is an ongoing activity not limited in time or to one OE activity/event.

3.9 Identify the requirement. The broad requirement to evaluate is discussed in chapter 1. The identification of specific OE requirements is a command responsibility. Command direction is required because of the resources involved and the agreement, collaboration and cooperation needed from the subject of OE. Specific requirements are initiated with an instruction, directive or terms of reference. Typical generic issues that may lead to an OE being initiated are listed in chapter 1, annex A. Specific requirements may arise from various circumstances, such as:

• the introduction to service of a new capability;

• adoption of a new tactic, technique or procedure;

• deployment to a new operational theatre;

• exercising a high readiness capability;

• employment of a new warfighting concept; and

• issues identified as part of the OE monitor and review activity.


Figure 3–1: Operational evaluation model

3.10 Identifying the requirement is discussed in more detail in chapter 4—‘Identifying the requirement’.

3.11 Plan. The planning phase commences on receipt of an initiating instruction. The first planning step is to review the initiating instruction and research relevant OE knowledge. Stakeholder input is obtained, focusing on their particular needs. Detailed planning involves identifying the appropriate form and approach, selecting the most appropriate data collection methods and sources, plotting the task timeline and allocating resources to OE tasks. Noting that activity participants are required to routinely conduct OE within their own resources, the plan needs to determine whether an OE team, drawn from internal or external sources, is to be established (unless directed in the initiating instruction). The initiating commander and appropriate stakeholders are back briefed on the plan. Planning is discussed in more detail in chapter 5—‘Planning’.

3.12 Conduct. The conduct phase involves implementing an OE plan. During the activity, collection of evidence and preliminary analysis occurs concurrently so that data can be checked for accuracy and completeness and any unforeseen trends can be identified and followed up. Preliminary and final reporting is also part of this phase, and may include progressively released reports, interim reports dealing with specific issues that may need early action, draft reports circulated to stakeholders for their concurrence, and final reports to the initiating authority.

3.13 OE findings from the conduct phase will be recorded and reported in terms of evidence, conclusions, recommendations and lessons, as follows:

• Evidence. Evidence is the data and other information that has been collected during the OE. It may consist of qualitative data (for example records of interviews or observation sheets) or quantitative data (for example records of consumption rates or casualty rates).

• Conclusions. Conclusions are the synthesis of data and information into meaningful statements about the subject.

• Recommendations. Recommendations are derived from conclusions, are supported by evidence, and identify specific actions to be undertaken to effect change. The resource implications of recommendations are to be identified.

• Lessons. A lesson is what has been learned from experience that may be relevant to others. A lesson is explicit knowledge. The Macquarie Dictionary defines a lesson as ‘a useful or salutary piece of practical wisdom imparted or learned’ and ‘something from which one learns or should learn, as an instructive or warning example’.
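The relationship between the four reporting elements above can be illustrated with a minimal record structure. This is a hypothetical sketch only: the field names and the example content are illustrative, and ADFAADS defines its own schema for recording findings.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OEFinding:
    """Illustrative record of the four reporting elements in paragraph 3.13."""
    subject: str                                               # what was evaluated
    evidence: List[str] = field(default_factory=list)          # qualitative or quantitative data
    conclusions: List[str] = field(default_factory=list)       # synthesis of the evidence
    recommendations: List[str] = field(default_factory=list)   # specific actions to effect change
    lessons: List[str] = field(default_factory=list)           # explicit knowledge for others

    def is_supported(self) -> bool:
        # Recommendations are derived from conclusions and supported by
        # evidence, so a recommendation should never stand without evidence.
        return not self.recommendations or bool(self.evidence)

finding = OEFinding(
    subject="Force protection measures",
    evidence=["Observation sheets from convoy serials", "Casualty rate records"],
    conclusions=["Electronic self-protection reduced exposure to attack"],
    recommendations=["Extend electronic self-protection fit to remaining aircraft"],
    lessons=["Layered physical and electronic protection is mutually reinforcing"],
)
assert finding.is_supported()
```

The `is_supported` check mirrors the accuracy standard that conclusions and recommendations must reflect only the evidence obtained.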

3.14 Conduct is discussed in more detail in chapter 6—‘Conduct’.

3.15 Implement, monitor and review. The implement, monitor and review phase is a challenging phase because it calls for ongoing effort by commanders and their staff to manage change, leverage knowledge and apply knowledge. Recommendations to be implemented require the assignment of action officers, resources, time frames and reporting schedules, and need to be tracked by an appropriate agency. Monitoring follows up on implementation to ensure that change achieves the outcome sought, and identifies any unintended consequences. The purpose of reviewing is to take an aggregated look at change arising from OE and the accumulation of knowledge to confirm that the knowledge edge is being achieved and maintained. For example, a review may focus on one or more force elements, joint functions or fundamental inputs to capability, or look at specific programs or concepts. Outcomes of the implement, monitor and review phase may generate new OE requirements. The implement, monitor and review phase is discussed in more detail in chapter 7—‘Implementation, monitoring and reviewing’.


3.16 The practical application of the OE model to the current OE system is described in chapter 10—‘Australian Defence Force operational evaluation system’.

Leveraging knowledge capital

3.17 It is inherent in the OE model that knowledge obtained from various OE activities can be leveraged through the implement, monitor and review phase in order to bring about improvements in capability and preparedness and attainment of the knowledge edge. In practice this process of learning cannot be effectively achieved simply through osmosis. The leveraging and management of knowledge needs to be supported by systematic processes that convert knowledge capital into knowledge applied to capability and preparedness enhancement. To be fully effective, the OE knowledge management system needs to operate in concert with the wider Defence knowledge management environment.

LESSON FROM ADF OPERATIONS IN THE MIDDLE EAST IN 2003

Force protection. Force protection remains a most important factor in modern combat operations and requires effective operational planning and use of all available measures to reduce the risk of casualties. The ADF used various measures to improve force protection, including armoured vehicles, body armour, preventative health counter measures, close air defence on Royal Australian Navy ships, electronic warfare self-protection equipment on Royal Australian Air Force aircraft, and electronic systems indicating the location of friendly forces. Coalition partners’ experience reinforced the importance of effective force protection, especially in urban fighting.

Defence will continue to incorporate appropriate force protection measures in operational planning and capability procurement decisions. This will include the appropriate balance of physical and electronic force protection measures, including the use of armour, weapons systems, defensive sensors and a fully integrated battlespace management system to help forces survive and achieve their mission.


Operational evaluation culture

3.18 The development of an OE culture is fundamental to the growth of the ADF as a learning organisation, and the two concepts are inextricably linked. A learning organisation sees itself as a producer as well as a user and transmitter of knowledge, and OE is an integral component of this activity. An open, transparent, integrated and collaborative OE environment will promote the Defence-wide adoption of an OE culture. In such an environment the following will occur:

• knowledge capital will be freely accessed, shared, leveraged and applied at all levels;

• the value of intellectual assets invested in analysis at all levels will be realised in the form of capability improvements;

• decision making processes will routinely draw on knowledge capital;

• OE will be tasked to inform important decisions;

• OE will be recognised as an activity that makes a valuable contribution to capability and preparedness;

• OE will be highly interactive with stakeholders and subjects such that they are participants and not just observers in the process, and take ownership of the outcomes; and

• OE will be valued for the quality of its findings.

In a collaborative environment there will be top-down and bottom-up monitoring and facilitation of knowledge-related activities.


CHAPTER 4

IDENTIFYING THE REQUIREMENT

Introduction

4.1 This chapter addresses the deliberate process of identifying an OE requirement and initiating an OE task, which is the first phase of the OE model (see figure 4–1). Commanders and their staffs should be constantly on the lookout for evaluation tasks, because it is through the OE process that the opportunity to learn and build knowledge capital in a structured manner can be optimised. OE involves scarce resources, requires careful planning and is potentially intrusive, so command authority is required to agree to an OE requirement and initiate an OE task. This does not preclude anyone from proposing an OE task.

Requirements

4.2 The requirement to evaluate is derived from the Australian Defence Force’s (ADF) need to enhance capability and preparedness. This leads to pursuit of the knowledge edge, which requires the ADF to become a learning organisation. Effective OE is fundamental to military learning. The notion of continuous improvement is consistent with this approach to learning, and a feature of continuous improvement is learning from past lessons.

Executive summary

• Specific requirements can arise from a range of operational, preparedness and capability related circumstances.

• Operational evaluation (OE) requirements are specified in detail to ensure a worthwhile return from the OE effort. An initiating instruction format details the essential information to be addressed.


Figure 4–1: Operational evaluation model—identifying the requirement

4.3 Specific OE requirements are routinely identified by commanders and staff. OE requirements may arise from various sources related to operations, preparedness activities and capability enhancement.

4.4 Operations. Operations provide important opportunities to learn about ADF performance in particular environments, and to assess performance under operational conditions. OE tasking needs to recognise the various limitations that may apply to gathering evidence in theatre, such as limited access to personnel, locations and operational events. Overly intrusive OE activities need to be avoided. The full support of operational commanders for the OE activity should be established early in the planning process.


4.5 Preparedness. Evaluating preparedness is an ongoing OE requirement as part of the preparedness management system (PMS)1. Commanders responsible for preparedness are required to program training activities to ensure that the directed level of capability (DLOC) is maintained. Routine assessments of DLOC are required and commanders will generally use their professional military judgement to make such routine assessments. Commanders can initiate an OE task to review high priority aspects of preparedness.

4.6 Capability enhancement. To make a positive contribution to capability enhancement, OE tasking should focus on specific capability aspects. Preparedness aspects of capability are addressed within the PMS. Other capability-related OE activities can look at force structure issues. The fundamental inputs to capability (FIC) provide a useful construct for enunciating specific OE capability-related tasks and reporting findings.

Australia’s contribution to the success of major combat operations highlighted areas of performance—such as the employment of modern precision weapon systems—that were very effective and should be sustained for future operations. The evaluation also noted areas that were effective, but needed improvement. These areas include:

• aspects of planning for operations,

• managing rapid equipment acquisition,

• policy development, and

• communication support and information management.

There were also areas that needed to be addressed to improve our performance on future operations. These included force protection and ways to sustain our forces once deployed.

The lessons learned by Defence from the ADF’s operations in Iraq continue to influence Government decisions on capability acquisition, support and development. Application of these lessons will ensure that our forces continue to be well trained, equipped and led to defend Australia and its national interests.

The War in Iraq, ADF Operations in the Middle East in 2003

1 For details of the preparedness management system see Australian Defence Doctrine Publication 00.2—Preparedness and Mobilisation (Provisional).


4.7 The following list provides some examples of specific OE requirements:

• potential operational situations, and the type of force required;

• emerging operational concepts and their applicability to the ADF;

• ADF deployments, particularly to new and unfamiliar operational theatres;

• training activities in support of high priority preparedness requirements;

• the fielding of new capabilities;

• capability trials and experiments; and

• the exercising of new warfighting concepts, tactics, techniques or procedures.

Initiating operational evaluation requirements

4.8 OE will be most effective when clear direction is provided on the purpose and end state sought, and when the task has been planned to employ suitable approaches2 and assemble the right evidence. OE tasks are initiated by a commander who has authority over the OE subject and can allocate sufficient resources to the task.

4.9 An OE task is initiated with an instruction or terms of reference that provides the commander’s intent and specific direction in relation to the task. Apart from directing the evaluators, the instruction advises the OE subject and stakeholders of the task, and therefore is distributed widely. The instruction contains sufficient information to enable the OE task to be planned in detail, and provides a reference point for reviewing the findings. The essential elements of an OE instruction are as follows:

• What is the origin of the requirement? Sufficient background needs to be provided to set the context for the task and explain its purpose.

• What is to be evaluated? The OE subject is specified in detail, be it a policy, tactic, force element, process, joint function, FIC aspect and so on.

• What is the end state sought? Understanding the end state by all concerned will help establish a cooperative environment for OE. The knowledge expected or anticipated from OE is to be specified.

2 See chapter 1, annex A for information on OE’s various approaches.


• Who are the stakeholders? The stakeholders are to be consulted during planning and may be engaged during conduct of the OE.

• What are the essential information requirements? The particular areas that are going to be the subject of scrutiny need to be known. This will help shape the plan.

• What reports are required, in what form and when? The reporting regime will shape the evidence collection and analysis plan developed by the evaluators and indicate when information can be expected.

• What are the command and control arrangements? The lines of command and control need to be explicit so that the evaluators have appropriate freedom of manoeuvre and there is no unnecessary disruption to their work. The initiating commander retains overall authority over the OE task. The OE team leader (and team members if known) are identified and given authority to liaise directly with the OE subject and other stakeholders.

4.10 An outline format for an OE initiating instruction is in annex A.

Annex:
A. Operational evaluation initiating instruction—outline format


Annex A to ADDP 00.4 Chapter 4

OPERATIONAL EVALUATION INITIATING INSTRUCTION—OUTLINE FORMAT

Background and context

1. What led to the operational evaluation (OE) requirement?

2. What are the references?

3. What is to be evaluated?

4. Who are the stakeholders?

Commander’s intent

5. What is the purpose of the OE?

6. What method is to be employed?

7. What is the end state sought through conduct of the OE?

Objectives

8. What are the high level objectives?

9. What are the commander’s essential information requirements?

Scope

10. What geographic and other limitations apply to the OE?

11. What is the time frame allowed for planning and conduct of the OE? What are the milestones?

12. What issues may affect the performance of the OE task?

Products

13. What reports are required, when and in what form?

Approaches

14. What approaches to the OE are anticipated?

15. What forms of evidence are anticipated?


Operational evaluation team

16. Who is the team leader?

17. What is the team’s composition and from where is the team to be drawn?

18. What skills is the team to possess?

19. What are the team’s responsibilities?

Implementation arrangements

20. What are the key timings?

21. What are the command and control arrangements?
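The outline above is a structured format, so it can also be captured as a simple machine-readable template, for example for checking that a draft instruction addresses every heading. This is an illustrative sketch only; the section keys and the coverage check are hypothetical, not part of the doctrine.

```python
# Annex A headings and their prompting questions, rendered as a template.
# Keys are illustrative identifiers, not official terminology.
INITIATING_INSTRUCTION_OUTLINE = {
    "background_and_context": [
        "What led to the operational evaluation (OE) requirement?",
        "What are the references?",
        "What is to be evaluated?",
        "Who are the stakeholders?",
    ],
    "commanders_intent": [
        "What is the purpose of the OE?",
        "What method is to be employed?",
        "What is the end state sought through conduct of the OE?",
    ],
    "objectives": [
        "What are the high level objectives?",
        "What are the commander's essential information requirements?",
    ],
    "scope": [
        "What geographic and other limitations apply to the OE?",
        "What is the time frame allowed for planning and conduct of the OE?",
        "What issues may affect the performance of the OE task?",
    ],
    "products": ["What reports are required, when and in what form?"],
    "approaches": [
        "What approaches to the OE are anticipated?",
        "What forms of evidence are anticipated?",
    ],
    "operational_evaluation_team": [
        "Who is the team leader?",
        "What is the team's composition and from where is the team to be drawn?",
        "What skills is the team to possess?",
        "What are the team's responsibilities?",
    ],
    "implementation_arrangements": [
        "What are the key timings?",
        "What are the command and control arrangements?",
    ],
}

def missing_sections(draft: dict) -> list:
    """Return outline headings that a draft instruction has not yet addressed."""
    return [s for s in INITIATING_INSTRUCTION_OUTLINE if s not in draft]
```

A staff officer drafting an instruction could run `missing_sections` over a draft keyed by the same section names to confirm coverage before release.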


CHAPTER 5

PLANNING

Introduction

5.1 Planning for OE is a dynamic process that builds on the initiating instruction and engages stakeholders in refining the task. Planning is the second phase in the OE model (see figure 5–1). The dynamic nature of evaluation is such that plans are not overly restrictive. Evaluators need to be attuned to changing or unforeseen circumstances and be prepared to seek approval to modify the plan accordingly.

5.2 OE planning occurs as an integral part of planning for operations, collective training and other activities, and is rarely planned in isolation. OE staffs are included in activity planning groups from the outset, and advantage should be taken of their research skills in identifying relevant lessons and other information.

5.3 Australian Defence Force operational evaluation system. If a specific OE task has not been initiated for an activity, evaluation will still proceed on the basis of the ADF’s OE system. The Australian Defence Force Activity Analysis Database System (ADFAADS) is the core element of a standing plan within the ADF to identify and record lessons arising from ADF activities, facilitate analysis of issues associated with those lessons, generate action to resolve the issues and provide a medium for post activity reports to be recorded. ADFAADS functions as a knowledge repository for OE information, with the added benefit of being an interactive database that automatically progresses and monitors issues through to their resolution via a series of automatically generated email messages. Further details are provided in chapter 10—‘Australian Defence Force operational evaluation system’.
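The closed-loop behaviour described above—an issue is progressed through defined states to resolution, with each transition generating a notification—can be sketched as a small state machine. This is an illustrative model only: ADFAADS internals are not described in this publication, and the status names, transition rules and notification text here are all hypothetical.

```python
from enum import Enum

class Status(Enum):
    RAISED = "raised"
    UNDER_ANALYSIS = "under analysis"
    ACTION_ASSIGNED = "action assigned"
    RESOLVED = "resolved"

# Permitted forward transitions in this hypothetical lifecycle.
TRANSITIONS = {
    Status.RAISED: {Status.UNDER_ANALYSIS},
    Status.UNDER_ANALYSIS: {Status.ACTION_ASSIGNED},
    Status.ACTION_ASSIGNED: {Status.RESOLVED},
    Status.RESOLVED: set(),
}

class Issue:
    """An issue progressed to resolution, notifying its action officer at each step."""

    def __init__(self, title: str, action_officer: str):
        self.title = title
        self.action_officer = action_officer
        self.status = Status.RAISED
        self.notifications = []  # stand-in for automatically generated email messages

    def progress(self, new_status: Status) -> None:
        if new_status not in TRANSITIONS[self.status]:
            raise ValueError(f"Cannot move from {self.status} to {new_status}")
        self.status = new_status
        self.notifications.append(
            f"To {self.action_officer}: '{self.title}' is now {new_status.value}"
        )

issue = Issue("Review convoy drills", "lessons staff officer")
for step in (Status.UNDER_ANALYSIS, Status.ACTION_ASSIGNED, Status.RESOLVED):
    issue.progress(step)
```

The transition table is what makes the loop auditable: an issue cannot be marked resolved without passing through analysis and action assignment, which parallels the monitoring role the paragraph attributes to ADFAADS.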

Executive summary

• Operational evaluation (OE) planning is integral to planning Australian Defence Force (ADF) activities.

• The OE planning process follows the basic steps of the joint military appreciation process (JMAP).

• Consultation and negotiation between evaluators and stakeholders are key features of planning.

• OE plans provide a clear statement of the OE purpose, method and end state.


Figure 5–1: Operational evaluation model—planning

Planning

5.4 OE planning is undertaken by headquarters OE staffs and/or OE teams when established. Whilst much of the technical detail can be prepared separately, OE plans are the result of consultation, collaboration and negotiation between evaluators and stakeholders. It is during the planning phase that evaluators establish a strong relationship with the OE subject and ensure their commitment to and support for the OE. Information requirements, sources and methods of collecting evidence need to be discussed and agreed before the OE can proceed. For this reason evaluators need to have strong interpersonal and negotiation skills. Evaluators may also take the opportunity to sell the benefits of OE to stakeholders not well versed in the discipline.

5.5 OE plans should provide sufficient detail to expand on the initiating brief and indicate the following:

• What are the objectives?


• What are the information requirements?

• How is the information going to be collected and from what sources?

• What are the key timings and reporting milestones?

• What are the resources required?

Planning process 5.6

5.6 OE planning is fundamentally the same as planning any military activity and follows the basic JMAP. The same basic planning steps apply, including joint intelligence preparation of the battlespace (modified for OE purposes and called the OE preparation of the battlespace (OEPB)), mission analysis, course of action (COA) development, COA analysis, and decision and execution. The OE planning process is illustrated in figure 5–2 and includes the following key features:

• Operational evaluation preparation of the battlespace. Evaluators familiarise themselves with the OE environment from cognitive and physical perspectives. They undertake whatever research is necessary to provide them with appropriate knowledge to undertake the OE task. They may need to consult with or recruit subject matter experts to the team. In some instances specific training may be required. This step may involve researching previous lessons and their status of implementation, OE reports, doctrine, tactics and procedures. OEPB remains an ongoing activity throughout the planning phase and continues into the conduct phase as required.

• Mission analysis. This step involves reviewing the situation, analysing the initiating commander’s intent in order to identify OE mission and tasks, and identifying any significant issues for resolution.

• Course of action development. This step involves identifying form and approach options1, methods of collecting information and information sources. The conditions under which OE subject tasks are to be executed are established, along with the standards and performance measures that will apply. A work schedule with analysis and reporting milestones is drafted.

• Course of action analysis. The focus of this step is consultation and negotiation with stakeholders in order to identify the preferred COA. It may be helpful to circulate a draft OE plan for comment during this step.

1 See chapter 1, annex A for information on evaluation forms and approaches.


• Decision and execution. The preferred COA is one that has broad stakeholder agreement and achieves the mission within timings and resources. The draft plan is finalised and the initiating commander is back briefed. The final plan is published widely once the initiating authority has approved the plan and issued appropriate orders, directives and/or instructions.

Figure 5–2: The operational evaluation planning process—joint military appreciation process modified

5.7 Operational evaluation plans. OE plans are important documents because not only do they guide the conduct of an OE activity, but they also advise a wider audience of the purpose, method and end state sought. They are constructed so that the base document is readily digested by a wide audience and details regarding the conduct of the OE are contained in annexes. The essential elements of a plan are:

• forms and approaches to be used;

• collection plan detailing the strategy for the assembly of evidence (delineates the method(s) of data collection and reduction for each evaluation objective);

• tasking arrangements;


• key timings and reporting milestones;

• resource arrangements; and

• command and control.
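The collection plan element above, which maps each evaluation objective to collection methods and sources, can be sketched as a simple structure. The objective names, methods, sources and milestones below are hypothetical examples, not prescribed content:

```python
# Hypothetical collection plan: each OE objective maps to the collection
# methods and sources that will supply its evidence, plus a reporting milestone.
collection_plan = {
    "Assess C2 arrangements": {
        "methods": ["observation", "document review"],
        "sources": ["HQ watch floor", "operational logs"],
        "milestone": "D+7 interim report",
    },
    "Assess logistic sustainment": {
        "methods": ["interview", "document review"],
        "sources": ["J4 staff", "demand records"],
        "milestone": "D+14 interim report",
    },
}

def uncovered_objectives(plan):
    """Return objectives that have no collection method assigned (plan gaps)."""
    return [obj for obj, detail in plan.items() if not detail["methods"]]

gaps = uncovered_objectives(collection_plan)
```

A check like `uncovered_objectives` reflects the planning intent: every evaluation objective should be traceable to at least one method of data collection before the OE proceeds.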

5.8 OE planning will also be assisted by the Australian Joint Essential Tasks (ASJETS) list, which provides a generic list of joint tasks at the strategic, operational and tactical levels. Each task includes a description and a statement of the conditions, standards and performance measures that may apply to the task. Both ADFAADS and ASJETS are linked to the Australian Defence Force Activity Management System on the Defence Secret Network, and form part of the Preparedness Management Information System (PMIS)2.

5.9 A planning check list and plan outline are in annexes A and B respectively.

Collection plan 5.10

5.10 The success of an OE hinges on the execution of an effective collection plan. Appropriate methods of collection need to be employed, and data collectors may use a range of methods to ensure the data collected presents a balanced picture; it is always preferable to sample a wide range of data sources where possible. The data collection methods selected will depend on the nature (eg documents, processes and individuals), accessibility (eg permissions required and security considerations) and location of the data. Generally, data collected answers one or more of the following questions: what, who, when, where, how and why. The main methods of OE data collection are as follows:

• Observation. Observations provide valuable first hand evidence of actual occurrences and are often aided by check lists based on previously determined information requirements. Observers have a sound understanding of the activity they are observing and are able to discern unusual from typical events. Observations may be followed up by interviews.

• Interview. Interviews are conducted by evaluators with strong interpersonal skills who will not be intimidated or sidetracked during the interview. Interviews are planned and structured to achieve an information outcome, and are often more effective if used in conjunction with a questionnaire that sets the scene and allows the interviewee to consider, ahead of time, the issues likely to be raised face to face. Specialist expertise may be sought to assist with the design of surveys and questionnaires.

2 For more information on the PMIS see Australian Defence Doctrine Publication 00.2—Preparedness and Mobilisation (Provisional), chapter 2.

• Document review. Document reviews, particularly of operational documentation such as orders, instructions, logs and other records, provide evaluators with definitive data and a basis for comparing actual operational outcomes with intended outcomes. Preliminary analysis of such documents, along with doctrine, standard operating procedures, previous OE knowledge and higher command orders and instructions, will assist evaluators to understand the performance standards expected of the subject of the evaluation and to identify variations.

5.11 Internal or external evaluators. A question that often arises when planning an OE activity is whether evaluators should come from within the organisation that is the subject of the OE or from an external source (either external to the OE subject or external to Defence). The obvious criticism of internal evaluators is that they are likely to be biased and could be subjected to undue influence by their colleagues and superiors. Conversely, external evaluators may be seen as ‘all care and no responsibility’ or as lacking the subject matter expertise that internal personnel have acquired. There is no simple answer to this question, and no absolute right or wrong. Each option has its strengths and weaknesses, and both are viable provided certain basic checks and balances are in place. In deciding one way or the other it should be recognised that the task of evaluating involves a range of skills and knowledge relating to evaluation theory, methods and approaches. Individuals selected to undertake an OE task should be trained in these matters as a minimum prerequisite.

5.12 Given that evaluation involves a high degree of human interaction, selection of an OE team is an important step in ensuring the effectiveness of the OE task, and trained evaluators are always preferred. Other factors that may influence both internal and external OE options are as follows:

• Availability. Commanders who have ongoing or frequent requirements for OE should ensure they have enough trained OE staff to meet their needs, and identify suitable externally sourced evaluators.

• Subject matter expertise. Internal evaluators usually have good local expertise, although it may be relatively narrow. Good evaluators have a breadth as well as depth of knowledge and expertise.


• Information collection skills. Regardless of the source of evaluators, they should be trained in various information collection techniques. External evaluators may be more prepared to delve into an organisation to obtain information, and obtain more candid perspectives from interviewees.

• Flexibility. Regardless of their source, evaluators should be prepared to alter the course of an evaluation as required to achieve their mission.

• Perceived objectivity. It is a matter of human nature that evaluators, whether internal or external, will make personal value judgements about their environment. Objectivity in OE is achieved by ensuring that there are independent common sense checks on the validity of the information collected, and this applies equally to internal and external evaluators. Professional or experienced evaluators can reasonably be expected to be objective in their work, regardless of their personal views. Perceived objectivity may nonetheless be important in certain circumstances, and external evaluators will generally be presumed to be more impartial than internal evaluators. If in such cases internal evaluators must be used, their record of impartiality on other tasks will be a significant factor.

• Willingness to criticise. Evaluators should not refrain from asking the hard questions and being forthright in making recommendations. Internal evaluators may baulk at this level of frankness through fear of the professional consequences of doing so. External evaluators, including those external to Defence, are likely to be more robust in this respect, although they too may prefer to provide a favourable report with a view to obtaining further work. In both cases the evaluators should let the recorded evidence lead their work.

• Experience. Good OE requires evaluators who understand the purpose of their work and how to go about it in a professional manner. Good evaluation involves the use of sound and proven techniques and procedures, which need to be learnt and practised.

Annexes:
A. Operational evaluation planning check list
B. Operational evaluation plan outline


Annex A to ADDP 00.4 Chapter 5

OPERATIONAL EVALUATION PLANNING CHECK LIST A

Operational evaluation preparation of the battlespace 1

1. Where is the operational evaluation (OE) to be conducted?

2. What are the physical conditions?

3. What is the nature of the operation/exercise?

4. What previous OE has been conducted?

5. What doctrine, tactics and procedures are relevant to the task?

6. Who are the subject matter experts?

Mission analysis 7

7. What is the background to the task?

8. What is the initiating commander’s intent?

9. What are the OE mission and tasks?

10. What are the key objectives and commander’s information requirements?

11. What limitations apply?

12. What resources are available?

13. What are the critical issues and assumptions?

14. What is the team leader’s initial planning guidance?

Course of action development 15

15. What are the OE key form and approach options?

16. What are the data collection options?

17. What are the data sources?

18. Who are the stakeholders?

19. What conditions, standards and performance measures are to be used?


20. What are the key timings?

21. What are the course of action options (tasks, forms and approaches and resources)?

Course of action analysis 22

22. What course of action best meets the initiating commander’s requirements?

23. Which courses of action are endorsed by stakeholders?

Decision and execution 24

24. What are the overall strengths and weaknesses of the courses of action?

25. Which is the preferred course of action?


Annex B to ADDP 00.4 Chapter 5

OPERATIONAL EVALUATION PLAN OUTLINE B

Initiating commander’s intent 1

1. What is the initiating commander’s purpose, method and end state?

2. What are the critical issues affecting the operational evaluation (OE)?

Situation 3

3. What is the situation regarding the OE subject (operational, exercise or other)?

Mission statement 4

4. What is the OE mission?

5. What is the OE commander’s intent?

6. What is the general outline of the form and approach to be adopted?

Details 7

7. What is the scheme of manoeuvre and key locations?

8. What are the detailed OE tasks?

a. Who are the team members?

b. What activities are to be evaluated, including conditions, standards and performance measures?

c. What are the information collection requirements and methods?

d. What are the analysis tasks?

e. What is the reporting schedule?

9. What is the allocation of resources to tasks?

Coordination 10

10. What are the key timings and milestones for collecting evidence and providing progressive reports?

11. What are the after action analysis and reporting arrangements?


Administration 12

12. What are the movement arrangements?

13. What are the information system arrangements?

14. What are the personnel administrative support arrangements?

Command 15

15. What are the command and control arrangements within the team?

16. What command and control arrangements apply to the team?


ADDP 00.4 Chapter 6

CHAPTER 6

CONDUCT 6

Introduction 6.1

6.1 The conduct of OE involves implementing an OE plan, and is the third phase in the OE model (see figure 6–1). The OE conduct phase has two broad and overlapping parts: the first focuses on all the tasks associated with the collection and preliminary analysis of data, and the second involves detailed analysis and final reporting. These parts may be separated in time, and are distinctive because of the different skills required to complete their associated tasks. The major tasks include collecting and progressively reviewing data, conducting preliminary analysis, providing progressive reports, conducting detailed analysis, and reporting the findings. These tasks are conducted iteratively until all OE objectives have been achieved. The key outputs of OE conduct are lessons and recommendations.

Executive summary

• Conduct includes two broad and overlapping parts: the first part involves carefully orchestrated concurrent activity to collect and analyse data, and the second part involves detailed analysis and reporting.

• The conduct of operational evaluation (OE) requires a team effort by personnel with specialist data collection, data analysis and report writing skills.

• The OE conduct phase consists of the following activities:

– OE briefing;

– information collection;

– preliminary analysis and reports;

– out-briefing;

– detailed analysis; and

– final reporting.


Figure 6–1: Operational evaluation model—conduct

6.2 OE is conducted by activity participants and/or OE teams. The tasks and process described in this chapter apply to both situations, except where indicated otherwise.

Operational evaluation team 6.3

6.3 OE teams require appropriately skilled personnel, particularly in relation to data collection and analysis, and are structured according to the objectives of the evaluation. OE team(s) drawn from external sources may supplement an internal evaluation. OE teams follow the Australian Defence Force (ADF) OE system and utilise the Australian Defence Force Activity Analysis Database System (ADFAADS) in the same way that internal OE personnel do to raise issues, recommend resolutions and submit activity reports. An OE team is a task oriented group of subject matter experts assembled to actively collect specific information and apply professional military judgement and/or scientific/operations analysis in order to present observations, insights, issues, lessons and issue resolutions back to the sponsor.


6.4 Team tasks. OE teams assembled for the conduct phase include team leaders, data collectors, analysts and report writers. Team leaders supervise the evaluation, liaise with activity participants, ensure that objectives are being met and provide progress reports. Data collectors liaise with activity participants, collect data and pass it to analysts. Analysts interpret data against the OE objectives and standards, identify gaps in data and prepare progress summaries. Report writers write interim reports and draft final reports. Most of these activities can and should occur concurrently. In the ADF there is usually some overlap in personnel across these functions, although some separation of tasks and personnel provides a useful check and balance against subjectivity. For example, analysis can be undertaken or led by personnel who have not been directly involved in collecting data, in order to offset any tendency for preconceptions or bias to creep into the analysis. Personnel constraints will often dictate, however, that a team’s members undertake all evaluation tasks, eg data collection, analysis and report writing.

6.5 Tools. The conduct phase usually poses a significant information management challenge for OE teams and activity participants. Data will be collected in various forms, such as documents, observation reports and records of interviews. Data should be transferred to the information system supporting the OE as soon as possible, so the routines of OE teams and activity participants need to allow adequate time for progressive data entry. This allows commanders, OE staffs, OE team leaders and analysts to track and review the data collected in a timely manner and to generate follow up action as required. Advantage should be taken of available information systems to assist with creating an open and collaborative evaluation environment. ADFAADS, available on the Defence Secret Network, is the prime vehicle for OE teams and activity participants to publish their findings in the form of ADFAADS issues and resolutions1. ADFAADS is also available in a deployable form.

Conduct activities 6.6

6.6 The conduct phase is a period of intense concurrent activity requiring careful orchestration to ensure the OE mission is achieved. The overall concept for this activity is to observe the activity in a controlled manner to document issues that will inform lessons, conclusions and recommendations. The phase consists of the following activities:

• OE briefing;

• information collection;

1 See chapter 10—‘Australian Defence Force operational evaluation system’ for details of ADFAADS.


• preliminary analysis and reports;

• out-briefing;

• detailed analysis; and

• final reporting.

6.7 Where an OE task is relatively small or simple, preliminary reports may not be required and the out-briefing may be conducted as part of final reporting.

Operational evaluation briefing 6.8

6.8 Before an OE commences, a briefing is conducted or distributed to ensure that the activity participants and stakeholders are fully aware of the purpose, method and end state of the evaluation. Key aspects of the OE plan, such as the objectives, collection methods and key timings, are highlighted. The briefing seeks to establish an open and collaborative approach from the outset. Participating commanders are responsible for ensuring that their subordinates are briefed on the OE plan. When an OE team has been established, the OE team leader conducts the briefing and also introduces the OE team to activity participants.

Data collection 6.9

6.9 The success of an OE hinges on effective data collection. Data collectors may use a range of methods to ensure data presents a balanced picture. The data collection methods used will depend on the nature (eg documents, processes and individuals), accessibility (eg permissions required and security considerations) and location of the data. The main OE data collection methods include observations, interviews and document reviews (detailed in chapter 5—‘Planning’, paragraph 5.10). Generally, data collected answers one or more of the following: what, who, when, where, how and why.

6.10 OE teams and activity participants progressively collect and collate data in accordance with the collection plan, checking for accuracy, relevance and completeness as they proceed. Evaluators need to maintain a clear focus on the purpose and objectives of the OE. Data collection needs to be free of bias, and data needs to be corroborated and objectively gathered and analysed. Regular peer reviews of the data collected can be useful in developing a heightened sense of objectivity, allowing preconceptions to be challenged and an appropriate cessation point for the collection activity to be determined. Data collection is conducted in accordance with the collection plan detailed in chapter 5, paragraph 5.10.


Preliminary analysis and reports 6.11

6.11 The information assembled is progressively analysed to identify issues and trends and identify where additional collection is needed. Activity participants are provided feedback and preliminary reports are provided to the OE initiating authority and stakeholders. The preliminary analysis should present a clear and concise description of the problem and recommendations for detailed analysis leading to a solution.

Out-briefing 6.12

6.12 Once data collection and preliminary analysis have been completed, activity participants are provided with a briefing that highlights preliminary findings and issues that may be addressed immediately. Out-briefings are normally provided by the personnel who conducted the initial OE briefing. Care needs to be taken to qualify any preliminary findings that require further analysis and consultation before being accepted.

Detailed analysis 6.13

6.13 The data collected is subjected to detailed analysis that identifies the full implications and relevance of what has been recorded. Data is weighted for importance and reliability, and correlations between various sources are identified. Particular care is exercised when analysing atypical or unusual occurrences to ensure their significance is not overstated. The analysis draws specific and strongly supported conclusions in preference to broad and largely subjective conclusions. Making value judgements about the conclusions is a necessary step and means deciding whether the outcome is essentially good or bad. It may be appropriate to consult stakeholders at this point in the process and draw on their knowledge and experience when making such judgements. Consultation is also part of developing recommendations, as the full implications of proposed actions need to be understood, particularly when resource implications and policy changes are involved. Recommendations address only specific issues and describe specific remedial actions. Action officers need to be identified by appointment, after having been consulted to confirm the appropriateness of their nomination.
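The weighting step described above can be illustrated with a minimal sketch, assuming a simple numeric scheme in which each item of evidence carries an importance weight and a source-reliability weight. The scales and evidence items are hypothetical:

```python
# Hypothetical evidence items: each carries an importance weight (1-3)
# and a source-reliability weight (0.0-1.0).
evidence = [
    {"finding": "orders issued late",      "importance": 3, "reliability": 0.9},
    {"finding": "radio net congestion",    "importance": 2, "reliability": 0.7},
    {"finding": "single anecdotal report", "importance": 1, "reliability": 0.3},
]

def ranked(items):
    """Rank evidence by combined weight so that analysis addresses the
    strongest-supported issues first and does not overstate weak data."""
    return sorted(items, key=lambda e: e["importance"] * e["reliability"],
                  reverse=True)

top_finding = ranked(evidence)[0]["finding"]
```

Under this scheme a single anecdotal report scores low however striking it is, which mirrors the caution above about not overstating atypical or unusual occurrences.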

6.14 The value of an OE activity is added and realised through the analysis of data collected. Progressive analysis drives the data collection effort to ensure that information requirements have been satisfied and useful data has been gathered. Detailed analysis strives to deal with facts and compelling evidence rather than subjective opinions and broad generalisations. Analysis also recognises the impact that various environmental conditions have on the performance of military tasks when performance is assessed against standards. Analysis requirements need to be considered when specific information requirements and data collection methods are identified during the OE planning phase so that appropriate analysis resources are provided.

6.15 Analysis methods. OE analysts need to be skilled in assembling, displaying, aggregating and reducing data and drawing conclusions. Data will be collected in both qualitative and quantitative forms, and analysis will need to identify the relative merits of the data that is presented.

• Qualitative analysis. Qualitative analysis involves understanding human behaviour and the characteristics that govern behaviour and thinking, particularly how and why decisions are made. Qualitative data is human judgement obtained from questionnaires, interviews, surveys and observation records, and can usually be depicted in narrative or pictorial form. It is invariably subjective and therefore needs to be corroborated by data from other sources or rated accordingly.

• Quantitative analysis. Quantitative analysis involves the use of mathematical models, theories and hypotheses. The process of measurement is central to quantitative analysis as it establishes the relationship between empirical data and quantitative standards. Quantitative data is numerical or statistical data such as percentages, frequencies, ranges and medians that can be expressed in mathematical terms and tabulated or graphed.
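As a concrete instance of the quantitative measures mentioned above (frequencies, ranges and medians), the sketch below reduces a set of hypothetical post-activity survey ratings to summary statistics:

```python
from collections import Counter
from statistics import median

# Hypothetical 1-5 ratings collected via post-activity questionnaires.
ratings = [4, 5, 3, 4, 2, 4, 5, 3, 4, 1]

freq = Counter(ratings)                    # frequency of each rating value
rating_range = max(ratings) - min(ratings) # range of responses
med = median(ratings)                      # median rating
# Percentage of respondents rating the activity 4 or higher.
pct_favourable = 100 * sum(r >= 4 for r in ratings) / len(ratings)
```

Reducing raw responses to a frequency table, a range and a median gives analysts figures that can be tabulated or graphed and compared directly against a quantitative standard.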

6.16 Conclusion. The conclusion presents a clear and concise description of the problem, the methodology by which the conclusion was reached and recommendations for the resolution of the problem. The key products of the detailed analysis are the lesson(s) identified and the recommendations needed to address the lesson.

Final reporting 6.17

6.17 An OE activity is not complete until the final report and associated recommendations have been provided to the commander who initiated the activity. Final reports and records are to be clear, unambiguous and supported by evidence. Timely reporting is particularly critical in relation to the OE of operations, and interim reports are provided if appropriate. The final report responds directly to the initiating instruction. Over-classification, which might restrict wide distribution of reports, is to be avoided. OE reports may take many forms, depending on the requirements of the initiating authority. Oral reports are also provided to the initiating authority and key stakeholders, and the subject of the OE is normally offered a detailed oral debrief. Draft reports are circulated to stakeholders for comment prior to being submitted to the initiating authority. Final reports are usually in the format of an executive summary and are included on the relevant activity page in ADFAADS.


ADDP 00.4 Chapter 7

CHAPTER 7

IMPLEMENTATION, MONITORING AND REVIEWING 7

Introduction 7.1

7.1 The most significant phase of the OE model (see figure 7–1) involves implementing the recommendations of OE reports, monitoring their implementation and reviewing the overall impact in terms of capability and preparedness enhancements. The other stages of the model are redundant if not followed by robust implementation, monitoring and review activity. Implicit in this phase is the leveraging and application of knowledge arising from OE activity. This is a challenging phase in the OE system because it deals with immediate, mid-term and long-term matters that require persistent attention over time. The phase is also significant for identifying new OE requirements and thereby eliminating wasteful, piecemeal or ad hoc OE activity.

Executive summary

• Implementing, monitoring and reviewing recommendations from operational evaluation (OE) activities realises the immediate, mid-term and longer term OE benefits.

• Assisting the management of change is a principal characteristic of this phase.

• Following implementation, the OE subject is monitored to ensure that intended outcomes and consequences are realised, and unintended consequences are identified and managed. Effectiveness and efficiency are also monitored.

• Reviewing is in effect a macro evaluation activity that considers the full and wider impact of OE at the tactical, operational and strategic levels of command.


Figure 7–1: Operational evaluation model—implement, monitor and review

7.2 This phase serves three purposes:

• implementation of decisions requiring specific actions to be taken,

• contribution to knowledge capital, and

• application of knowledge.

7.3 One of the immediate challenges of this phase is to overcome the criticism that the same lessons seem to be repeatedly re-learnt. The persistence with which this criticism is made is perhaps one of the more obvious indicators of the status of the Australian Defence Force (ADF) as a learning organisation and the degree of acceptance of an OE culture.


7.4 Another perspective on implementation, monitoring and review is to consider the types of change that OE more commonly generates and the challenges associated with those changes. The types of change are as follows:

• Procedural. Procedural change involves amendment to procedures, and may have implications for standard operating procedures, training, doctrine and policies. Procedural changes are relatively quick and simple to institute and monitor.

• Structural. Structural change relates to organisations and their personnel and equipment. Structural change is generally more complex and may take time to institute.

• Behavioural. Behavioural change involves changing the way people behave and think, and may have doctrine and training implications. Behavioural change is complex and may need an extended and multi-faceted change program.

• Cultural. Cultural change involves change to matters such as values, norms and beliefs. Cultural change is the most difficult form of change to bring about because of its intangible nature. Cultural change is likely to require an extensive change program conducted over an extended period.

LESSON FROM ADF OPERATIONS IN THE MIDDLE EAST IN 2003

Operational health risk countermeasures. All deployed personnel were required to accept a range of health risk countermeasures. This included inoculation against Anthrax, which at that time was considered likely to be used as a biological weapon by Iraq. Procedural factors led to inoculations becoming a contentious issue during deployment.

This issue highlighted the tension between individuals’ concerns over their personal risks and the collective risk to teams posed by some individuals not having the full range of countermeasures.

Defence has revised its procedures to ensure personnel are fully informed of any potential health risks that inoculations could pose, of the operational requirements for inoculations, and of health risk countermeasures.


Temporal aspects

7.5 The multi-faceted nature of implementation, monitoring and review is illustrated by the temporal framework that can usually be applied to OE outcomes. The framework includes three levels:

• immediate outcomes are relatively tactical in nature, usually have singular implications and are generally implemented by the subject of OE;

• mid-term outcomes are generally operational in nature and usually involve some degree of aggregation of information; and

• longer term outcomes are generally strategic in nature, have policy implications and deal with broad aggregations of information.

Implementation

7.6 Implementation addresses the specific recommendations arising from OE activities, and the broader application of knowledge to effect change. Implementation does not have to wait until an OE activity is concluded. Commanders should be prepared to seize opportunities to remedy problems within their control, implement changes and immediately assess their impact. The implementation of specific recommendations needs to proceed in a timely manner, as there is often benefit in taking advantage of the enthusiasm for improvement and level of interest that an operation or exercise generates, before it is overtaken by other commitments or before key personnel move on.

7.7 OE staff have an ongoing responsibility to track the progress of their recommendations, where possible, to ensure they are implemented in the way that was intended. Their first concern is to ensure that recommendations are expressed clearly and comprehensively, and indicate the intended outcomes. Recommendations that make reference to the fundamental inputs to capability (see chapter 1—‘Introduction to operational evaluation’, paragraph 1.27) and/or the joint functions (command and control, intelligence, offensive action, movement and manoeuvre, force protection and sustainment) can be implemented and tracked in a coherent manner. More complex implementation may involve several actions, such as changes to training and doctrine, which need to be coordinated. OE staff can ensure that this need for coordination is recognised by action officers.

Monitoring

7.8 Monitoring the implementation of OE recommendations is in effect an evaluation activity in its own right. It is primarily the responsibility of commanders, with specialist assistance provided by OE staff. It involves firstly checking that recommendations have been actioned, and secondly monitoring performance after recommendations have been implemented to confirm that the change is having the desired outcome and that unintended consequences, if any, are not having a detrimental effect. The ADF Activity Analysis Database System provides a semi-automated means of confirming that recommendations have been actioned, and gives visibility of their status. Monitoring occurs in two stages and involves the tasks illustrated in figure 7–2.

7.9 This activity is rarely conducted as a discrete OE activity in the ADF. Rather, it is addressed as part of normal OE tasks. As part of the planning process, OE staff identify previous recommendations and actions to ensure that collection will answer questions such as the following:

• Have implemented recommendations had the desired effect?

• Is the issue completely resolved?

• Has the lesson been universally applied?

• Was the original analysis correct?

• Is there follow-up action required that may lead to new evaluation requirements?

Figure 7–2: Monitoring operational evaluation—two stages


Reviewing

7.10 Reviewing OE is best thought of as a macro level of evaluation that considers the full and wider impact of OE at the tactical, operational and strategic levels of command. At these levels of review, the effect of OE on entire programs may be scrutinised from broad perspectives to assess the overall impact of change. Reviews avoid being distracted by the details of specific recommendations, which are the concern of implementation and monitoring. Reviews should consider broad questions such as the following:

• Is OE moving the ADF in the right direction at the right speed and achieving its goals?

• Is OE enhancing capability and preparedness?

• Is OE contributing to attainment of the knowledge edge?

• Is the collective training program effective?

• Is individual training effective?

• Is doctrine effective?

• Is knowledge being applied?

• Have policy changes generated by OE been effective?

• Is the ADF growing as a learning organisation?

• Is an OE culture growing?

7.11 Review activities should not overlook the need to periodically review the OE process and program itself. This inward-focused review of OE processes and procedures is known as meta-evaluation, and is discussed in chapter 9—‘Meta–evaluation’.


CHAPTER 8

OPERATIONAL EVALUATION KNOWLEDGE DOMAIN

Introduction

8.1 The OE knowledge domain is that domain in which OE knowledge is transformed into applied knowledge. OE knowledge capital is information that is gathered from activities in various forms such as lessons and reports. Much of the OE knowledge obtained is applied almost immediately to address specific issues that have arisen. Leveraging knowledge is the advantage gained by being in a position to use previous lessons shared across headquarters or organisational boundaries for continued application to the enhancement of capability and preparedness.

8.2 Chapter 7—‘Implementation, monitoring and reviewing’ indicates that the implementation process addresses the specific recommendations arising from OE activities, and the broader application of knowledge to preparedness and capability enhancement to effect change. Implementation is in effect the application of OE knowledge, and it is this implementation activity that is central to the utility of the OE knowledge domain. The OE knowledge domain is not a passive entity; it incorporates the active involvement of people who research and analyse OE knowledge from their various perspectives and apply that knowledge. The utility of the OE knowledge domain is dependent on the quality and quantity of data collected. See chapter 6—‘Conduct’ for more information on data collection.

Executive summary

• The operational evaluation (OE) knowledge domain:

– is a multi-faceted domain that gathers, disseminates and applies OE knowledge;

– adds value to OE knowledge over time through iterative analysis at all levels of command;

– is an active domain that consists of many components, such as people, processes, content and technology; and

– is a component of the greater Defence knowledge management strategy and makes a fundamental contribution to operations.


8.3 The OE knowledge domain broadly operates in three temporal dimensions as follows:

• the immediate, when the analysis and application of lessons is for immediate and usually local benefit;

• the mid-term, when benefits that accrue from aggregating lessons with a common theme or context from several activities are realised; and

• the longer term, when OE helps to gain efficiencies, resolve capability gaps, improve policies and inform strategic planning and analysis.

Building knowledge

8.4 One of the key activities within the OE knowledge domain is building on knowledge capital obtained from Australian Defence Force (ADF) activities through analysis at higher levels, culminating in the application of OE knowledge to preparedness and capability enhancement. The longer term watch over OE knowledge and analysis requires commanders to assign members of their staff to this task, rather than leaving it to chance. These higher levels of analysis may involve:

• categorising, aggregating and monitoring issues over time to identify significant trends; and

• flagging recurrent or conflicting issues that have wider implications for a theatre of operations, a force element group, a single-Service or across the ADF.

Operational evaluation knowledge components

8.5 For the OE knowledge domain to be useful and effective, people must be engaged in it and exploit it, and it must be managed so that OE knowledge is actively transferred and applied. The OE knowledge domain consists of the following components:

• A framework of organisational outcomes that provides a common purpose behind the fostering of knowledge, capability improvement objectives, innovation imperatives and accountability.

• A culture of wanting to learn, evaluating performance, applying lessons and using doctrine. The ADF has an emerging OE culture that needs to grow, primarily through training and education programs.

• Incentives that cause the OE knowledge domain to be used routinely. While the culture grows, incentives in the form of directives, orders and instructions are needed to promote use of OE knowledge.


• Essential elements, including people, processes, content and technology. The active involvement of people in leveraging knowledge invigorates the domain. Processes, content and technology provide the inanimate features of the domain.

• Enablers, including information systems and their associated processes. OE has a comprehensive enabler in place in the form of the Australian Defence Force Activity Analysis Database System (ADFAADS). It is complemented by single-Service and coalition evaluation databases, other information systems such as the Defence Secret Network and Defence Restricted Network, evaluation help desks, related information systems such as the Preparedness Management Information System, and other Defence-wide knowledge management systems that regulate knowledge flows.

• Networks and communities that take a special interest in OE knowledge and promote its use. The network of ADFAADS managers provides the basis of an OE network, and they are responsible for growing an OE community.

• Champions and advocates are needed throughout the chain of command to lead the ADF in participating in the OE knowledge domain. The Australian Defence Force Warfare Centre provides advocacy through its OE staff.

LESSON FROM ADF OPERATIONS IN THE MIDDLE EAST IN 2003

Interoperability. The benefits of continuing high levels of interoperability with our friends and allies, especially the United States, are a major factor in successful coalition operations.

Defence will continue to develop international relationships and its ability to operate with allies and partners. A specific area for development includes information sharing on command and control and information systems. Development of personal networks will be promoted through exchange and liaison officer programs to facilitate intelligence sharing and to allow speedy resolution of coalition operation issues. Operational deployment of Australian officers already on exchange with overseas forces presented some legal and administrative challenges which are also being addressed.


Characteristics

8.6 An effective OE knowledge domain will enable the ADF to achieve the knowledge edge, and may also lead to greater innovation, higher levels of application of lessons, and wider access to knowledge across the ADF. ADF OE knowledge is also extended into the international OE knowledge domain in the interests of enhancing interoperability. This is achieved through sharing OE knowledge during combined activities and through various interoperability fora.

8.7 The collaborative environment of the OE knowledge domain facilitates the access and sharing of knowledge across organisational boundaries. The ability to access and use knowledge in a practical and meaningful manner ensures the continued learning of the organisation and ideally eliminates the need to re-learn lessons. Effective knowledge management enables Defence to better use its time and efforts to achieve operational goals.


CHAPTER 9

META–EVALUATION

Introduction

9.1 Meta-evaluation literally means the evaluation of OE itself. The purpose of meta-evaluation is to ensure that OE best practice is maintained, and that OE methods, approaches and conduct continue to be effective. Meta-evaluation can take the form of formal evaluations conducted in accordance with the OE procedures described in chapters 4 to 7 of this publication, or be conducted informally by evaluators (the latter occurs routinely). Meta-evaluation involves evaluating such aspects as OE policy, doctrine, methodology, practice, effectiveness and efficiency. It can focus on the effectiveness of a particular OE activity or assess the ongoing effectiveness of the OE program. The requirement for meta-evaluation recognises that evaluation is vulnerable to faults such as bias, technical errors in data collection and analysis, misuse and misunderstanding. It also recognises that evaluation as a discipline is actively debated and continues to evolve.

Executive summary

• Meta-evaluation ensures that operational evaluation (OE) methods, approaches and conduct continue to be effective.

• Meta-evaluation can focus on single OE activities or the entire OE program.

• The OE standards of accuracy, feasibility, accountability and utility provide a basis for conducting meta-evaluations.

Forms and purpose

9.2 Meta-evaluation can take several forms and be conducted at all levels, depending on its purpose. The common reasons for conducting meta-evaluation relate back to the application of the OE standards of accuracy, feasibility, accountability and utility (see chapter 3—‘Operational evaluation framework’) as follows:

• Accuracy.

– Is the assembled evidence balanced and comprehensive, and does it provide a true picture of the OE subject?

– Do the findings reflect the evidence?

– Was the conduct objective and fair?

– Were the conditions, standards and performance measures relevant to the actual circumstances?

• Feasibility.

– Was the task correctly described in the initiating instruction?

– Were appropriate resources committed to the task?

– Was the task cost-effective?

– Were the evaluators appropriately trained and experienced for the task?

– Were the OE subject and other stakeholders appropriately engaged?

– Was the timing of the evaluation appropriate?

• Accountability.

– Were various responsibilities for the evaluation specified and accepted?

– Were decisions made by appropriate officers?

• Utility.

– Are the findings accepted by stakeholders as being reliable?

– Have evaluation products (lessons and recommendations) been used?

– Have the findings improved preparedness and enhanced capability?

– Are there any detrimental consequences?

9.3 Meta-evaluation generally takes one of the following forms:

• An internal review by evaluators at the conclusion of an OE activity. This is routine practice for all OE teams and activity participants who have conducted internal OE, and is the culmination of quality checks conducted during the OE activity. Its main purpose is for evaluators to practise what they preach and take steps to improve their own performance.


• Formal meta-evaluation, usually commissioned by the initiating authority, and conducted at the conclusion of an activity by other evaluators. This meta-evaluation is usually conducted as a quality control measure to confirm the suitability and quality of the evaluation form and approach followed for that activity.

• Formal meta-evaluation of the OE program. This level of meta-evaluation is conducted to review the OE program in its entirety, including policy, doctrine, organisation, practices, utility, reliability, resources and impact on other programs (particularly preparedness and capability enhancement). Ultimately this level of meta-evaluation seeks to maintain best practice in the OE program, and ensure it remains relevant to the Australian Defence Force’s performance objectives.

9.4 A guide to conducting meta-evaluation is in annex A.

Annex:
A. Meta–evaluation guide


Annex A to ADDP 00.4 Chapter 9

META–EVALUATION GUIDE

Initiation and planning

1. Evaluation management:

a. Did the initiating instruction for the operational evaluation (OE) identify and address the purpose of the OE and the relevant key issues?

b. Did the initiating instruction reflect higher level guidance?

c. Were the stakeholders involved in or affected by the OE properly identified?

d. Were the subject being evaluated, and the context within which the subject operates, clearly defined?

2. The operational evaluation plan:

a. Were the risks clearly identified?

b. Was the OE team or other personnel resources appropriate for the task?

c. Was the OE management structure appropriate?

d. Was the OE work schedule feasible?

e. Were sufficient resources allocated to the OE tasks?

f. Were the methods of data collection and samples appropriate for gathering the information needed?

Conduct

3. Data collection and analysis:

a. Were the collected information and the evidence produced:

(1) accurate (obtained through sufficiently precise and consistent measurement procedures);

(2) reliable;


(3) valid (actually measure the characteristic or quality); and

(4) comprehensive and relevant (cover all relevant aspects of the key evaluation issues)?

b. Was the data appropriately and rigorously analysed to answer OE questions and do justice to the data?

c. Did the data collection and analysis give due attention to the interests of all OE subject groups and individuals?

d. Were the data collection processes accurately conducted?

4. Monitoring and accountability:

a. Was the implementation of the OE methodology monitored sufficiently to assess the validity of the findings?

b. Were expenditures associated with the OE accounted for and appropriate?

c. Were conflicts of interest dealt with openly and honestly?

Reporting

5. Evaluation product identification and methodology:

a. Did the report include an adequate description of the OE products and what was being evaluated?

b. Did the report include an adequate description of the OE methodology and its implementation, such that readers could assess its appropriateness?

c. Did the OE report acknowledge sources of information, except where confidentiality was assured or assumed?

6. Results and conclusions:

a. Was reporting of results fair and impartial?

b. Did the report make clear the links between findings and conclusions?

c. Were any qualifications in relation to the reliability of the findings included in the report?


7. Recommendations:

a. Were the recommendations credible and was their link to the results and conclusions clear? Did recommendations flow logically from the findings presented?

b. Were the assumptions used in the formulation of recommendations detailed to allow assessment of their appropriateness and feasibility?

c. Were the consequences of implementing (and not implementing) the recommendations made clear?

8. Reporting processes:

a. Were evaluation reports provided:

(1) at appropriate stages during conduct of the OE,

(2) to appropriate audiences,

(3) in formats and styles that meet the needs of various audiences, and

(4) in a timely manner to suit the needs of decision makers?

General

9. In terms of the OE standards, was the OE:

a. Accurate:

(1) Was the assembled evidence balanced and comprehensive, and did it provide a true picture of the OE subject?

(2) Did the findings reflect the evidence?

(3) Was the conduct objective and fair?

(4) Were the conditions, standards and performance measures relevant to the actual circumstances?

b. Feasible:

(1) Was the task correctly described in the initiating instruction?

(2) Were appropriate resources committed to the task?


(3) Was the task cost-effective?

(4) Were the evaluators appropriately trained and experienced for the task?

(5) Were the OE subject and other stakeholders appropriately engaged?

(6) Was the timing of the evaluation appropriate?

c. Accountable:

(1) Were various responsibilities for the evaluation specified and accepted?

(2) Were decisions made by appropriate officers?

d. Useful:

(1) Were the findings accepted by stakeholders as being reliable?

(2) Have evaluation products (lessons and recommendations) been used?

(3) Have the findings improved preparedness and enhanced capability?

(4) Are there any detrimental consequences?


CHAPTER 10

AUSTRALIAN DEFENCE FORCE OPERATIONAL EVALUATION SYSTEM

Introduction

10.1 The purpose of this chapter is to describe the current ADF OE system1. The system is largely embodied in ADFAADS, a semi-automated staffing system and Lotus Notes database available on the Defence Secret Network (DSN). ADFAADS is supported by a network of OE managers and practitioners, and by associated procedures and processes. The strength of ADFAADS is that its semi-automated features minimise the resources required to gather, analyse and disseminate OE data and knowledge. The system has been designed to support all forms of OE, including internal evaluation conducted by activity participants and OE conducted by OE teams drawn from participating FE or external sources.

Executive summary

• The Australian Defence Force (ADF) Activity Analysis Database System (ADFAADS) is the core of the current operational evaluation (OE) system. ADFAADS comprises a semi-automated staff process and a database.

• All force elements (FE) are required to evaluate all joint ADF activities in which they participate, using integral resources. OE by OE teams drawn from sources external to the participating FE may also be directed.

• The ADF OE system provides OE knowledge and a means of managing and disseminating that knowledge to enhance ADF capability and preparedness.

1 Principal references for the current OE system include: Australian Defence Force Publication 7.0.3—Exercise Planning and Conduct, chapter 6—‘Evaluation’, and Defence Instruction (General) OPS 41–1—Australian Defence Force Activity Analysis Database System. Detailed instructions on ADFAADS procedures are available from the ADFAADS database and via the Australian Defence Force Warfare Centre (ADFWC) website on the Defence Restricted Network (see http://intranet.defence.gov.au/VCDFweb/sites/adfwc/).


System description

10.2 The current ADF OE system relies heavily on OE conducted by the FE participating in various activities, augmented on occasions by externally sourced OE teams. The system involves the following activities and steps:

• OE planning is initiated in operational warning orders and exercise concept documents. OE objectives are usually identified in conjunction with detailed activity planning. Activity tasks (for example, operational tasks) and/or objectives (for example, training objectives) are expressed in terms of Australian Joint Essential Tasks (ASJETS) by either activity planners or OE planners to establish a framework for evaluating the activity. ASJETS conditions, standards and performance measures provide the basis for OE data collection and analysis. Activity and OE objectives are recorded on the activity pages of the ADF Activity Management System, which, like ADFAADS, is resident on the DSN, and are automatically reproduced on the ADFAADS database. These linked activity pages provide repositories for all key information related to an ADF activity.

• Further detailed OE planning is undertaken within FE and by external OE teams if directed. OE planning for major joint and combined activities is usually undertaken by staff from the ADFWC Evaluation Section. OE plans are lodged on the relevant activity page on the ADFAADS database.

• Issues relating to any aspect of an activity are recorded during activities by activity participants and OE teams (when established). Early visibility of issues arising from an activity via the ADFAADS database enables timely action to be taken if appropriate.

• Issues are resolved and actions taken as required during and after an activity by designated action officers identified on the ADFAADS database. DSN users can track issue resolution progress via the database.

• Activity reports (commonly known as post activity reports) are progressively developed on the ADFAADS database by activity participants and OE teams. This occurs as part of the issue recording and resolution process. Activity reports are finalised as required during and/or after an activity. Reports follow a standard format that consolidates issues and resolutions in an executive summary generated within ADFAADS.


• ADFAADS automatically generates messages that track actions to resolve issues. Automated messages also prompt action officers when required. This environment is known as the ADFAADS virtual office and is the core of the OE network.

• ADFAADS provides commanders and managers with a range of report options to monitor OE activity, and provides the ADF with an OE knowledge resource for research and analysis purposes.

• Further analysis of OE knowledge is conducted by commanders and staff as required. Specific information can be obtained by searching the ADFAADS database using a range of search options based on key words. Local procedures dictate the purpose, nature and frequency of this analysis.

10.3 Internal operational evaluation. OE conducted by FE using their own integral resources is known as internal OE. The ADF OE system relies on all FE conducting internal OE of all joint activities in which they participate. FE commanders may direct individuals, functional groups or specially tasked OE individuals and teams to contribute to the OE effort. FE utilise ADFAADS as the primary means of recording, analysing and reviewing OE data. FE commanders are responsible for ensuring that members of their FE, and organisations supporting their involvement in an activity, utilise the ADFAADS database to raise issue forms and issue resolution forms and to consolidate this information into an online post activity report.

10.4 External operational evaluation. Internal evaluation of activities may be augmented by OE conducted by an OE team drawn from external sources and directed by a superior commander. The external OE team may be drawn from ADF, Australian Public Service or civilian sources. The requirement for an external OE team to be formed will usually be contained in operational warning orders and exercise scoping documents. The resources currently available within the ADF to stand up an external OE team are limited; the ADFWC evaluation staff is the only standing joint OE capability. External OE teams are usually formed for operations and major joint and combined exercises. External OE teams follow the ADF OE system and utilise the ADFAADS database in the same way as internal OE personnel, raising issues, recommending resolutions and submitting activity reports.


ADDP 00.4 Chapter 10

Australian Defence Force Activity Analysis Database System overview

10.5 ADFAADS consists of two components: a semi-automated staffing system that generates a range of messages, and a database that contains OE data. The database comprises three basic work forms (an activity form, an issue form and an issue resolution form) and an activity report format. The purpose of the forms is as follows:

• Activity forms. Activity forms provide an entry point for each activity, record basic information about the activity and provide links to key documents. Activity forms act as a covering document for issue forms and issue resolution forms.

• Issue forms. Issue forms identify perceived strengths or weaknesses and record observations, analysis and lessons learnt. Issue forms are in effect a staff paper on each issue raised. Issue forms exist in draft and global environments. Draft issue forms are only visible to the local unit and can therefore only be edited by local personnel. Release of the issue to the global environment where all ADFAADS users can see the issue is controlled by locally authorised releasing officers. At this stage the issue is available for comment by any ADFAADS user.

• Issue resolution forms. Once the global issue has been commented upon, an issue resolution form is raised by the originating unit to record actions recommended to address the issue, and nominate an action organisation and appointment that, in the opinion of the originator, are best placed to take the necessary action.

10.6 Activity reports. In addition to the three forms, ADFAADS also generates a draft activity report for each major FE participating in an activity in the form of an executive summary linked to issue forms and issue resolution forms. The three work forms and executive summaries have been designed to capture the knowledge and experience gained by individuals and organisations when participating in ADF activities for the benefit of the wider ADF. Collectively this body of information constitutes the ADF’s OE knowledge, dating back to at least February 1998 when ADFAADS was introduced to service.
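The relationship between the activity form, its issue and issue resolution forms, and the generated activity report can be sketched informally. The following Python sketch is an illustration only; ADFAADS is a Lotus Notes application, and the class and field names here are assumptions, not system terminology:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical model of the ADFAADS work forms (names are illustrative).
@dataclass
class IssueResolution:
    recommended_action: str       # what the originator wants done
    action_organisation: str      # proposed organisation to resolve the issue
    complete: bool = False

@dataclass
class Issue:
    title: str
    summary: str                  # concise synopsis of the issue
    released_to_global: bool = False   # draft until released by a releasing officer
    resolution: Optional[IssueResolution] = None

@dataclass
class Activity:
    name: str
    key_documents: List[str] = field(default_factory=list)
    issues: List[Issue] = field(default_factory=list)

    def draft_activity_report(self) -> dict:
        """Executive summary skeleton linked to the activity's global issues."""
        return {
            "activity": self.name,
            "issues": [i.title for i in self.issues if i.released_to_global],
        }

activity = Activity("Example joint exercise")
activity.issues.append(
    Issue("Communications interoperability", "HF links unreliable",
          released_to_global=True))
print(activity.draft_activity_report())
```

The activity form acts as the covering document; only issues released to the global environment appear in the generated report, mirroring the workflow described above.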

10.7 The key features of ADFAADS are as follows:

• It provides a transparent and collaborative environment for identifying and resolving issues.

• The status of issues released to the global environment can be monitored by all DSN users.


• The application of evaluation knowledge to other programs can be centrally recorded.

• ADFAADS provides a means of managing OE knowledge. Research is facilitated by the inbuilt full text search and report features. The search feature includes all ADFAADS information and embedded documents.

• ADFAADS is integrated with the Preparedness Management Information System.

• ADFAADS can be transferred to local area networks in situations when the DSN cannot be deployed, with new information subsequently being transferred back to the main database.

• ADFAADS development is a collaborative process utilising its ‘Program Change Request’ feature. Any user can suggest a software change to the system.

10.8 An ADFAADS user guide providing more detailed information on how to use ADFAADS is in annex A.

Australian Defence Force Activity Analysis Database System management and training

10.9 The ADFWC evaluation staff is responsible for overall management of ADFAADS, supported by a network of ADFAADS managers representing the major Defence commands and organisations. An OE management group, comprising all ADFAADS managers, is responsible for overseeing ADFAADS policy, training, utility, operations and software developments.

10.10 The Joint Systems Support Agency, which is responsible for providing the DSN, is responsible for formal ADFAADS training. Self-paced training is also available through the ADFAADS database on the DSN or via the ADFWC website on the Defence Restricted Network.

Annex:
A. Australian Defence Force activity analysis database system user guide


Annex A to ADDP 00.4 Chapter 10

AUSTRALIAN DEFENCE FORCE ACTIVITY ANALYSIS DATABASE SYSTEM USER GUIDE

General

1. The Australian Defence Force (ADF) Activity Analysis Database System (ADFAADS) is both a system and a database that provides an electronic means of recording and staffing issues arising from ADF activities. ADFAADS consists of two components: a semi-automated staffing system that generates a range of messages, and a database that contains operational evaluation (OE) knowledge.

2. The purpose of this guide is to assist commanders, ADFAADS managers and general users to take advantage of the system and the database including its full range of features. The guide covers the following:

a. issue form content;

b. issue resolution form content;

c. responsibilities of reviewing officers;

d. responsibilities of releasing officers;

e. responsibilities of action officers;

f. executive summary (ES)/post activity reports;

g. duties of lead planning organisations;

h. duties of OE teams;

i. search facilities;

j. report generator facilities;

k. status of actions and reporting/action agents;

l. statistical analysis;

m. mobile ADFAADS;

n. integration of Australian Joint Essential Tasks (ASJETS);

o. re-submission of resolutions;

p. duties of ADFAADS managers; and

q. responsibilities of commanders.


Issue form

3. The issue form is essentially the argument that identifies a strength, weakness or problem. The issue form should fully analyse the issue, record observations and comments, and identify the lesson learnt by the author. Together with the recommended action on the issue resolution form, the issue form can be compared to the traditional staff paper. As with other proposals, the eventual outcome will depend on a sound, well researched, comprehensive and logical argument such that the action authority is convinced that the issue warrants action. Some specific points on the issue form are as follows:

a. The summary of issue section should only be a couple of lines of text that provide a concise synopsis of the issue.

b. The amount of detail in the analysis section of the issue form should be commensurate with the complexity of the issue. A weak or fallacious justification on the issue form is likely to result in the action authority not implementing the proposal.

Issue resolution form

4. The issue resolution form is the heart of the system, specifying what the originating commander wants done about the issue and a proposed action organisation. It is deliberately brief so as to channel the drafter's thoughts into a concise but complete statement of the action to be taken. Provided it is backed by a satisfactory analysis within the issue form, the action organisation should have little doubt as to what is being asked and why. Some specific points on the issue resolution form are as follows:

a. The recommended action is designed to funnel the drafter’s thoughts into defining exactly what action is being proposed (eg conduct a theatre-wide evaluation of command arrangements and implement changes necessary to enhance operational efficiency).

b. The proposed action organisation is the originating organisation's assessment of the most suitable organisation within the ADF to resolve the issue. The proposed action officer is a suggested appointment within the proposed action organisation to accept the issue for action. Action officers can forward the issue resolution form up or down the chain of command to a more appropriate action officer if necessary.

c. Having completed a new issue form and issue resolution form, the drafter forwards them to a local reviewing officer for further editing, comment, analysis or release to global.
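Read together, the drafting, review and release steps above describe a simple lifecycle for an issue: drafted locally, reviewed, released to global, resolved and flagged complete. The sketch below models that lifecycle as a small state machine; the state names and allowed transitions are an informal reading of the process, not ADFAADS terminology:

```python
# Informal model of the issue-form lifecycle (state names are assumptions).
ALLOWED = {
    "draft": {"reviewed", "deleted"},            # reviewing officer edits or deletes
    "reviewed": {"global", "draft", "deleted"},  # releasing officer releases to global
    "global": {"resolution_raised"},             # originating unit raises a resolution
    "resolution_raised": {"complete"},           # action officer resolves the issue
}

def transition(state: str, new_state: str) -> str:
    """Move an issue to a new state, enforcing the staffing order."""
    if new_state not in ALLOWED.get(state, set()):
        raise ValueError(f"cannot move issue from {state!r} to {new_state!r}")
    return new_state

state = "draft"
for step in ("reviewed", "global", "resolution_raised", "complete"):
    state = transition(state, step)
print(state)
```

A draft, for instance, cannot be flagged complete without first being released to global, which is the point of the staffing discipline described above.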


Responsibilities of the reviewing officer

5. The reviewing officer, as part of the issue staffing process, is asked to add value to the issue by editing it and adding their differing or wider perspective. This step can be compared to the normal staffing of a document, and it is therefore important to obtain input from all appropriate staff within the originating organisation. The reviewing officer (who may be a peer, subordinate or superior of the originator) must take one of the following actions:

a. edit the issue form and forward to another reviewing or releasing officer;

b. edit the issue form and release it to global if they are a releasing officer (this equates to releasing other official correspondence, such as a minute or message, outside an organisation); or

c. delete the issue form if they decide it is unwarranted and provide the originator with appropriate feedback.

6. There will be instances where the reviewing officer decides to cancel a draft issue. In such cases it is important that feedback is provided to the drafter by use of the reasons for deleting issue function. Apart from being common courtesy, the comments will explain the decision, contribute to team building, and encourage further contributions.

Responsibilities of the releasing officer

7. It is important that each organisation identify in its standard operating procedures (SOP) for the use of ADFAADS which appointments may release ADFAADS issues to the global domain. Releasing officers should be selected in the same manner as those officers nominated to release correspondence or messages. Releasing officers release an issue and its resolutions to global on behalf of their commander, therefore they are responsible for ensuring the issue has been properly staffed, is a legitimate issue that accords with the commander’s view, and is within their authority to release.

Responsibilities of the action officer

8. The action organisation ADFAADS manager will receive an incoming resolution which will include a suggested position within their organisation to take the issue for action (the proposed action officer). Although the resolution is normally passed direct to the action officer, each organisation's ADFAADS SOP should establish on what occasions the ADFAADS manager forwards the issue directly to the suggested action officer, a departmental head or the commander. On eventual receipt of the resolution the action officer should process it in the same manner as any other written correspondence. This includes:

a. fully staffing the proposal,

b. determining the relevance of the proposal, and

c. taking action to resolve the issue.

9. Within the issue resolution form, action officers should take the following action:

a. Maintain a record of decisions/actions taken within the record of formal actions taken section. This should include an explanation in those circumstances when the proposal is not agreed and no further action is to be taken.

b. Forward the issue resolution form to other appropriate staff officers (via the send to next action officer button) for further staffing if required.

c. Forward the issue resolution form to an appropriate releasing officer (via the send to next action officer button) to flag resolution complete.

10. A quandary at this stage may be whether to flag resolution complete when action has been initiated, or to wait until the action has been completed. This is ultimately the releasing officer's judgment call. As a general rule, if processes have been set in motion such that there is high confidence the change will happen without any further input, the resolution should be flagged as complete. An additional option is to utilise the re-submit function (described in paragraph 30), whereby resolutions can be flagged complete but set for re-submission at a later date for checking or review.

Post activity reports, executive summaries, issues and resolutions

11. ADFAADS removes the need for the previous procedure of producing a lengthy written post activity report (PAR), which had limited distribution and no measurable means of implementing resolutions. The ADFAADS PAR comprises two parts:

a. a series of individual global issues and resolutions raised against the particular activity, and

b. an ES attached to the activity page.


12. Each ADF activity lead planning organisation is responsible for nominating the activity participants required to raise an ES. These details are to be found on the corresponding activity page on the ADF Activity Management System (ADFAMS). The sum of all the ES and the associated issues/resolutions becomes the collective PAR.

13. Executive summary format. Within ADFAADS the production of an ES is semi-automated and pre-formatted by selecting the relevant activity and launching an ES template via the new ES icon (this action requires editor or higher access). This feature replaces the previous Word document template but provides the same basic information: the organisation's perspective on an activity, key issues and recommendations for their resolution, and a short conclusion. The ES provides a quick overview, supported by document links to key issues and resolutions where further details are required. There is no requirement to list background information such as references, activity aim and assigned forces, as this information is already provided on the ADFAADS activity page.

14. An ES template example is provided in the ADFAADS help section. Guidance is provided within the template by clicking in each field. The following notes summarise the content of the ES:

a. Organisation overview. Provides an overview of the headquarters/unit/organisation's participation in the activity, including a brief assessment of the achievement or otherwise of activity aims and objectives. It identifies the major strengths and weaknesses of the activity as they relate to the organisation.

b. Summary of global issues and resolutions. This function lists the title and summary of key issues and their resolutions, with a document link to the detailed supporting documents. This is obtained by selecting the key global issues and resolutions (the issues include only those raised by the originating organisation that have been released to global).

c. Exercise planning and administration (exercises only). This function comments on the effectiveness or otherwise of exercise planning and administration. Significant issues should be raised as separate, summarised ADFAADS issues. These may include factors affecting realism, control methods, suitability of exercise areas, time constraints, adequacy of planning documents, adherence to combined and joint doctrine, and out-of-exercise support.


d. Conclusion. The conclusion summarises any findings, outcomes or lessons learned that may be relevant to future activities.

Duties of lead planning organisation

15. The lead planning organisation for an activity must ensure an activity page is raised and maintained on ADFAMS. ADFAMS automatically generates an ADFAADS activity page for activities listed on the Program of Major Service Activities; however, ADFAADS managers usually need to add additional information. For non-ADFAMS activities, the lead planning organisation is responsible for raising an activity page on ADFAADS. The local ADFAADS manager needs to oversee this but should have designated activity creators and activity editors to assist (this can be set up by editing the access list).

16. The lead planning organisation should review previous issues/resolutions, monitor the current activity issues/resolutions as the activity progresses, and ensure the ES from key participants have been posted within the nominated period after activity end (usually 30 days). There is also an implied responsibility to ensure related resolutions are staffed through to the resolution complete stage.

17. The lead planning organisation should also contribute to the achieved outcomes section of the ES. They should effectively sign off against the activity by adding a summary of achievements measured against the aim, objectives, analysis objectives and desired outcomes.

Duties of operational evaluation team

18. Internal evaluation of activities may be augmented by OE conducted by an OE team drawn from external sources and directed by a superior commander. OE teams follow the ADF OE system and utilise the ADFAADS database in the same way that internal OE personnel raise issues, recommend resolutions and submit activity reports.

19. The OE team develops an evaluation plan with specific objectives and conditions, standards and measures to meet their focus requirements. The team needs to comprise appropriate subject matter experts able to produce a comprehensive ES within the directed time frame. Once the ES is complete, all stakeholders should be emailed a link to it to distribute the OE product. Although not directly responsible for action, the OE team should continue to monitor ongoing issue resolution staff action.


Leveraging knowledge

20. Knowledge is captured and applied through the recording of issues/resolutions in ADFAADS. Because each activity is unique, with varied challenges, ADFAADS managers and key evaluation staff must keep in perspective how access to previous lessons and the building of knowledge capital enable improved command and control and better decision making. Accessing the knowledge capital and exchanging operationally relevant information are facilitated through the ADFAADS functions of text search, report generator and agents.

Search facility

21. A comprehensive search facility is available via the binocular icon on activity pages. Although not identical to the perhaps more familiar internet search tools, this Lotus Notes facility performs similar functions, including a full text search of embedded documents. Apart from finding specific items, the search facility can be used for research. Although ADFAADS is still maturing, there is already a significant amount of knowledge available, making the database a powerful research source.

Report generator

22. This facility provides a significant capability for those wishing to utilise the knowledge contained within ADFAADS. The report generator enables the researcher to conduct a search across any combination of fields within any combination of the forms, and then to determine the content of the search report by checking the required fields. The report is presented automatically in a Microsoft Excel spreadsheet where it can be manipulated utilising the inherent Excel features. The report can also be copied to Microsoft Word or PowerPoint to support or substantiate other staff work.
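The field-selection and export behaviour of the report generator can be illustrated with a generic filter-and-export sketch. This is not the ADFAADS report generator itself: the record and field names are invented, and the output is plain CSV rather than an Excel spreadsheet:

```python
import csv
import io

# Invented sample records standing in for ADFAADS forms.
records = [
    {"form": "issue", "title": "Comms interop", "activity": "EX A", "status": "global"},
    {"form": "resolution", "title": "Comms interop", "activity": "EX A", "status": "complete"},
    {"form": "issue", "title": "Logistics lift", "activity": "EX B", "status": "draft"},
]

def generate_report(records, fields, **criteria):
    """Filter records on any combination of field values and emit the
    chosen fields as a spreadsheet-like table."""
    rows = [r for r in records if all(r.get(k) == v for k, v in criteria.items())]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

# Report on issue forms only, showing title and status.
print(generate_report(records, ["title", "status"], form="issue"))
```

The researcher's two choices (which records to match, and which fields to show) map onto the `criteria` and `fields` arguments here.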

Status of actions and reporting/action agents

23. The status of actions and reporting/action agents functions are designed to assist the ADFAADS user to progress their work, and to provide a management tool for commanders. By accessing the status of actions button (on the blue navigator), outstanding actions for the local organisation will be presented, providing the commander with a useful update on the status of actions within their organisation. Status of actions also provides a sub-menu detailing completed actions. The final status of actions option is current reviewer/action officer, which allows individual users to see all the resolutions and issues for which they are the current action/reviewing officer: in essence, their ADFAADS in-tray.


24. The reporting/action agents take two forms:

a. A scheduled reporting agent produces a variety of monthly reports to each local ADFAADS manager on resolutions completed and issues released to global, and each individual action officer receives a report/reminder of their outstanding actions.

b. The action agent reports on draft issues and resolutions, and can be run by all users via the actions/report buttons in the default view.

Statistical analysis

25. This feature is currently only available to ADFAADS managers and the ADFAADS administrator due to its inherent demands on memory and potential level of use. It may be made more widely available in the future if the demand is evident; in the meantime, a user should contact their local ADFAADS manager if a report is required. Statistical analysis is a very useful management tool that provides an overview of the status of the entire database, down to individual organisations and activities. It is accessed via the Lotus Notes system menu (create/statistical analysis) and by then selecting the various parameters. The results are presented in an Excel spreadsheet where the usual Excel features, such as graphical display, can be utilised.

Mobile Australian Defence Force Activity Analysis Database System

26. ADFAADS has the ability to create a stand-alone database that can be hosted on a laptop or a local area network (LAN) other than the Defence Secret Network (DSN). Manager access is required to produce mobile ADFAADS. Although DSN reach and connectivity are continually improving, a significant number of tactical units still deploy without DSN connectivity. These units need only export an abridged copy of ADFAADS for their activity and, whilst deployed, draft issues and resolutions in the correct and functional format. On return to base, the completed mobile ADFAADS data can be imported directly back into the main database. Draft issues/resolutions are automatically formatted and ready for global release. This feature can also be used within bases/barracks that do not yet have DSN connection. It is also possible to use the Defence Restricted Network (DRN) to temporarily host an exported mobile ADFAADS (it may be necessary to activate Lotus Notes), transferring it back to the DSN on completion of the activity. Note that for security and protocol reasons this can only be a local use of the DRN1.

1 The ADF does not plan to host a Restricted version of ADFAADS on the DRN owing to management overhead.


27. Creating mobile Australian Defence Force Activity Analysis Database System. ADFAADS can be replicated in part as a stand-alone mobile database that can be hosted on a laptop or non-DSN LAN (edit access by managers and activity editors only). The completed mobile ADFAADS can be seamlessly imported back into the main database. To create a mobile ADFAADS follow the process below:

a. Select actions from the main toolbar and then create mobile ADFAADS from the drop-down.

b. Type the name of the mobile ADFAADS into the dialogue box, select the security classification of the database, and confirm the classification in the following dialogue box (select according to the level of security that can be applied to the laptop or LAN).

c. Select the position names of the personnel who will create documents in the database from the address lists which open.

d. The database will then be created. Note that this process may overwrite any existing mobile ADFAADS databases.

e. Managers/activity editors will then be asked to add the activities that they want to include in the mobile database (select one or more activities as required).

f. On completion the database will be created in the local Notes directory (H: drive). The local DSN administrator will need to transfer the file off the system to the laptop.

28. Once draft issues have been completed the mobile database can be imported back into ADFAADS as follows:

a. Have the mobile database file taken from the laptop and placed on the DSN in the local drive by the local administrator.

b. Select actions on the toolbar and then import mobile ADFAADS from the drop-down menu.

c. Select the file name of the mobile ADFAADS database.

d. Confirm the actions in the dialogue box and the database will be imported back into the ADFAADS database.

e. The draft issues and resolutions will now only require final editing/review by the drafter, and release to global by the appropriate releasing officer using the normal process.


Integration of Australian Joint Essential Tasks

29. ASJETS are being used to standardise ADF activity objectives and provide linkages back to strategic response options and operational preparedness objectives listed in the Joint Operations Command Operational Preparedness Requirement. ASJETS data is still under development; however, the existing data has been integrated into ADFAMS and ADFAADS to facilitate further development and assist in the OE process. ASJETS are an important development from an OE perspective as they align activity and OE objectives and, through specified conditions, performance measures and standards, provide a basis for objective OE. The use of ASJETS should improve OE practices by augmenting the existing ADFAADS system, which in the past has provided relatively subjective assessments.

Re-submission of resolutions

30. ADFAADS allows for re-submission of issues, thereby providing systemic support for knowledge transfer. For example, a planner of an exercise that is run every two years might re-submit the major resolutions from the 2007 running of the exercise to the officer who will plan the 2009 iteration. Re-submissions can be directed to any number of appointments at any time and in relation to any activities (note that an appointment title rather than an individual's name must be selected). On the nominated date an automatic email with document links will be sent to the selected appointments. Individuals can view their re-submissions at any time via the status of actions and re-submits menu. This is an excellent tool to compile staff reminder notes, which can then be built into handover notes. ADFAADS managers can view all re-submissions for their organisation and may further allocate re-submissions to other personnel within the organisation.

Responsibilities of commanders

31. Whilst the ADFAADS manager has responsibility for day-to-day management of the database, the ultimate value of the system to each organisation will depend on the higher level direction and overall management provided by the commander. Managers at the staff officer grade one and two levels are usually able to actively facilitate the use of ADFAADS within their organisation. However, experience has shown that a lack of command direction and mandate can result in managers and staff not devoting the appropriate time and effort to ADFAADS. The publication of local ADFAADS SOP that promulgate ADFAADS responsibilities and processes is an appropriate means for commanders to direct their requirements.


32. There are a number of higher-level management features within ADFAADS to assist commanders. Integrated sorting choices provide a commander’s summary of issues, and as detailed above the agents and statistical analysis functions can be used by commanders to monitor the staffing progress within their command.

Duties of Australian Defence Force Activity Analysis Database System managers

33. The diligence and enthusiasm of ADFAADS managers will greatly affect the utility, application and benefits of ADFAADS to their organisation and the ADF. ADFAADS managers usually have other primary duties, and require the support and direction of their commanders if they are to be effective. ADFAADS managers should conduct ADFAADS management tasks daily, including general business, managing local issues, providing advice and conducting training.

34. Managing ADFAADS is a useful adjunct to normal activities conducted by plans, operations and analysis staffs, where most ADFAADS managers reside. ADFAADS managers should perform the following tasks:

a. provide or arrange ADFAADS training;

b. provide local advice;

c. ensure ADFAADS is properly set up with appropriate access levels;

d. actively encourage ADFAADS use before, during and after activities;

e. oversee the creation and editing of activity pages for which their organisation is responsible;

f. oversee the creation of ES;

g. oversee the actions arising from resolutions;

h. provide overall management, advocacy and advice to the command, and alert their commanders to ADFAADS usage levels; and

i. determine how the organisation can use ADFAADS more effectively.


GLOSSARY

capability development

The function of developing Defence capability, preparedness, capability priorities and plans, as well as providing guidance to the capability output managers and the procurement function. Includes concepts, strategies and plans for development and mobilisation of the nation's support to sustain the Defence effort.

data
a. Facts arranged in a formal manner, suitable for communication, transmission, interpretation or processing.
b. The content of communication without the structure or context that makes it meaningful and accessible. For example, a table of figures with no indication to what the figures refer.

force element (FE)
A component of a unit, a unit or an association of units having common prime objectives and activities.

force in being (FIB)
The current state of the planned force structure which is represented by the Australian Defence Force (ADF) as it currently exists.

force structure
Force structure relates to the type of force (the personnel, equipment, facilities and military doctrine) required to achieve the level of capability necessary to conduct operations effectively. In the medium to long term, military capability will vary due to changes in force structure generated by the capability development process. In the short term, force structure is the more constant component of military capability, and the level of capability available for operations is determined by Defence's management of the preparedness of the current force. Changes to force structure usually affect the preparedness of the associated forces. For example, the introduction of a new platform, the retirement of an old platform or a capability enhancement will have a direct impact on the resource, training and facility requirements of the forces involved.

fundamental inputs to capability (FIC)
The effective generation of a capability consists of a range of inputs. To ensure consistency, these have been consolidated as the FIC. The FIC are a guide that may be used to quantify capability. The eight FIC are organisation, personnel, collective training, supplies, facilities, major systems, support and command and management.


joint
Connotes activities, operations, organisations, etc in which elements of more than one Service of the same nation participate. When all Services are not involved, the participating Services shall be identified, eg Joint Army-Navy.

knowledge capital
Our knowledge: the codified experience and knowledge in Defence recorded over time. This includes data and information in databases, and observations and lessons from the conduct of Defence business (both during operations and in peace) that are recorded as history, philosophy, doctrine, policy and procedures.

knowledge management
The explicit control and management of knowledge within the organisation, aimed at achieving the organisation's objectives. It seeks to improve the performance of processes, organisation and systems from the perspective that knowledge is the crucial production factor.

minimum level of capability (MLOC)
MLOC is the lowest level of capability (task-specific) from which a FE can achieve its operational level of capability within readiness notice, and it encompasses the maintenance of core skills, safety and professional standards.

Network Centric Warfare (NCW)
The style of operations that can be undertaken by a networked force where the automatic and rapid transfer of information enables the most effective use of combat power and takes place when the force can operate as a single virtual network.

officer scheduling the exercise (OSE)
The officer who originates the exercise and orders it to take place. They will issue basic instructions which will include the designation of exercise areas, the allocation of forces and the necessary coordinating instructions. They will also designate the officers conducting the exercise.

operational analysis (OA)
The use of mathematical, statistical and other forms of analysis to explore situations and help decision makers to resolve problems. Facts and probabilities are processed into manageable patterns relevant to the likely consequences of alternative courses of action.


operational evaluation (OE)
The systematic approach to identifying and placing value on lessons and recommendations to enhance the ADF's capability and preparedness. [Definition subject to approval]

operational level of capability (OLOC)
OLOC is the task-specific level of capability required by a force to execute its role in an operation at an acceptable level of risk.

operational preparedness objective (OPO)
OPO are the specific tasks, readiness notice and sustainability requirements assigned to specific FE via the Joint Operations Command Operational Preparedness Requirement. Joint Operations Command may have several FE options to implement a particular OPO. They equate to each of the Australian military response options (AMRO) (in essence they are populated AMRO).

preparedness
Preparedness is a measurement of how ready (readiness) and how sustainable (sustainability) the whole or part of the ADF is to undertake military operations. The readiness of forces to be committed to operations within a specified time is dependent on the availability and proficiency of personnel, equipment, facilities and consumables. Sustainability is measured in terms of the ability to provide personnel, equipment, facilities and consumables to enable a force to complete its period of operations.

preparedness management system (PMS)
The PMS is designed to provide a mechanism for the translation of Government strategic guidance into detailed directions used by the Outcome Executives in producing outputs for Government. It includes guidance on the roles and tasks required in producing these outputs and the levels of activity (preparedness) that are to be maintained within the specified budget. As such, the system provides a basis for detailed guidance in the allocation of resources to the Outcome Executives and Enabling Groups to achieve required levels of preparedness, and works in parallel with Defence Capability Planning Guidance to inform the management and development of current and future capability. The system evolves through an incremental process that has four phases: development, implementation, reporting and review.

qualitative data
A data value that is a non-numerical description of a person, place, thing, event, activity, or concept.


quantitative data
Numerical expressions in decimal form that use Arabic numbers, upon which mathematical operations can be performed.

readiness notice (RN)
Readiness notice is the specified amount of time in which a force is to complete its work up from MLOC to OLOC.


ACRONYMS AND ABBREVIATIONS

ADDP Australian Defence Doctrine Publication
ADF Australian Defence Force
ADFAADS ADF Activity Analysis Database System
ADFAMS ADF Activity Management System
ADFP Australian Defence Force Publication
ADFWC ADF Warfare Centre
ASJETS Australian Joint Essential Tasks

CAL Centre for Army Lessons
CJOPS Chief of Joint Operations
COA course of action

DCJOPS Deputy Chief of Joint Operations
DI(G) Defence Instruction (General)
DLOC directed level of capability
DRN Defence Restricted Network
DSN Defence Secret Network

ES executive summary

FE force elements
FIB force in being
FIC fundamental inputs to capability

HQAC Headquarters Air Command

JMAP joint military appreciation process
JOCOPR Joint Operations Command Operational Preparedness Requirement

LAN local area network

MOAC Maritime Operational Analysis Centre

NCW Network Centric Warfare

OA operational analysis
OE operational evaluation
OEPB operational evaluation preparation of the battlespace
OLOC operational level of capability
OPO operational preparedness objectives
OSE officer scheduling the exercise

PAR post activity report
PMIS Preparedness Management Information System
PMS preparedness management system
PMSA Program of Major Service Activities

RAN Royal Australian Navy
RN readiness notice

SOP standing operating procedures

UN United Nations

VCDF Vice Chief of the Defence Force
