Pilot Validation Methodology for Agent-Based Simulations
Workshop
WHERE ARE WE?
Dr. Michael Bailey
Operations Analysis Division
Marine Corps Combat Development Command
01 October 2007
WHO ARE WE?
• Analysts
• Developers
• Accreditors of Simulations
• Planners, Trainers, Experimenters
• Academics
BRIEF RECAP
• Can I use this ABS to support the Scientific Method?
• Trips around the “O-Course”
– What’s an Agent-based Simulation?
• produces surprises, emergent behavior
• focus on Irregular Warfare applications
– What is Validation?
• PROVIDE SUPPORT TO ---- Is the simulation useful in answering the analytical question?
FINDINGS
• Overall validation framework
• Decomposition of the process of simulation development
• Basis for declaring a simulation inappropriate
• Framework for analysis
• Matching analysis and simulation
THIS WORKSHOP
• Present the framework
• Attempt to apply it
• Learn in the process
Your contributions are critical to our success
THANKS FOR COMING!
Pilot Validation Methodology for Agent-Based Simulations
Workshop
Introduction to the Pilot ABS Validation Methodology
Mr. Edmund Bitinas
Northrop Grumman
01 October 2007
Agenda
• Goals of Framework & Desired Result
• What Is Missing From Current V&V Process
• Theory of Validation
• Lunch
• Framework for Validation
• Sample Methodology Approach
• Break
• Open Discussion
Focus of Pilot Framework
• Applicable to all models/simulations
• Specifically developed for agent-based simulations (ABSs) and irregular warfare (IW) applications
Types of Model Validation
• Expected value, physics-based simulations
– Verifiable through experimentation
– Random effects introduce predictable error
• Stochastic, probability-based models
– Distribution of model outcomes matches the distribution of observed outcomes
• Student’s t test, others
– Model-generated and observed distributions are treated as identical if they cannot statistically be proven otherwise
• Probability of being correct
ABS Validation Limitations
• Agents attempt to replicate, at least in part, the human decision-making process
– Humans may have more information than the agents
– Humans may include emotions and experience
– Humans may think/plan ahead
– Humans may anticipate the actions of others
– Two humans, given the same information, may make different decisions
• Thus, traditional validation may not be meaningful
Additional Complications
• Traditional models can be validated for a class of problems
– e.g., Campaign models
• Some ABSs are not models of anything in particular
– Agent behaviors and capabilities are assigned by the user, via input, for a specific application
– The software merely executes the input behaviors
– Examples: Pythagoras, MANA, others
– Question: How much behavior can be/needs to be reproduced?
What Constitutes ABS Validity?
• An ABS may be valid:
– For a specific application
– Over a limited range of inputs
– If the decisions it makes could be made in real life
– If the emerging complex behavior can be traced to a realistic root cause(s)
• But, an ABS is NOT valid if one can prove that it is invalid
– Trying to invalidate an ABS for an application (and failing to do so) may result in lower risk in using the ABS for the application
Framework Goals
• Determine the required accuracy
– What is sufficient accuracy for the intended application?
• Find techniques for uncovering invalid models
– Validation may not be possible
• Establish the boundaries of validity
– May limit applicability to only a portion of the intended use
• Ensure the process is not resource intensive
• Accomplish the process with a small fraction of total resources available for the application
Desired Result
• Develop a framework process that is:
– Transparent
– Traceable
– Reproducible
– Communicable
Pilot Validation Methodology for Agent-Based Simulations
Workshop
Theory of Validation
Dr. Eric Weisel
WernerAnderson, Inc.
01 October 2007
Basic Questions in Simulation Science
• What is simulation?
– Basic structures
– Properties of those structures
• How is a simulation related to other … ?
– Abstraction
– Validity
– Fidelity …
• Objective: Useful theorems about simulation
Objectives of Simulation Science
• Useful theorems about simulation
– Properties of structures
– Capabilities and limitations of simulation
– Are there systems which cannot be simulated?
– Time complexity
– Interoperability and composability
Objectives of Simulation Science
• Foundational sciences
– Mathematics
– Computability theory
– Logic
– Model theory
– Systems theory
• Not reinventing basic structures of foundational sciences
Objectives of Simulation Science
• A common approach to feasibility is to try to build it
• A better way – build useful theorems about simulation
– Properties of structures
– Capabilities and limitations of simulation
– Time complexity
– Interoperability and composability
Survey of Theoretical Framework
• Model

A model is a physical, mathematical, or otherwise logical representation of a system, entity, phenomenon or process (Department of Defense 1998).

A model is a computable function M : X → Y, where X ⊆ S × I, Y ⊆ S × O, S is a non-empty set of states, I is a set of inputs, and O is a set of outputs; the elements s ∈ S, i ∈ I, and o ∈ O are vectors of integers.
Survey of Theoretical Framework
• Simulation Simulation is a method for implementing a model over time. Simulation also is a technique for testing, analysis or training in which real world systems are used, or where a model reproduces real world and conceptual systems. (Department of Defense 1998).
[Diagram: sequential execution of a model M – state s0 with input i0 yields state s1 and output o1; s1 with input i1 yields s2 and o2; …; s(n-1) with input i(n-1) yields sn and on]
Survey of Theoretical Framework

• Labeled Transition System

A labeled transition system (LTS) is a tuple T = (S, Σ, →), where S is a non-empty set of states, Σ is a set of labels, and → ⊆ S × Σ × S is the transition relation.

[Diagram: a three-state LTS with states s1, s2, s3 and labeled transitions among them]
Survey of Theoretical Framework

• Simulation

Simulation is the sequential execution of a model and is represented by a deterministic labeled transition system L(M) = (S_M, I, →_M), where M is a model and S_M is the state model of M. With a designated initial state s0 ∈ S, this becomes L(M, s0) = (S_M, I, →_M, s0).
Comparison of transition systems

[Diagram: two labeled transition systems – referent states s*1 … s*5 and model states s1 … s5, each with transitions labeled a, b, a, c – related state-by-state by abstraction maps F and F* and a relation RF]

Model behaves in a similar way to a natural system
Comparison of transition systems

Simulation is one-way bisimulation. A one-way bisimulation from T_I to T_M means T_M simulates T_I (T_M is valid).

Let T1 = (P, Σ, →1) and T2 = (Q, Σ, →2) be labeled transition systems. A relation R ⊆ P × Q is a weak bisimulation if and only if for all (p, q) ∈ R and σ ∈ Σ:

– if p →σ p′ in T1, then q ⇒σ̂ q′ in T2 for some q′ with (p′, q′) ∈ R, and
– if q →σ q′ in T2, then p ⇒σ̂ p′ in T1 for some p′ with (p′, q′) ∈ R.
Simulation Relations

• Equivalence

[Diagram: the two transition systems from the previous slide, related state-by-state by the relation RF]

A relation R ⊆ P × Q is an equivalence relation if and only if
– (p, p) ∈ R (reflexivity)
– (p, q) ∈ R ⇒ (q, p) ∈ R (symmetry)
– (p, q) ∈ R and (q, r) ∈ R ⇒ (p, r) ∈ R (transitivity)
Simulation Relations

• Metrics

[Diagram: paired trajectories s0, s1, …, sk and s*0, s*1, …, s*k under labels a, b, related by F and F*, with per-step error terms E0, E1, …, Ek]

A metric is a function u : P × Q → Z+ satisfying
– u(p, p) = 0
– u(p, q) = u(q, p)
– u(p, q) = 0 ⇒ p = q
– u(p, q) ≤ u(q, r) + u(r, p)

A relation R ⊆ P × Q is a metric relation with parameter δ if and only if u(p, q) ≤ δ for all (p, q) ∈ R.
Composition of Models

• Validity of composition of models

[Diagram: composed transition systems FG and F*G* with validity relations V, VF, VG, and VFG relating the composed model to the composed referent]

Show that a simulation relation exists for composition of valid models.
Composition of Models
Model Relation        Linear       Affine       Algebraic   Elementary   Computable
Equivalence           Yes          Yes          Yes         Yes          Yes
Step metric           Yes          Yes          No          No           No
Trajectory metric     Conditional  Conditional  No          No           No

There may be surprises in the underlying theory at the foundation of simulation.
Verification and Validation Summary
• Validation
– There are three key elements embedded within the U.S. DoD validation definition: (1) accurate, (2) real world, (3) intended use

Validation is the process of determining the degree to which a model and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model. (DODD 5000.1 and DODI 5000.61)
The V&V Continuum
We want to have confidence (or lack thereof) that our model represents the “real world”
Two paths to get there

Conjecture: demonstrating mathematically that … is intractable at best, undecidable at worst
Since we can’t prove validity, we must rely on the scientific method to build confidence/assess risk
Validation question:
1. Assessment of risk of Type II error in application of scientific method
2. Null hypothesis:
Theory Supports Framework
Abstraction (ideal sim) and simulation relation (R) capture formal representation of intended/specific use
Matching Tool to Application
The Road Ahead: Build classes of models and simulation relations such that:
Pilot Validation Methodology for Agent-Based Simulations
Workshop
Ten Minute BREAK
1100 - 1110
Please Return To Auditorium
Pilot Validation Methodology for Agent-Based Simulations
Workshop
Framework for Validation
Dr. Eric Weisel and Ms. Lisa Jean Moya
WernerAnderson, Inc.
01 October 2007
Typical Agent
(DMSO, VV&A Recommended Practices Guide – Human Behavioral Representation (HBR) Special Topic; Moya & Tolk, Toward a Taxonomy of Agents & MAS)

[Diagram: an agent with Perception, Communication, Reasoning / Decision-making, Reactivity, Goals, Beliefs, Memory, and Action; multiple communicating agents, each with a sphere of influence, embedded in an environment]
Typical Definitions
• U.S. DoD: Validation is the process of determining the degree to which a model and its associated data are an accurate representation of the real world from the perspective of the intended uses of the model.
• U.K. Ministry of Defence: Validation – To establish that the model / process is fit for purpose
• ASME / AIAA: The process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended use of the model.
• DOE: The process of determining the degree to which a computer model is an accurate representation of the real world from the perspective of the intended model applications
• IEEE: The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.
Three Main Elements

• The model
• The thing being simulated
– “Real world”
– Empirical data
– Referent
– Abstraction
• Set of bounding principles
– Accuracy requirements
– Intended use
Problem Solving Process

[Flowchart:

M&S Use Process: Define problem → Establish objectives → Select approaches → either Non-M&S methods or the M&S method (Define M&S reqmts → Plan approach → Prepare M&S for use → Execute & prepare results) → Apply results → Analyze results → Accept & record → Repository

Accreditation Process: Develop accreditation plan → Collect and evaluate accreditation information → Perform accreditation assessment → Make accreditation decision

V&V: Verify reqmts → Develop V&V plan → Perform V&V activities appropriate for M&S category

M&S Development & Preparation Process:
• Develop New M&S: Determine M&S reqmts → Plan M&S development → Develop conceptual model → Develop & test design → Implement & test M&S
• Construct Federation: Determine fed reqmts → Plan fed construction → Develop fed conceptual model → Develop & test design → Integrate & test fed
• Modify Legacy M&S: Determine mod reqmts → Plan modifications → Modify conceptual model → Modify & test mod design → Implement & test M&S mods]
V&V Process
Verify M&S requirements → Develop V&V plan → Validate conceptual model → Verify design → Verify implementation → Validate results

Validate against the basic representation and the effect of interactions.

Empirical Assessment
• Another model (mathematical, simulation, formalism)
• Historical event
• Live experiment
• SME / Turing
• Statistical
• Metric

Assessment
• Appropriate referents
• Rule set (alone & in the composition)
• Instantiation
• Interpretation
• Trajectory
Adapted from DMSO, VV&A Recommended Practices Guide
Physics-based modeling

• Conceptual model validation
– Mathematical equations
– Difference equation solution algorithm
• Results validation
– Empirical data
– Experimental testing
– Predictive capabilities
– Acceptable error tolerance
Agent Validation
(DMSO, VV&A Recommended Practices Guide – Human Behavioral Representation (HBR) Special Topic; Moya & Tolk, Toward a Taxonomy of Agents & MAS)

[Diagram: an agent with Perception, Communication, Reasoning / Decision-making, Reactivity, Goals, Beliefs, Memory, and Action]

Little empirical data. Evaluate and compare:
• Conceptual model design
• Knowledge base
• Engine and knowledge base implementation
• Integration with simulation environment
Abstraction
• Elements of interest: detail, fidelity, resolution
• Experimental frame

Conceptual Model

Theoretical Model
• Theory, opinion, experience
• Relationships between elements
– If x then y
– y = E[f(x)]
– x1 < x2 ⇒ E[f(x1)] < E[f(x2)]

Mathematical Model
• Analytic models
• Formulas
• Parameter sets
• State rule sets
• Update methods
– s(k+1) = f(s(k)), s(k) ∈ S

Algorithmic Model

h = -16t² + vt + s

/* Height of an object moving under gravity.      */
/* Initial height s and velocity v are constants. */
#include <stdio.h>

int main(void)
{
    float h, v = 100.0f, s = 1000.0f;
    int t;
    for (t = 0, h = s; h >= 0.0f; t++) {
        h = (-16.0f * t * t) + (v * t) + s;
        printf("Height at time %d = %f\n", t, h);
    }
    return 0;
}

Instantiated Model
• Settings, data, & results

[Diagram: the chain Conceptual Model (Theoretical + Mathematical) → Algorithmic Model → Instantiated Model → Settings, Data, & Results is built downward, and each level is compared against data drawn from the real world]
Spiral Methodology for Invalidating an ABS
• Assess risk of using the ABS for the specific/intended use
– Specific use = Applying the ABS for a specific purpose (user-centric)
– Intended use = Developing the ABS for a specific reason (developer-centric)
• Communicate that risk to the consumer of the results of the ABSVal process
• Apply scientific method using invalidation techniques
– Ideally performed at each step in the process
– Realistically, given resource constraints, conduct a cost-benefit tradeoff to determine techniques that:
1. Will invalidate the ABS quickly, or
2. Will provide a significant reduction in risk in using the ABS for the specific/intended use
Application of Scientific Method

• Apply:
– Invalidation techniques with the highest cost-benefit tradeoffs
– Additional invalidation techniques as resources allow
– Each technique will add to or subtract from the level of risk
– Communicate the reasoning behind the techniques chosen and the areas of the process chosen for application of the scientific method
• If the null hypothesis is rejected at any point, the ABSVal process is done
– Assuming the reason for rejection cannot be easily fixed, or the use modified
• If the null hypothesis is not rejected, some decreased degree of risk can be conveyed to the consumer
• If the null hypothesis is not rejected, but the ABSVal performer does not have a high degree of confidence in the validity of a given piece
– The ABSVal performer can attempt to use another technique to invalidate that particular piece, and/or
– Can convey a higher level of perceived risk to the consumer
The V&V Continuum
We want to have confidence (or lack thereof) that our model represents the “real world”
Two paths to get there

Conjecture: demonstrating mathematically that … is intractable at best, undecidable at worst
Since we can’t prove validity, we must rely on the scientific method to build confidence/assess risk
Assessment of Risk

• Based on Utility Theory
• Ti = technique of validation
• For each {Ti}, have a Risk of Type II Failure R({Ti}) and a Cost C({Ti})
• For each Ti:
– Impact of Type II Failure: I(Ti) ~ VI(I)
– Likelihood of Type II Failure: L(Ti) ~ VL(L)
– Risk of Type II Failure: R({Ti}) = f(I, L)
• R({Ti}) = wI·V(I) + wL·V(L)
[Matrix: Risk Of Using The ABS, plotted as Impact of Type II Failure (Low to High) versus Likelihood of Type II Failure (Low to High); the cells range from Negligible and Acceptable through Low, Some, and High up to Very High and Unacceptable]
Communicating Risk

• By communicating the level of perceived risk in an ABS after failing to invalidate it, the consumer is provided with a means for assessing the ABS’s applicability to hard-to-quantify, non-traditional areas or activities, such as Irregular Warfare (IW)
– The consumer can make an informed decision on whether to use the ABS for a given specific purpose/intended use, given:
• The fact that the ABS was not proven invalid
• The number and type of techniques that were applied at each step in the process, and
• The degree of risk that the ABSVal performer perceives
– It is important to note that the ABSVal performer is not communicating that the ABS is valid, but rather that the ABS was not proven to be invalid
• The ABSVal performer is providing sufficient evidence that supports that the ABS is adequate for the specific/intended use
Invalidation Techniques

– Executable compared to concept/referent
• Results validation
• Mini-analysis
• Accuracy
– Theoretical model compared to concept/referent
• Assumption testing
• SME review
– Mathematical model compared to theoretical model
• Boundary analysis
• Algorithm review
• Spreadsheet modeling
– Software code compared to mathematical model
• Existence of required outputs
• Symbolic debugger
• Code walk-through
– Data compared to executable
• Existence of required inputs

Additional techniques (applicable across comparisons):
• Comparison to other models
• Turing test
• Intuition
• Functionality assessment
• Completeness assessment
• Formal methods
• Input range validation
• Control parameter review
• Component testing
• SME validation
Pilot Validation Methodology for Agent-Based Simulations
Workshop
Phase II – The Way Ahead
Mr. Edmund Bitinas
Northrop Grumman
01 October 2007
Phase II Objective
• Apply the Phase I-developed pilot framework for VV&A of Agent-Based Simulations (known as ABSVal) to model applications being considered for future entry into the USMC Irregular Warfare Analytic Baseline
• Goals of Phase II include:
– Testing the viability and utility of the pilot ABSVal framework in a realistic institutional setting;
– Evaluating ABSVal in a seminar setting combining communities of ABS users and developers;
– Developing methodologies for applying ABSVal to future ABS development efforts; and
– Producing informational products useful for the M&S community.
Phase II Team
• Marine Corps Combat Development Command / Operations Analysis Division (MCCDC/OAD)
• Northrop Grumman Mission Systems
• WernerAnderson, Inc.
• Sanderling Research Corp.
• Visco Consulting
• Systems Planning and Analysis (SPA)
• Naval Post-Graduate School (NPS)
General Approach

• Objective: Test the pilot ABSVal framework in realistic setting(s)
– Is it useful?
– Is it complete?
– Are there improvements?
• Select candidate ABS models
• Determine the utility for a specific application
– How can the results be meaningful/useful?
– Are there limitations?
• Are there workarounds?
• Demonstrate techniques that can invalidate
• Document everything
– Expand/modify ABSVal framework as required
How You Can Help!
• Identify candidate ABS–application pairs
– Completed studies
– On-going studies
– Planned studies
• Identify invalidation techniques
– May only be applicable to some ABS–application pairs
– May only apply to part of the modeling/analysis cycle
Pilot Validation Methodology for Agent-Based Simulations
Workshop
Preview of Pythagoras COIN
Mr. Edmund Bitinas
Northrop Grumman
01 October 2007
Pythagoras COIN Application for IW Study

• Pythagoras Tool – mapping of its attributes, features and functions
• COIN Scenario for IW
– Population dynamics
– Influence of various actors on population segments
• MAGTF Commander’s Courses of Action (COAs)
• Insurgency actions
– Population segments broken up into orientation sectors
• Insurgent, Pro-Insurgent, Indifferent, Pro-COIN
– Colombia-based
Conceptual Model of a Population Segment

Each population segment has its own “bubbles”, i.e., orientations: Insurgent, Pro-Insurgent, Indifferent, Pro-COIN.
• The people within each bubble may change over time
• Top arrows indicate movement toward the COIN side, driven by perception of COIN success
• Bottom arrows indicate movement toward the insurgency, driven by perception of insurgency success
• “Return” arrows indicate people remaining within the bubble
Effect of Influence Estimation on Target Population

[Bar chart: Population Segment A – number of people (0 to 90) in each orientation (Insurgency, Pro-Insurgency, Indifferent, Pro-COIN), showing initial orientations and revised orientations after applying Base Susceptibility × Strength of Event]

[Bar chart: Percent change in Population Segment A from initial state, by orientation, spanning roughly -40% to +100%]
Population Affiliation Over Time – Sample Results From Test Problem

[Line chart: number of people (0 to 12,000) over weeks 1–81 for the ActivistJungleLocals population in each orientation: Ins, Pro-Ins, Indiff, Pro-COIN]
Events at weeks 2, 7, and 29
Near steady-state at week 80
Operation Pacific Breeze (OPB)

• Humanitarian Assistance/Disaster Relief (HA/DR)
– Volcanic eruption, earthquake & tsunami on Colombia’s coast
– Organization of American States (OAS) asks for help
– Provinces of Valle del Cauca and Cauca have been assigned to the Marine Corps
– Port of Buenaventura is virtually destroyed
• Marine Corps deploys a Marine Expeditionary Unit (MEU) and a Marine Expeditionary Brigade (MEB) with the missions:
– Carry out HA/DR in response to the tsunami
– Maintain close coordination with the Government of Colombia
– Establish security until the Government of Colombia is capable of taking control of the disaster area
– The Revolutionary Armed Forces of Colombia (RAFC) are predicted to take advantage of the unstable situation
– Other anti-Government groups are organizing/reorganizing as well
– The Joint Task Force will make every effort to prevent RAFC activities in the area
– Coordinate with the United States Agency for International Development, U.S. State Department personnel, and non-government relief organizations
OPB (Continued)

• Course of Action #1 – Remain afloat
– Minimize footprint ashore (only security personnel spend the night ashore)
– Limit political impact of US presence
– Make it clear that the Marine Corps is there for HA/DR only
• Course of Action #2 – Deploy ashore
– Maintain bases ashore
– Provide security
– Relocate refugees
• Measure of Effectiveness
– Increase/decrease in insurgent activity and support
Pythagoras COIN Model

• Conceptual model of populace interactions
– Population changes affiliations naturally (news reports, events, economy, ‘narratives’, etc.)
– Population segments influence one another (actions, economy, etc. = salience)
– MAGTF actions change the rate of change (some good, some bad)
– The magnitude of the rate of change can be estimated
• Mapping of conceptual model to Pythagoras features
– Multiple agent classes represent population segments (attributes represent affiliation)
– Influence weapons change attributes (absolute and relative – “be more like me”)
Pilot Validation Methodology for Agent-Based Simulations
Workshop
LUNCH BREAK
1230 - 1330
Please Return To Auditorium After Lunch
Pilot Validation Methodology for Agent-Based Simulations
Workshop
Sample Methodology Approach (Applying the Framework to Pythagoras COIN)
Mr. Edmund Bitinas
Northrop Grumman
01 October 2007
Is The Model Built “Right”?
• Do citizens change affiliation?
– At the expected rates?
• Do COAs change the rates?
– By the right amount?
Is The Model Built “Right”?
• Is ‘Narrative Paradigm’ applicable?
– May be an assumption
• Is the data reasonable?
– What is the source?
– Are there real world examples?
• Are they relevant?
• Are all the known interactions present?
– Can missing ones be easily added?
– Are there any that should not be there?
• What are the model’s limitations?
– Are there certain bounds on the inputs?
Possible Invalidation Techniques
• Assumption Testing
• Black Box Testing
• Turing Test
• Results Validation
• Comparison to other model(s)
– Excel model (much less functionality)
– JAVA model (less functionality)
Possible Guidance For User/Decision Maker
• Use Design of Experiments
– Data farming on all variables
• Identify chaos points (if any)
– Small inputs make great changes in outcome
– Model may not be valid near these points
• Break the problem into pieces
– Different assumptions for different population segments
Pilot Validation Methodology for Agent-Based Simulations
Workshop
BREAK
1350 - 1400
Move to Breakout Rooms for
Topic #1: Constructive Critique of Framework
(1400-1520)
Pilot Validation Methodology for Agent-Based Simulations
Workshop
Wrap-Up Discussion of Critique Sessions
1530 - 1630