Cognitive Biases: The Root of Irrationality in Military Decision-Making

by CPT Chen Jing Kai

Abstract: In this essay, the author explores the cognitive biases that can skew the operational staff's assessment of a situation and thereby undermine the rationality of military decision-making. The author provides examples for each type of bias, namely the overconfidence bias, confirmation bias, disconfirmation bias, availability bias, sunk cost fallacy and anchoring bias, and proposes ways to mitigate such biases, so that the Singapore Armed Forces (SAF) can be more accurate in its assessments and thus make better decisions. The author concludes by highlighting that, in combination with contextual factors, cognitive biases have been shown to result in starkly inaccurate assessments and therefore poor military decisions at the strategic and operational levels. He highlights that the SAF needs to be cognisant of these biases and to implement strategies to counteract them.

Keywords: Cognitive Bias; Decision-Making; Logical; Probability; Magnitude

INTRODUCTION

Operational staff who have previously participated in a Singapore Armed Forces (SAF) high-level exercise will understand that they need to provide multiple options for decision-makers. These options include different courses of military action straddling the strategic and operational levels. Almost always, there are no perfect solutions; operational staff therefore need to rationally review and compare the associated costs and benefits across the various options. Indeed, the SAF's doctrine articulates the need to logically outline the strategic pay-offs and political costs of military options to facilitate decision-making. Similarly, established foreign militaries provide decision-making matrices, which detail the pros and cons of each option, to support logical comparison across options.1

In assessing the costs and benefits of the options, the operational staff must carefully consider two important factors: (1) the magnitude of each cost and benefit; and (2) the probability of each cost and benefit materialising. It is interesting to note that individuals can differ remarkably in their estimation of such probability. Consider Napoleon's discussion with his top lieutenants over whether they should invade Russia in 1812. The latter warned against the invasion, citing significant risks to blood and treasure, as well as a high risk of failure. Napoleon, however, saw his invasion plan as swift, decisive and hence low-risk.2
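
This assessment can be made concrete as an expected-value calculation: each cost or benefit contributes its magnitude weighted by the probability of it materialising, and options are compared on their totals. The following minimal sketch in Python illustrates the arithmetic; the option names, magnitudes and probabilities are invented for the example and are not drawn from any doctrine or actual decision matrix.

options = {
    "Option A (rapid offensive)": [
        # (description, magnitude, probability); benefits positive, costs negative
        ("operational gain", +8.0, 0.6),
        ("heavy casualties", -6.0, 0.5),
        ("political fallout", -4.0, 0.3),
    ],
    "Option B (blockade)": [
        ("operational gain", +5.0, 0.8),
        ("heavy casualties", -6.0, 0.1),
        ("political fallout", -4.0, 0.4),
    ],
}

def expected_value(outcomes):
    # Each outcome contributes its magnitude weighted by its probability.
    return sum(magnitude * probability for _, magnitude, probability in outcomes)

for option, outcomes in options.items():
    print(f"{option}: expected value = {expected_value(outcomes):+.2f}")

The structure also makes the source of disagreement visible: two rational staff officers working from the same table should produce the same totals, so divergent totals point to divergent probability estimates.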

As a matter of fact, there can only be one real probability value for each cost and benefit materialising. Granted, we often cannot calculate that exact value, owing to the lack of perfect information in the real world. However, if we assume that the operational staff are rational, that they share the same information and that they put it on the same decision matrix as dictated by military doctrine, we should expect the estimated probabilities to be similar. This is often not the case because we are not always rational in reality. Indeed, research in behavioural economics has demonstrated that we are susceptible to cognitive biases, which are patterns of deviation in judgment that lead to irrationality.3

Borrowing from such research, this essay seeks to explore the common types of cognitive biases that might skew the operational staff's rational assessment of probabilities and hence affect decision-making at the strategic and operational levels. These biases include the overconfidence bias, confirmation bias, disconfirmation bias, availability bias, sunk cost fallacy and anchoring bias. This essay then proposes ways to mitigate such biases.

OVERCONFIDENCE BIAS

“Guard against arrogance, avoid underestimating the enemy, and be well prepared.”
- Mao Tse-Tung4

Before the final battle of the Chinese Civil War in 1949, Mao Tse-Tung ordered his coastal army commanders to guard against the overconfidence bias.5 This is a cognitive bias that results in the inaccurate calibration of probabilities, leading individuals to falsely believe that they are more accurate in their judgements than they actually are. Psychological experiments have provided ample evidence for this concept: in a spelling task, only about 80% of the subjects’ responses were correct when they were ‘100% certain’ of their answers.6
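
Calibration of this kind can be checked empirically: group past judgements by the confidence the judge stated, then compare each group's stated confidence with its observed hit rate. Below is a minimal sketch of that check, with invented data echoing the spelling experiment's finding.

from collections import defaultdict

# (stated confidence, whether the answer was actually correct);
# hypothetical data for illustration only.
judgements = [
    (1.0, True), (1.0, True), (1.0, True), (1.0, True), (1.0, False),
    (0.7, True), (0.7, False), (0.7, True), (0.7, False), (0.7, True),
]

by_confidence = defaultdict(list)
for stated, correct in judgements:
    by_confidence[stated].append(correct)

for stated in sorted(by_confidence, reverse=True):
    outcomes = by_confidence[stated]
    hit_rate = sum(outcomes) / len(outcomes)
    verdict = "overconfident" if hit_rate < stated else "calibrated"
    print(f"stated {stated:.0%} -> correct {hit_rate:.0%} ({verdict})")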

The overconfidence bias is more pronounced when the individual of concern is an expert in the discipline.7 In the discipline of warfighting, there are few experts who could have rivalled Napoleon Bonaparte, hailed by Carl von Clausewitz as ‘the God of War’. Prior to the invasion of Russia, Napoleon had already planned and orchestrated an impressive 35 victorious campaigns, losing only three. As mentioned previously, Napoleon’s lieutenants had painted a bleak prospect for the Russian campaign. Nevertheless, Napoleon’s stellar track record gave him boundless confidence that ‘through sheer force of will’, he would be able to surmount the glaring challenges he faced and subjugate Russia. The overconfidence bias could never have been more obvious, with the palpable risks of the campaign nullified in Napoleon’s mind. Alas, the terrible miscalculation of risks and the consequent attempt to invade Russia saw Napoleon’s 500,000-strong Grand Army reduced to fewer than 20,000 men and thwarted his ambition to dominate Europe.8

[Image: Napoleon's withdrawal from Russia, a painting by Adolph Northen. (Wikipedia)]

CONFIRMATION AND DISCONFIRMATION BIAS

“It is a capital mistake to theorize before one has data. Insensibly, one begins to twist facts to suit theories, instead of theories to suit facts.”
– Sherlock Holmes9

In the short story A Scandal in Bohemia by Sir Arthur Conan Doyle, Sherlock Holmes had found a mysterious letter in the post. Curious about what the letter actually meant, Watson, Holmes’ trusty companion, asked for his theory on the matter. As we can see from his response, Holmes, ever the fine detective, was careful to avoid confirmation bias, which is the tendency to favour information that confirms one’s hypothesis, regardless of its credibility.


This bias causes individuals to selectively search for evidence that supports their preconceived hypotheses and to believe that such evidence has a disproportionately higher probability of being true than it does in reality. Confirmation bias was brought to its extreme during the series of events leading up to the Japanese attack on Pearl Harbour in 1941. Prior to being fundamentally surprised, the top brass in Washington clung to the deep-seated belief that Tokyo was incapable of mounting a raid on Pearl Harbour.10 Among a series of assessments fraught with confirmation bias, the top brass readily accepted the ridiculous proposition that the Japanese would not be able to deliver their bombs accurately for a successful raid, given their severe myopia—attributable to the shape of their pilots’ eyes.11

The flipside of confirmation bias is disconfirmation bias. This is the tendency for individuals to set more stringent standards of evidence for hypotheses other than their own.
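
One way to see what these twin biases do is to contrast them with an unbiased Bayesian update. In the toy Python model below, an unbiased assessor applies each piece of evidence at full strength, while a biased assessor demands more of contrary evidence by under-weighting it. This is an illustration of the mechanism, not a claim about how analysts actually compute; the prior and likelihood ratios are invented.

def posterior(prior_p, likelihood_ratios, contrary_weight=1.0):
    # Update in odds form: posterior odds = prior odds x product of LRs.
    # contrary_weight < 1 models disconfirmation bias: evidence against
    # the favoured hypothesis (LR < 1) is under-weighted in the update.
    odds = prior_p / (1.0 - prior_p)
    for lr in likelihood_ratios:
        weight = contrary_weight if lr < 1.0 else 1.0
        odds *= lr ** weight
    return odds / (1.0 + odds)

# Three pieces of disconfirming evidence (likelihood ratios are invented).
evidence = [0.25, 0.5, 0.2]

print(posterior(0.9, evidence))                       # ~0.18: belief collapses
print(posterior(0.9, evidence, contrary_weight=0.2))  # ~0.81: belief barely moves

The second call behaves much like the episode described next: strong contrary evidence keeps arriving, yet the favoured hypothesis barely moves.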

Disconfirmation bias contributed significantly to the fundamental surprise of the Israel Defense Forces by the Egyptian offensive during the Yom Kippur War in 1973. The then Director of Military Intelligence, Major General Eli Zeira, was instrumental in developing ‘The Concept’, which captured assumptions about Egypt’s military strategy. ‘The Concept’ articulated that Egypt would only attack Israel if it had the air power generation capability to strike Israel’s rear operating air bases, so as to reduce Israel’s air superiority. As Egypt would only have achieved this capability at least two years later, Zeira assessed that Egypt would not attack Israel.12 Little did Zeira know that Anwar Sadat, the Egyptian leader, sought to retake the Suez Canal rather than defeat the Israeli forces. The Egyptian forces’ lack of air power was therefore not a showstopper. On 6th October 1973, 100,000 Egyptian troops commenced their attack on the 450 Israeli soldiers along the Suez Canal. The Israelis were fundamentally surprised.

Along the way, Zeira repeatedly fell prey to disconfirmation bias. Before the attack, the Israelis had clear evidence of the Egyptians’ attack plans and military deployment. However, Zeira discounted the possibility that Egyptian forces were massing for an attack and chose to situate the evidence in the context of the Egyptian forces conducting a training exercise, buying into the Egyptian forces’ deception plan. This was despite the lack of evidence for actual training being conducted on the ground. Even when Jordan’s King Hussein undertook a clandestine flight to Tel Aviv to warn the Israeli Prime Minister about Syria and Egypt’s intention to attack Israel, Zeira was unmoved. Disturbingly, even when an operative at the highest levels of Egypt’s government sent a warning to Israel just hours before the offensive, Zeira chose not to believe the report. It has been proposed that his insistence on a flawed theory and his playing down of contrary indicators contributed to his failed assessment of Egypt’s intention to attack.13

AVAILABILITY BIAS

“(Post 9/11), 1,500 Americans died on the road in the attempt to avoid the fate of the passengers who were killed in the four fatal flights… This estimate is six times higher than the (latter).”
- Gerd Gigerenzer14

Gigerenzer, a German psychologist, observed that post 9/11, people flew substantially less (a reduction of 12-20%) and drove substantially more (mileage clocked by cars on interstate highways increased by 2.2-5.7%). Gigerenzer offered that many might have chosen driving over flying in fear of experiencing another terrorist attack. At the same time, the number of fatal road accidents every month, for the 12 months following September 2001, turned out to be significantly higher than the baseline for the preceding five years. Overall, it was estimated that six times more Americans were killed on the road as they tried to avoid the risk of flying than the fatalities from the four fatal flights in the 9/11 incident.15 While the cause of increased land-based travel is probably multi-factorial in reality, this serves as a conceptual example of the availability bias.


The availability bias refers to the tendency for individuals to assess probabilities based on how readily they can retrieve memories of the issue at hand. Retrievability of memories, in turn, depends on how familiar, vivid or emotionally salient these memories are. Several studies, including the one conducted by Gerd Gigerenzer, have suggested that shortly after a terrorist attack, people tend to believe that another attack is much more probable than it is in reality. In fact, people tend to be more worried about the risk of another terrorist attack than about other statistically more significant risks that they face in everyday life.16
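
The mechanism can be caricatured in a few lines of Python: if perceived probability tracks recalled instances, and recall is weighted by vividness, then rare-but-vivid risks are inflated relative to common-but-routine ones. The frequencies and salience weights below are invented for illustration, not real statistics.

# event: (assumed annual frequency, salience weight of its memories)
events = {
    "terrorist attack": (10, 50.0),    # rare but vivid and heavily reported
    "road fatality":    (40000, 1.0),  # common but routine and rarely recalled
}

actual_total = sum(freq for freq, _ in events.values())
recalled_total = sum(freq * salience for freq, salience in events.values())

for name, (freq, salience) in events.items():
    actual_share = freq / actual_total
    # Perceived share is driven by salience-weighted recall, not frequency.
    perceived_share = (freq * salience) / recalled_total
    print(f"{name}: actual {actual_share:.3%} vs perceived {perceived_share:.3%}")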

Granted, this bias could be beneficial, because “if one thing actually occurs more frequently and therefore is more probable than another, we probably will be able to recall more instances of it.”17 However, it is also important to remind ourselves that the emotional salience and vividness of a memory also determine how readily it is retrieved. This has implications for the interpretation of rules of engagement during military operations. As the decision-maker deliberates on whether to target an individual undertaking an act that might or might not be threatening, the ease of retrieving recent or salient examples of others undertaking similar acts will invoke the availability bias.

In September 2009, a German commander ordered an airstrike on North Atlantic Treaty Organisation (NATO) fuel trucks that had been hijacked by Taliban forces. The hijacked trucks had gotten stuck while crossing the Kunduz River, and dozens of civilians had gathered around the trucks to siphon free fuel from the tankers. As a result, the airstrike killed up to 142 people, including over 100 Afghan civilians.18 Despite the fact that the trucks had not moved for more than four hours, the German commander had perceived an ‘imminent threat’ (this being a necessary condition for ordering an airstrike), noting that “my feeling was that if we let them get away with these tankers, they will prepare them to attack police stations or even the (Provincial Reconstruction Team).”19 The German commander seemed to have based his decision on recent and salient events. Indeed, Taliban fighters had detonated a tanker truck in Kandahar not long before.20 Moreover, there had been several other hijackings of the German Provincial Reconstruction Team’s tanker trucks by the Taliban in Kunduz.21 In addition, just one month before the airstrike, intelligence had suggested that the Taliban was planning to overrun the German camp using explosive-laden trucks.22 All these easily retrievable events could have contributed to the availability bias that affected the German commander’s assessment of the seriousness and imminence of the threat that the hijacked trucks posed, ultimately leading to his ill-fated decision.23

[Image: An American F-15E Strike Eagle, similar to the one used in the attack on NATO fuel trucks that had been hijacked by Taliban forces in 2009. (Wikipedia)]


SUNK COST FALLACY

“Throw good money after bad”
– English Proverb24

This proverb describes people’s hopeless commitment of further resources to reinforce a poor decision that has already proven costly. In the end, instead of recouping their previous loss, they incur further loss. Such an irrational escalation of commitment can often be explained by the ‘sunk cost fallacy’. This is the tendency for people to justify additional investment based on how much they have already invested in a decision—resources that they will not be able to get back (hence termed ‘sunk costs’). This occurs despite evidence demonstrating that the potential costs of maintaining the decision will be significantly greater than the potential benefits.
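
The arithmetic of the fallacy is easy to state: a rational decision compares only future costs and future benefits, because the sunk amount is lost under every option. A minimal sketch with hypothetical figures:

sunk_cost      = 900  # already spent and unrecoverable
future_cost    = 400  # additional cost of continuing
future_benefit = 250  # expected benefit if we continue

# Rational rule: compare only what is still to come.
rational = "continue" if future_benefit > future_cost else "stop"

# Fallacious framing: stopping is seen as "wasting" the sunk cost too,
# so the sunk amount is wrongly added to the case for continuing.
fallacious = "continue" if future_benefit + sunk_cost > future_cost else "stop"

print(rational)    # stop: continuing loses a further 150
print(fallacious)  # continue: the unrecoverable 900 is wrongly counted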

The sunk cost fallacy, in the case of military strategy, could result in the unnecessary protraction of wars due to fears that war-termination would waste the human lives, money and time previously expended. This could, in turn, result in a greater loss of human lives, money and time. Some have argued that the United States’ (US) long-drawn involvement in the Vietnam War was an excellent example of this bias. Back then, a key argument made by supporters of the war was that the US’ withdrawal from Vietnam would disparage the many who had already sacrificed their lives, letting them ‘die in vain’. Therefore, the country ‘owed’ it to these war heroes to ‘stay the course’. This line of reasoning, reinforced by the mounting number of individuals who ‘sacrificed their lives’, arguably contributed to the protraction of the Vietnam War and, therefore, produced another 250,000 casualties.25

ANCHORING BIAS

“What should we price it at? If you listen to the pundits, we’re going to price it at under $1000, which is (the) code for $999. [long pause] I am thrilled to announce to you that the iPad pricing starts not at $999, but at just $499!”
– Steve Jobs26

As Apple’s then-CEO introduced the newly launched iPad back in 2010, he made sure to fully exploit the anchoring bias. This refers to people’s tendency to place too much emphasis on the initial piece of information provided (the ‘anchor’) in subsequent judgments during decision-making. Once the anchor is set, subsequent judgments are calibrated using the anchor as a reference point. Consequently, the decision-maker will be biased towards interpreting subsequent information relative to the anchor.

[Image: Steve Jobs, Apple's then-CEO, introducing the iPad. (Wikipedia)]


By stating the ‘initial’ cost at $1,000, Steve Jobs anchored the audience’s minds on the idea that an iPad costs approximately $1,000. When he subsequently announced the actual price, the audience felt that they had saved $500. If Steve Jobs had said, “We were thinking of pricing it at $399, but we decided to go for $499,” the audience would certainly have felt that they were being swindled—although the actual iPad price would have been the same across the two scenarios.27

During World War Two (WWII), the British orchestrated a deception ploy known as the Cyprus Defence Plan, which exploited the Germans’ anchoring bias. With the Germans’ capture of Crete, the British feared that the mere 4,000 soldiers on Cyprus would not be capable of defending against a German offensive. The British therefore implemented the Cyprus Defence Plan, which was intended to convince the Germans that 20,000 troops were stationed on the island. They achieved this by creating a false division headquarters, barracks and a fleet of military vehicles, together with fabricated radio communications. The British also distributed a phony defensive plan with maps, graphics and orders through double agents. The Germans were successfully deceived and were anchored on ‘20,000 troops’, factoring this figure into their planning for the remaining three years of the war. It is believed that the anchoring bias persisted despite their subsequent analysis that this figure could have been an exaggeration, attesting to the resistance of the anchor to subsequent information.28
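
The Cyprus episode fits the classic anchor-and-adjust account: estimates start from the anchor and adjust only partially toward the truth, so a planted anchor keeps its pull even in the face of contrary analysis. The Python sketch below uses an illustrative adjustment factor; it is a caricature of the mechanism, not a model of German intelligence.

def anchored_estimate(anchor, true_value, adjustment=0.6):
    # The estimate starts at the anchor and adjusts only part of the
    # way toward the true value; adjustment = 1.0 would be unbiased.
    return anchor + adjustment * (true_value - anchor)

# With the planted 20,000-troop anchor, even generous adjustment leaves
# the estimate far above the true garrison of 4,000.
print(anchored_estimate(anchor=20000, true_value=4000))  # 10400.0
print(anchored_estimate(anchor=5000, true_value=4000))   # 4400.0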

SOLUTIONS

Cognitive biases impair our assessment of the probability of each cost and benefit materialising. This, in turn, impairs military decision-making at the strategic and operational levels. If we refer to the Observe-Orient-Decide-Act (OODA) loop, cognitive biases can exert their detrimental effects during the search (‘Observe’) and interpretation (‘Orient’) of data leading up to a decision, as well as during the decision-making process itself (‘Decide’). The preceding sections have covered examples of this in depth.

Cognitive biases can be amplified by ‘group-think’, which is ‘the practice of thinking or making decisions as a group, resulting typically in unchallenged and poor-quality decision-making’. Once cognitive biases are combined with ‘group-think’, the biases remain uncorrected and there can be disastrous consequences. Therefore, it is imperative for the SAF’s operational staff to employ strategies that mitigate these cognitive biases, to allow for more accurate assessment of probabilities and thereby facilitate better decision-making.29 These strategies would need to be grounded in an open culture, in which operational staff are encouraged to be honest about their views.

The following sections describe how checklists, red teaming, games and fresh eyes can be employed to counteract cognitive biases at different stages of the OODA loop, as summarised in Figure 1.

[Figure 1: Strategies to Counter Cognitive Biases in the OODA Loop]

Checklists

Checklists promote methodical thinking about issues. Some judges adopt this tool to promote comprehensiveness and objectivity as they rule on court cases, so as to counter cognitive biases.30 Checklists are also used in the SAF when we evaluate our units, prepare our forces for operations and make certain important financial decisions. Likewise, checklists can be used effectively in the SAF Command Post to counter cognitive biases throughout the ‘Observe’, ‘Orient’ and ‘Decide’ steps. In formulating and following checklists, the operational staff will be forced to comprehensively search for, list down and consider all relevant factors. The rigorously completed checklist will subsequently serve as a thorough guide for decision-making. It would include factors such as the following (a brief code sketch of such a checklist follows the list):

(1) Both salient and less salient events—The checklist should comprehensively include both recent salient events that support the working hypothesis, as well as other events that might be less salient but nonetheless relevant. In doing so, the operational staff would be less reliant on their memories and not be beholden to readily retrievable ones. They would therefore be less susceptible to the availability bias.

(2) Disconfirming evidence and alternative hypotheses—Other than supporting evidence, the checklist should also include disconfirming evidence that does not conform to the favoured hypothesis, as well as alternative hypotheses. By systematically reviewing these factors, the operational staff can mitigate confirmation and disconfirmation biases.
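
To show how such a checklist might be operationalised, here is a brief Python sketch of one represented as structured data, with each item tagged by the bias it is meant to counter. The item wording and fields are illustrative assumptions, not SAF doctrine.

from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    prompt: str
    counters: str        # the bias this item is meant to counter
    completed: bool = False
    notes: str = ""

@dataclass
class AssessmentChecklist:
    items: list = field(default_factory=list)

    def outstanding(self):
        # The staff must clear every item before the assessment is tabled.
        return [item for item in self.items if not item.completed]

checklist = AssessmentChecklist(items=[
    ChecklistItem("List recent, salient events supporting the working hypothesis",
                  counters="availability bias"),
    ChecklistItem("List less salient but relevant events",
                  counters="availability bias"),
    ChecklistItem("Record evidence that does not conform to the favoured hypothesis",
                  counters="confirmation/disconfirmation bias"),
    ChecklistItem("State at least one alternative hypothesis and its evidence",
                  counters="confirmation/disconfirmation bias"),
])

for item in checklist.outstanding():
    print(f"[ ] {item.prompt} (counters: {item.counters})")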

Red Teaming

To overcome cognitive biases, there is a need to consider alternative hypotheses for the information at hand. However, research has demonstrated that people, on their own, are rather inept at generating such alternative hypotheses.31 The SAF Command Post could overcome this by institutionalising red teaming throughout the ‘Orient’ and ‘Decide’ steps. A red team, separate from the main operational staff, can be created to dedicatedly conceptualise viable alternative hypotheses to explain the data and to subsequently propose decision-making options.

President John F. Kennedy employed red teaming during the Cuban Missile Crisis. He split his staff into separate groups and designated his brother as the ‘devil’s advocate’. His advisors were therefore compelled to rigorously critique and defend their own assumptions and to creatively generate alternative responses to the crisis.32

Red teaming is also an approach adopted by established militaries. The United Kingdom’s (UK) Joint Doctrine Publication 5-00, Campaign Planning, lays out the purpose of red teams: “to challenge the perceived norms and assumptions of the commander and his staff.”33 Similarly, the US Army Training and Doctrine Command specially chartered a university to conduct a suite of red teaming courses for its commanders and staff.34

Alternative hypotheses generated by a red team will help prevent the main operational staff from ignoring contrary information (disconfirmation bias) and over-emphasising supporting evidence (confirmation bias). A formidable red team will also serve as a check against excessive confidence among the main operational staff and point out when sunk cost fallacies have been committed.

Games

The SAF has developed an array of simulation systems to create virtual scenarios for training purposes. Such systems, from single-platform simulators to the Multi-Mission Range Complex, have proven effective in enhancing our servicemen’s warfighting competencies. Similarly, simulation technology can be used to train our operational staff’s ability to counteract cognitive biases. For instance, Raytheon funded the design of a decision-making game that might help intelligence analysts detect and mitigate cognitive biases in the course of their work. This game simulates scenarios based on actual situations that US intelligence analysts experienced in Iraq; the scenarios were developed from a compilation of digital documents and reports from theatre.35 Similar simulation technologies can be applied to train our operational staff’s ability to mitigate cognitive biases across the stages of the OODA loop.

Fresh Eyes

To the extent that the operational staff aggregate data for decision-making over a period of time, they can be susceptible to the anchoring bias and to confirmation/disconfirmation bias. To mitigate these biases, the SAF could institutionalise the requirement for an individual who was not involved in the prior information analysis to step into the process at the end of the data collection process (after the ‘Observe’ stage, before ‘Orient’). His role would be to lend a pair of fresh eyes as he reviews all the information at once. We should expect this individual to be free from the anchoring bias, because he would not have an anchoring piece of information to start with. We should also expect him to be free from confirmation/disconfirmation bias, as he would probably not have a preconceived hypothesis and therefore would not apply different thresholds of evidence.36

During the ‘Decide’ stage, another pair of fresh eyes can be engaged to review the findings of the sense-making (‘Orient’) stage. Decision-making will be more objective, as it will similarly be protected against the anchoring and confirmation/disconfirmation biases.


The SAF has previously employed the fresh eyes strategy for project development in peacetime. On some occasions, major projects had stalled for a lack of new ideas—dynamism in thinking had probably been curtailed as the project members developed fixed hypotheses over time, reinforced by a plethora of cognitive biases. By establishing working groups to lend pairs of fresh eyes, the SAF decisively broke such deadlocks. This use of fresh eyes need not be restricted to a peacetime developmental setting; it can also mitigate cognitive biases within the operational staff of the SAF Command Post.

CONCLUSION

In assessing probabilities of costs and benefits, prior to making staff recommendations for options, our operational staff will be susceptible to a slew of cognitive biases. In combination with other contextual factors, cognitive biases have been shown to result in starkly inaccurate assessments and therefore poor military decisions at the strategic and operational levels. Therefore, the SAF needs to be cognisant of these biases and to implement strategies to counteract them.

While not discussed in this essay, it should be noted that cognitive biases also affect tactical decision-making. For instance, a slew of biases could affect a patrolling soldier’s decision on whether a plainclothes man is a terrorist planting an IED or just an innocent civilian. Cognitive biases at the tactical level can similarly be mitigated by the abovementioned strategies. However, this falls beyond the scope of the present essay.

Finally, cognitive biases might not always be to our detriment; taking a leaf from the Cyprus Defence Plan, we can also exploit our adversary’s vulnerability to cognitive biases in our military strategy. The resulting concerted use of strategic and operational deception will bring us closer to attaining Sun Tzu’s ideal of “breaking the enemy’s resistance without fighting.”37

ENDNOTES

1. UK Ministry of Defence, Campaign Planning - Joint Doctrine Publication 5-00, 2nd Edition (Swindon: The Development, Concepts and Doctrine Centre, 2013); US Joint Chiefs of Staff, Joint Publication JP 5-0: Joint Operation Planning (Scotts Valley, CA: CreateSpace Independent Publishing Platform, 2011).

2. Kroll, Mark J., Leslie A. Toombs, and Peter Wright, "Napoleon's Tragic March Home from Moscow: Lessons in Hubris," Academy of Management Executive 14, no. 1 (2000): 117-128.

3. Kahneman, Daniel, Thinking, Fast and Slow (New York, NY: Farrar, Straus and Giroux, 2013).

4. Li, Xiaobing, China's Battle for Korea: The 1951 Spring Offensive (Bloomington, IN: Indiana University Press, 2014).

5. Ibid.

6. Adams, P. A., and J. K. Adams, "Confidence in the Recognition and Reproduction of Words Difficult to Spell," The American Journal of Psychology 73, no. 4 (1960): 544-552.

7. Griffin, Dale, and Amos Tversky, "The Weighing of Evidence and the Determinants of Confidence," Cognitive Psychology 24 (1992): 411-435.

8. Kroll, Toombs and Wright, "Napoleon's Tragic March."

9. Doyle, Sir Arthur Conan, Adventures of Sherlock Holmes, 1892. https://books.google.com.sg/books?id=buc0AAAAMAAJ.

10. Lobe, Jim, "POLITICS-US: Events Echo Pearl Harbour Miscalculations," Inter Press Service News Agency, 2001. http://www.ipsnews.net/2001/09/politics-us-events-echo-pearl-harbour-miscalculations/.

11. Davis, Kenneth C., Don't Know Much About History (New York City, NY: Harper Collins, 2009).

12. Siniver, Asaf, The October 1973 War: Politics, Diplomacy, Legacy (London: Hurst Publishers, 2013).


13. Klein, Gary, Everything that Follows is Different: The Disruptive Power of Insight (New York City, NY: Public Affairs, 2013).

14. The 9/11 Commission Report: Final Report of the National Commission on Terrorist Attacks Upon the United States (Authorized Edition) (New York, NY: W. W. Norton, 2011).

15. Gigerenzer, Gerd, "Out of the Frying Pan into the Fire: Behavioral Reactions to Terrorist Attacks," Risk Analysis 26, no. 2 (2006): 347-351.

16. Sunstein, Cass R., "Terrorism and Probability Neglect," The Journal of Risk and Uncertainty 26, no. 2&3 (2003): 121-136.

17. Heuer, Jr., Richards J., "Strategic Deception and Counterdeception: A Cognitive Process Approach," International Studies Quarterly 25, no. 2 (1981): 294-327.

18. Dempsey, Judy, "Berlin to Pay Afghan Families for Fatal Attack," The New York Times, 2010. http://www.nytimes.com/2010/08/11/world/europe/11iht-germany.html?_r=0.

19. Chandrasekaran, Rajiv, "Decision on Airstrike in Afghanistan Was Based Largely on Sole Informant's Assessment," Washington Post, 2009. http://www.washingtonpost.com/wp-dyn/content/article/2009/09/05/AR2009090502832.html?hpid=topnews.

20. Spiegel Staff, "The End of Innocence in Afghanistan: 'The German Air Strike Has Changed Everything'," Spiegel Online, 2009.

21. "Afghanistan: Background to the Kunduz Airstrike of 4 September 2009," Amnesty International, 2009. https://www.amnesty.org/fr/library/asset/ASA11/015/2009/en/9fe7aa75-884a-4f44-82fc-5c8208116985/asa110152009en.pdf.

22. Spiegel Staff, "End of Innocence."

23. Deeks, Ashley, "Cognitive Biases and Proportionality Decisions: A First Look" (The Hebrew University of Jerusalem, 2014). http://law.huji.ac.il/upload/6_AshleyDeeks_p.pdf.

24. Whiting, Bartlett Jere, Early American Proverbs and Proverbial Phrases.

25. Schwartz, Barry, "Bush Falls Victim to a Bad New Argument for the Iraq War," Slate, 2005. http://www.slate.com/articles/news_and_politics/hey_wait_a_minute/2005/09/the_sunkcost_fallacy.html.

26. Steve Jobs at the launch of the iPad.

27. Amster-Burton, Matthew, "Price Anchoring, Or Why a $499 iPad Seems Inexpensive," Mintlife, 2010. https://www.mint.com/blog/how-to/price-anchoring/?doing_wp_cron=1412508069.3243799209594726562500 (accessed October 5, 2014).

28. Williams, Blair S., "Heuristics and Biases in Military Decision Making," Military Review (2010): 40-52.

29. Definition from the Oxford Dictionary.

30. Guthrie, Chris, Jeffrey J. Rachlinski, and Andrew J. Wistrich, "Blinking on the Bench: How Judges Decide Cases," Cornell Law Review 93, no. 1 (2007): 1-43.

31. Heuer, "Strategic Deception and Counterdeception," 294-327.

32. Rachlinski, Jeffrey J., and Cynthia R. Farina, "Cognitive Psychology and Optimal Government Design," Cornell Law Faculty Publications (2002): 550-607.

33. UK Ministry of Defence, Campaign Planning.

34. United States Army Combined Arms Center, University of Foreign Military and Cultural Studies - Red Teaming, 2013. http://usacac.army.mil/cac2/UFMCS/index.asp.

35. Swayne, Matt, "Games May Help Train Intelligence Analysts to Overcome Bias," Penn State News, 2012. http://news.psu.edu/story/144644/2012/11/13/games-may-help-train-intelligence-analysts-overcome-bias.

36. Deeks, "Cognitive Biases and Proportionality Decisions."

37. Sun Tzu, "Sun Tzu Quotes," http://www.brainyquote.com/quotes/authors/s/sun_tzu.html.


CPT Chen Jingkai is currently Officer Commanding of Lion Company in 42 Singapore Armoured Regiment (SAR). He graduated top of his cohort from the University of Oxford with Congratulatory First Class Honours in Psychology, Philosophy and Physiology. He subsequently graduated with a Master's in Applied Positive Psychology from the University of Pennsylvania. CPT Chen is an Armour Officer by vocation and was previously a Platoon Commander in 42 SAR. He also served as a Staff Officer in the Joint Operations Department.
