
Strategic Instincts, by Dominic Johnson

Among political scientists and policy wonks, it’s widely believed that cognitive biases are not only detrimental, but responsible for some of history’s worst policy blunders.

Whether it’s the Bay of Pigs fiasco, Chernobyl, or the Global Financial Crisis, it’s easy to think of colossal disasters that back this up. But is this really the whole story?

In his new book Strategic Instincts, Dominic Johnson challenges this worldview, arguing that seemingly irrational behaviour can, with the right dosage and in the right context, actually enhance performance in the arena of international politics.

Drawing on insights from the field of evolutionary psychology, Johnson makes the case that cognitive biases act as ‘strategic instincts’, which can provide politicians with a competitive edge in policy making— particularly in high-stakes and highly uncertain military confrontations.

To test his theory, Johnson takes his readers on a tour of recent history, and places a trio of influential cognitive biases— overconfidence, the ‘fundamental attribution error’, and ingroup favouritism— under the microscope.

Fortune favours the bold

Dominic Johnson is a Professor of International Relations at Oxford University. Holding not one but two PhDs (what have you achieved in life?), Johnson has dedicated his academic career to exploring how the latest scientific findings from evolutionary biology are challenging our understanding of global conflict and cooperation.

To start off our intellectual journey, Johnson turns the clock back to the 18th century and places us on the battlefield of the American War of Independence.

By combing the historical and biographical accounts of the Revolutionary War, Johnson argues that the United States benefited in no small measure from a remarkable confidence— arguably overconfidence—that emanated from its charismatic founder, George Washington.

Before delving further, what does Johnson mean when he says George Washington was ‘overconfident’?

A well-established finding from modern psychology is that mentally healthy people are generally overconfident (with men being particularly cocky). What this body of research suggests is that we tend to overestimate our abilities, as well as our control over life events, and to underestimate our vulnerability to risk (although these are technically termed ‘positive illusions’, Johnson uses ‘overconfidence’ as a catch-all term for simplicity’s sake). In sum, we tend to think more highly of ourselves than we really are (‘we’ largely being ‘Westerners’. More on this point later).

Despite the substantial personal and social costs of pervasive positive illusions, Johnson argues overconfidence, in moderate amounts and in the right context, can present a wealth of benefits. For example, those of us who are the most optimistically overconfident reap many benefits in life, including greater health and career success. Similarly, Johnson’s research outlines how overconfidence can give armies an upper hand on the battlefield.

According to Johnson, George Washington’s remarkable overconfidence encouraged him to fight and sustain the revolution against the British— despite the formidable odds stacked against his rebels. In the long and gruelling war in which Americans lost most of the military battles against the Red Coats (and struggled to even keep an army in the field), Johnson argues ambition and boldness paid off handsomely for Washington and his comrades.

As stated by Johnson:

Britain was the world’s leading military and economic power before and after the war. It was by no means yet in decline and was the odds-on favourite to win the war, or at least to end it on favourable terms. I suggest that Washington, in no small part by virtue of his great confidence, was able to turn the tables and seize victory from the jaws of defeat, an achievement epitomised by his daring raid across the Delaware. Joseph Ellis wrote that “the British commander, William Howe, could probably have won the war and ended the American Revolution in November of 1776 with more aggressive tactics. The Delaware crossing thus becomes a sudden reversal of fortune, as if an American mouse, chased hither and yon by a British cat, brazenly turns about and declares itself a lion”. And lions, not mice, win wars.

Washington’s bold crossing of the icy Delaware River on Christmas night, 1776— what many deem the turning point of the war. Emanuel Leutze (1816-1868). Image credit: Metropolitan Museum of Art.

Know your enemy

Next, Johnson revisits the lead-up to World War II, where he probes Britain’s perceptions of Adolf Hitler in the 1930s. Here, Johnson draws on the fundamental attribution error— a psychological quirk that journalist Robert Wright describes as ‘the most underappreciated cognitive bias’— to illustrate the logic of strategic instincts.

For those of us less familiar with the theory, what is the error exactly? The late Lee Ross coined the term ‘the fundamental attribution error’ back in the 1970s, in a scientific paper that has had an enduring impact on social psychology. When explaining the behaviour of other people, Ross and his intellectual descendants found that people generally place too much emphasis on others’ dispositions (that is, on their innate personality and essential ‘nature’). Conversely, people tend to put too little emphasis on the situation when explaining others’ behaviour (that is, on the circumstances people find themselves in, and how these influence what they do).

What does this mean for international politics? Johnson clarifies that the fundamental attribution error doesn’t necessarily make us see other people’s behaviour as threatening, but it can make us see threatening behaviour as intentional. Similarly, Johnson claims that the fundamental attribution error can lead governments to be hypervigilant and assume the worst from other states.

Intriguingly, Johnson makes the case for the fundamental attribution error’s absence in the lead-up to World War II— and points to this as evidence for the cognitive bias’s effectiveness.

To elaborate, Britain’s appeasement of Hitler offers a reverse case, where those in power behaved in contrast to what the fundamental attribution error would predict. At the time, the UK’s Prime Minister Neville Chamberlain strongly resisted attributing dispositional causes to Hitler’s behaviour, and instead emphasised situational causes (where Chamberlain stressed the German desire to redress the restrictions of the Treaty of Versailles, and to retain security over their remaining territory). Despite mounting evidence of Hitler’s real intentions, Chamberlain continued to give Hitler the benefit of the doubt— which Johnson claims led to the disastrous policy of appeasement and the Munich crisis of 1938.

Neville Chamberlain at Heston Airport on his return from Munich after meeting with Hitler, September 1938. Chamberlain read out to the crowd the famous agreement, signed by Hitler, stating “the desire of our two peoples never to go to war with one another again”. Later that day, during a speech outside 10 Downing Street, he declared “peace for our time”. Image credit: Central Press/Getty Images.

Johnson therefore raises an unusual, counterfactual question: where was the fundamental attribution error when the world needed it most? After all, there were actors waiting in the wings whose thinking did in fact align with the fundamental attribution error (not least Winston Churchill, who insisted that Hitler intended to aggressively expand German territory and redouble its military power). Had the fundamental attribution error been stronger and more widespread among Western leaders at the time, Johnson argues that Britain could have stood up to Hitler earlier and more effectively, which as a result, would have avoided the atrocities of World War II (including the Holocaust, the Nazis’ genocidal campaign against the Jews).

United we stand

Johnson also argues that ‘ingroup favouritism’ (also known as the ‘ingroup-outgroup bias’) serves as a strategic instinct.

What is ingroup favouritism, and why does Johnson consider it to be so influential in the world of international politics? As part and parcel of our coalitional psychology, we humans have a powerful tendency to favour our own group and its members, and disparage the ‘outgroup’. This bias is so strong and ubiquitous that ingroup favouritism is essentially the bread and butter of social psychology.

Needless to say, such prejudices can have horrific consequences. Johnson argues that throughout human history, the ingroup bias has contributed to the oppression of minority groups, inflamed ethnic conflict, and has been implicated in genocide.

However, Johnson claims that in smaller doses and in appropriate settings, ingroup favouritism can be relatively benign and ‘highly adaptive’. To elaborate, Johnson argues that ingroup favouritism increases cohesion as well as coordinated action against rival groups, which together can increase a group’s survival and overall effectiveness.

To evaluate these claims, Johnson parachutes us back to the Pacific War between the United States and Japan. Combing through the historical record, Johnson argues that the United States was able to persist and ultimately prevail in the long and brutal Pacific campaign against the Japanese partially due to ingroup favouritism helping bolster the war effort (that is, by increasing cohesion among military personnel, boosting support for the war among Americans at home, and by sealing the commitment of America’s political leaders).

Conversely, Johnson argues that the Japanese took outgroup animosity to extreme levels, which sowed the seeds of their downfall. (The Japanese certainly did not have a monopoly on dehumanisation. However, Johnson argues that Japan’s demonisation of Americans severely distorted their thinking, compromising their estimations of risk and ultimately their military strategy).

American infantrymen secure an area on Bougainville, Solomon Islands, in March 1944, after Japanese forces infiltrated their lines during the night. Image credit: AP Photo.

Overkill: The limits of strategic instincts

In Strategic Instincts, Johnson clearly acknowledges the dark side of these cognitive biases, especially when taken to excess. That is, when overconfidence becomes hubris, when attribution errors manifest as paranoia, and when ingroup favouritism fuels discrimination and racism. As stated by Johnson: “the in-group/ out-group bias can obviously be a serious impediment to cooperation, peace, and equality between different groups, and we must strive to reduce or manage it wherever it has malicious effects.”

Johnson also points out the pitfalls of only psychologising instances where things go wrong. If we want a complete understanding of how cognitive biases impact our decision making, we must count both sides of the ledger. To this end, Johnson argues that an evolutionary perspective offers the crucial next step in incorporating psychological insights into the field of international politics.

One may question Johnson’s emphasis on competition and military conflict, rather than on international cooperation. Notwithstanding the oversight provided by international institutions such as the United Nations, Johnson asserts that the world lacks a global ‘Leviathan’, and is therefore engulfed in competitive anarchy. Despite his Hobbesian worldview, Johnson stresses that global cooperation is possible, and he’s hopeful of humanity’s ability to overcome seemingly insurmountable challenges.

Johnson writes:

We need inspiration for solving the collective action problems at the global level more than ever. Here, the current challenges to the planet are especially daunting and difficult, ranging from expanding populations and dwindling resources to species extinction and climate change. Some inspirational confidence in the face of such challenges could make all the difference.

I think Johnson makes several compelling arguments in Strategic Instincts. That said, there are elements of the book that one could critique. What stood out the most for me is that the latest findings from the field of cultural evolution appear to be missing.

Exhibit A: Although overconfidence is a well-established finding from the field of psychology, cultural evolutionary theorist Michael Muthukrishna and his colleagues have detailed significant cultural differences in how overconfidence manifests.

Much of the literature cited by Johnson has been conducted with North Americans and Western Europeans. However, we now know that these people are really WEIRD. That is, they are Western, educated, industrialised, rich and democratic, which makes them psychological outliers among the world’s diverse human populations. In other words, cognitive biases are not intrinsic human universals, but are prevalent to varying degrees, and manifest in different ways, across human populations.

These points aside, Johnson arguably makes a key contribution not only to the field of international relations, but to psychology itself. For various reasons, seeing cognitive biases as inherently detrimental has become the default assumption of many behavioural scientists. To challenge this prevailing view, Johnson eloquently illustrates how cognitive biases can confer advantages in politics— and what the implications of this are for everyday life.

Clash of the titans

What light can Strategic Instincts shed on global politics today? Johnson argues that the biggest question facing international relations may be rising tensions between the United States and China.

Whilst the United States retains its hegemony in many spheres, Johnson implies that America’s dominance is quickly diminishing. If this trajectory continues, the implications are enormous.

As stated by Johnson:

History tells a gloomy story about rising states. They have rarely risen peacefully, either because they begin a quest for expansion or because other states act to prevent from them doing so [sic], or from acquiring the ability to try. Normally, in such a context of rising tensions, overconfidence would be cited as an outright danger. The United States (or China) is likely to overestimate its own capabilities, exaggerate its own level of control over events, and maintain overoptimistic predictions about the future, all of which would seem to increase the probability of deterrence failure, crisis, and war.

Several intellectuals take comfort in the fact that war has been in steady decline since World War II (what Steven Pinker refers to as the ‘Long Peace’ in his book The Better Angels of Our Nature). However, Johnson clarifies that “while war may be in decline among states, competition is not”.

As the human population bulges and temperatures continue to rise worldwide, Johnson suggests environmental stressors may load the dice and increase the odds of territorial disputes.  

The modern world is infinitely complex, and experts are generally pretty terrible at predicting the future. That said, pundits point to an uncomfortably high chance of a confrontation between the United States and China over Taiwan in the medium-term.  

Beyond territorial expansion, Johnson clarifies that states are vying for strategic advantages in a myriad of ways. A good illustration of this is the new arms race sparked between the United States and China over artificial intelligence (AI).

Whilst evolutionary psychology excavates the remnants of our evolutionary past, AI researchers are taking insights from evolution to build the digital minds of tomorrow. Both countries are now ploughing large sums into artificial intelligence for military purposes, including the development and deployment of autonomous weapons. Whilst AI may prove to be a boon for humanity, experts warn that without proper oversight and regulation, humanity risks ‘losing control’ of AI.

Similarly, Johnson appears to be concerned about increasing technological and social complexity. As the scale and complexity of political interactions continues to increase, Johnson warns that mistakes and misunderstandings may become more difficult to avoid. In a bipolar world where two military superpowers are continuously on red alert, the risk of accidental escalation looms large (with the mass deployment of automated weaponry, one may worry not so much about ‘artificial intelligence’, but about ‘artificial stupidity’).

Despite his sobering analysis, Johnson seems confident (perhaps overconfident?) that the United States’ strategic instincts will save the day:

Even if the disadvantages are genuine and likely to cause mistakes, the advantages could outweigh them, avoiding mistakes in the other direction that would be even more costly. It would be easy for a state to shrink back in the face of a rising power, anxious to avoid conflict and hesitant or unwilling to commit to the bold actions necessary to assert and preserve its position. This was certainly Chamberlain’s problem in the 1930s, and Britain paid the price— losing its empire, bankrupting the nation, and nearly suffering an invasion of its homeland as well. As we have seen, overconfidence can serve to increase ambition, resolve, and perseverance, helping to exploit opportunities, deter enemies, attract allies, and provide a competitive edge in strategic interactions. While drawbacks of overconfidence certainly remain in the mix, all of these advantages could conceivably help the United States consolidate and preserve its position vis-a-vis China in the coming years.

Here’s the 30 trillion-dollar question: will overconfidence help avert a large-scale military conflict between the United States and China, or will it inevitably lead to disaster?

Written by Max Beilby for Darwinian Business.

Strategic Instincts: The Adaptive Advantages of Cognitive Biases in International Politics is published by Princeton University Press. Click here to buy a copy.

Article updated on the 27th July 2021.

The evolutionary logic of overconfidence

People in general are overconfident, excessively optimistic and think of themselves as superior to others.

How can I say such a thing?

Ask anyone with a driving licence to rate their driving abilities, and most will tell you that they are above average. However, this is not just an isolated case of cocky drivers. The same effect is found when asking people to assess their own intelligence, evaluate their attractiveness, or reflect on how kind they are.

A survey of American school students, which received over 1 million responses, found that no less than 70% rated themselves as above average leaders. Conversely, only 2% of these students humbly stated that they are actually below average.

This all raises the question: why do we think more highly of ourselves than we really are?

To explain the prevalence of overconfidence, psychologists have historically pointed to the mental health benefits of positive illusions— such as maintaining self-esteem, or helping reduce anxiety about an uncertain future. In other words, that these biases protect us from threatening information.

However, a more intriguing explanation has been proposed by the evolutionary biologist Robert Trivers.

Robert Trivers is widely known as the ‘bad ass’ of evolutionary biology, for his antics both in and outside of academia. However, this reputation has not diminished his scientific contributions. Steven Pinker has described Trivers as “one of the great thinkers in the history of Western thought”, and Time magazine has named him one of the greatest scientists of the 20th century.

In his book The Folly of Fools, Trivers argues that a glowing view of the self makes others see us in the same light— leading to more cooperative and romantic opportunities in life. This is because self-deception requires no conscious input, which eliminates any telltale signs of lying— ultimately making one more convincing when trying to persuade others. In other words, these positive illusions evolved not to protect us from threatening information per se, but rather to help us persuade others of our superiority.

The logic of the theory is as follows: deception is a fundamental aspect of communication. Those of our ancestors who were better at detecting deception were at an advantage, causing an evolutionary arms race in deception and detection. As a result, self-deception evolved to better mask deception, “hiding the truth from yourself to hide it more deeply from others”.

Trivers argues that we fool ourselves in all realms of life— when overestimating our looks or abilities, when justifying our beliefs, or when convincing ourselves that a lie we’ve told is actually true. Apparently, it’s all part of advancing our own agendas.

Although Trivers formulated his theory back in the 1970s, only recently has it begun receiving empirical support. Fundamentally, as we don’t have access to people’s private beliefs, we can’t say with certainty whether someone is actually lying or deceiving themselves (to the best of my knowledge, we’re yet to develop a mind-reading machine). This makes testing such a hypothesis particularly tricky.

But it isn’t impossible. Answering Trivers’ call to arms, economists Peter Schwardmann and Joël van der Weele have recently published a paper in the journal Nature Human Behaviour that successfully tackles this challenge.

Put your money where your mouth is

So, how do you measure people’s beliefs about their own abilities if you’re unable to read their minds? Peter and Joël’s simple yet ingenious idea was this: get people to bet on their own performance.

To elaborate, Peter and Joël conducted experiments where contestants essentially bet on their performance on an intelligence test. This meant that people had money at stake for accurate assessments of their intellectual performance.
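The paper’s exact payoff rule isn’t reproduced here, but the core idea of ‘betting on yourself’ can be sketched with a standard tool from experimental economics: a quadratic (Brier-style) scoring rule, under which the report that maximises a contestant’s expected winnings is their true belief. The stakes and numbers below are illustrative assumptions, not figures from the study.

```python
# Sketch of incentivised belief elicitation: contestants report the
# probability that they scored in the top half of the intelligence test,
# and are paid via a quadratic (Brier-style) scoring rule. Under this
# rule, honest reporting maximises expected payoff -- so reports work
# like bets on one's own performance.

def payoff(report: float, top_half: bool) -> float:
    """Quadratic scoring rule: 1 minus squared error, scaled to a stake of 10."""
    outcome = 1.0 if top_half else 0.0
    return 10.0 * (1.0 - (outcome - report) ** 2)

def expected_payoff(report: float, true_belief: float) -> float:
    """Expected payoff for a contestant whose genuine belief is true_belief."""
    return (true_belief * payoff(report, True)
            + (1 - true_belief) * payoff(report, False))

# Truth-telling dominates: whatever the contestant genuinely believes,
# reporting that belief beats over- or under-stating it in expectation.
belief = 0.6
reports = [i / 100 for i in range(101)]
best = max(reports, key=lambda r: expected_payoff(r, belief))
print(best)  # 0.6 -- the expected-payoff-maximising report is the true belief
```

The point of such a rule is that any overconfidence it measures is ‘paid for’: inflating your report costs you money in expectation, so inflated reports are evidence of genuinely inflated beliefs rather than cheap talk.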

However, there was a twist.

To see if introducing opportunities for social gain induced self-deception, some of the participants were invited to a speed-dating style interview with a panel of ‘employers’. Here, the contestants were awarded more money if they could convince the employers that they were actually a top performer (whilst still not knowing their results on the intelligence test, which were provided to them at the end of the experiment).

Of course, there are two sides to the coin— are people actually fooled by overconfidence?

As with the contestants completing their initial assessments, the employers were asked to essentially bet on who the top performers on the intelligence test were. This meant that the employers also had money at stake when deciding who actually performed well (in other words, they had every incentive to keep their bullshit detectors well calibrated).

The research team had another trick up their sleeves.

Before the interviews, each contestant received cryptic feedback on their performance on the intelligence test, and was then asked to reevaluate their performance in light of this feedback. This allowed the researchers to see if contestants’ overconfidence influenced the employers’ judgements of them— as the feedback had a direct and measurable impact on the contestants’ confidence levels.

Once the interviews came to a close, employers were given a few seconds in which they could scrutinise contestants’ body language and facial expressions. Employers were asked to write down what each contestant said, and how well they thought each performed on the intelligence test. The employers were also asked to evaluate how honest, likeable, attractive and confident they thought each contestant was.

In total, 688 German students participated in this series of experiments (410 women and 278 men).

Fooling yourself the better to fool others 

So, what did they find?

Peter and Joël found that overconfident contestants received higher evaluations from the employers. In other words, those who privately bet that their performance was relatively high were seen as higher performers on the intelligence test.

The researchers were able to isolate the impact of contestants receiving a noisy yet positive signal before their interviews (that is, cryptic feedback suggesting superior performance). Contestants who received a positive signal subsequently received higher evaluations from employers– even when controlling for their actual test scores. What this means is that overconfidence caused contestants to receive higher ratings from the employers— independent of their actual performance on the intelligence test.
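The identification logic can be made concrete with a toy simulation. Everything below— the effect sizes, the rating rule, the sample size— is invented for illustration; only the design mirrors the study: a randomly assigned noisy signal shifts confidence while true ability is held fixed, so any gap in ratings must flow through confidence rather than performance.

```python
# Toy simulation of the study's identification strategy (all numbers
# are illustrative assumptions, not estimates from the paper).
import random

random.seed(42)

def employer_rating(true_score: float, positive_signal: bool) -> float:
    """Confidence is true ability plus a boost from the randomly assigned
    positive signal; employers observe only displayed confidence plus noise,
    never the underlying test score itself."""
    confidence = true_score + (1.0 if positive_signal else 0.0)
    return confidence + random.gauss(0, 0.5)

# Hold true ability fixed at the same value in both groups and vary only
# the signal: the difference in average ratings then isolates the causal
# effect of confidence on how contestants are judged.
signal_group = [employer_rating(5.0, True) for _ in range(2000)]
control_group = [employer_rating(5.0, False) for _ in range(2000)]

gap = sum(signal_group) / 2000 - sum(control_group) / 2000
print(gap)  # close to 1.0: the injected confidence boost passes through to ratings
```

This is the sense in which the researchers could say overconfidence raised ratings ‘independent of actual performance’: because the signal was assigned at random, it is uncorrelated with test scores by construction.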

Finally, the contestants invited to the speed-dating style interviews were noticeably more confident than those who didn’t participate. That is, people became overconfident when they were given the opportunity to profit from persuading others.

In summary, Peter and Joël found that overconfidence pays. The contestants who were more confident received higher ratings from the employers, and therefore won more money. Likewise, they showed that opportunities to profit from persuasion induced overconfidence, and that it was overconfidence which led contestants to receive higher ratings— above and beyond their actual performance.

Overall, these results provide compelling evidence in favour of Robert Trivers’ theory of self-deception.

Implications

As is always the case, there are aspects of this paper that can be critiqued.

Although overconfidence is a well established finding from the field of psychology, there is evidence suggesting some cultural differences in how overconfidence manifests.

These experiments were conducted with German university students. However, we know that these people are really WEIRD. That is, they are Western, educated, industrialised, rich and democratic, and therefore may not paint a complete picture of humanity overall. Although we can have confidence in these research findings, we don’t want to become overconfident.

To add more weight to the study, other recently published papers and preprints point in the same direction. However, it remains the case that these experiments would benefit from more diverse samples.

These points aside, this paper arguably has far reaching implications for business.

What this strand of research suggests is that positive illusions are far more pervasive than is usually portrayed by psychologists.

Evidently, positive illusions offer individuals a wealth of benefits. Those of us who are the most optimistic and overconfident reap many benefits in life, including more career success. As noted by Daniel Kahneman in Thinking, Fast and Slow, entrepreneurs and business leaders are the most optimistic and overconfident among us.

Positive illusions encourage us to take gambles— gambles which can come with big pay offs. But of course, risky bets can also end in bankruptcy. Overconfidence and excessive optimism can lead financial traders to lose money, CEOs to initiate value-destroying mergers, and lenders to invest in businesses that are built on sand.

The key takeaway for me is this: if positive illusions are not so much a means of protecting us from threatening information, but rather a form of self-deception to help us persuade others, then the problem has been misdiagnosed by psychologists and practitioners alike. Initiatives such as teaching people debiasing techniques as a means of overcoming overconfidence are simply not going to work.

These findings also have obvious implications for recruitment and selection. Indeed, the experiment is inadvertently a parody of the interview process itself.

In light of these findings, it’s not too surprising that interviews are a weak predictor of job performance. This series of experiments marks yet another strike against job interviews as a recruitment method, and lends credence to the practice of blind assessments.

By coincidence, Peter and Joël concluded their paper by discussing the business implications of their research (emphasis added):

One implication of our findings is that overconfidence is likely to be more prevalent in settings in which its strategic value is highest, that is, in cases in which measures of true ability are noisy, competition is fierce and persuasion is an important part of success. It may arise in employer–employee relationships because of its strategic benefits in job interviews and wage negotiations. Arguably, confidence may be even more valuable among the self-employed, whose economic survival often depends more immediately on persuading investors and customers. We would also expect overconfidence to be rife amongst high-level professionals in finance, law and politics.

Peter and Joël’s reflections are likely to resonate with people working in politics or finance. For those of us working in such fields, vigilance is required. For example, when evaluating candidates for leadership positions, it’s important to gauge whether candidates are overconfident. If they appear to be raising the stakes by promising the world, you have to ask yourself: are you really going to win this bet?


Written by Max Beilby for Darwinian Business