01 December 2014

On Hating Surprises. How Organizations Can Minimize Uncertainty

Estimated reading time: 4 minutes

In 2000, scientists announced a major advance in gene sequencing technology. It was seen as a big step towards new gene-targeted drugs that could cure a host of diseases. At the time, I ran a multi-billion-dollar portfolio at one of the larger global asset managers. The firm immediately called a special meeting of portfolio managers and healthcare analysts to identify the investment implications. In the hours that followed, we hammered out a vision of a promising healthcare future and its many investment opportunities.

Most of our predictions turned out to be wrong

The slew of new drugs tailored to fix the genetic causes of disease never arrived.  There was something we failed to take into account. This same thing led to ISIS surprising intelligence analysts in 2014 and subprime losses surprising bond traders in 2007. In fact it affects forecast after forecast, from Ebola contagion rates to European sovereign debt losses.  If we could get a handle on this forecast marauder, our organizations would be better off.  Mainly, we would get surprised less. Everyone in the prediction business, and that is everyone from a bond trader to an intelligence analyst to a cell biologist, hates surprises.

Surprises, though, do not equate to “Black Swans”. Unfortunately, that term has been twisted into a generic excuse for being caught unawares. “It was a Black Swan! Did you really think I could predict it?” No, our forecast marauder was not a Black Swan. As scientists learned more about the genome, they discovered few simple gene-disease relationships. Instead, a number of “regulator” genes trigger other genes, and they, in turn, produce cell responses that can lead to illness. Picture a spaghetti-strand network of complex connections between different genes and diseases. This was not the simple genome that drug researchers expected to find.

 In short, our predictions were done in by complexity

Most of what we call uncertainty arises from similar, complex interactions among the parts of a system. Whether the system is political, economic or biological, the same rule applies:

Complexity manufactures uncertainty

How do Black Swans relate to complexity? The metaphor describes the moment when Aussie settlers came across a previously unseen black swan, thus destroying the prediction that all swans are white. This is an example of a type of uncertainty called incomplete information. Non-white swans existed all along; we just didn’t know it. The point is, we usually get surprised not by what exists unbeknownst to us, but by what emerges from complex interactions.

Fortunately, scientists have been busy trying to understand complexity. Their body of work is known as Complex Adaptive Systems. It strings together fields as diverse as network science, neuroscience, ecology, economics, medicine and urban planning. The common strand is the existence of interacting parts producing surprising results.

Complexity theory is not entirely new. In fact, in some ways, it is unfashionably old. Economists like Keynes and Hayek were preoccupied with complexity and uncertainty. Their experience with war and depression gave them a healthy concern for the unknown. Today’s economists have enjoyed far more stability. As a result, they are comfortable theorizing that we have perfect rationality and information in our decision-making. There is little room in their simple, clockwork universe for uncertainty spawned by complexity.

This “clockwork” economics mindset underpins much of our organizational decision-making. As managers and analysts, we are trained to act as rational agents charged with calculating payoffs and probabilities. Our predictions might be wrong, but the probability of error and the range of outcomes are known: we call it “risk”. Intelligence assessments, fiscal budgets, strategic plans: these all follow the same economic logic of predicting payoffs and managing risk. Where in the process do they manage uncertainty? Sometimes they list possible “low probability events”. This is known as scenario planning. The problem is that there is often no exploration of the connections between possible events, of the chains of complex cause and effect. The scenario list is an add-on to the rational-actor process, an afterthought often left forgotten.
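
To make the risk-versus-uncertainty distinction concrete, here is a minimal sketch in Python. The scenarios, probabilities and returns are all invented for illustration; the point is only that “risk” assumes the payoff table itself is knowable in advance.

    # Toy "risk" calculation: assumes every outcome and its probability
    # are known up front. All numbers below are hypothetical.
    scenarios = {
        "base case": (0.70, 0.05),   # (probability, portfolio return)
        "recession": (0.25, -0.10),
        "crisis":    (0.05, -0.30),
    }

    # With a known distribution, expected payoff is simple arithmetic.
    expected_return = sum(p * r for p, r in scenarios.values())
    print(f"Expected return: {expected_return:.1%}")   # prints -0.5%

    # Complexity-driven uncertainty breaks the premise: outcomes emerge
    # from interactions, so neither the list of scenarios nor their
    # probabilities can be written down in advance.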

A complex systems mindset can do better than that. The reason is that complexity tends to produce recurrent patterns. We don’t know exactly what form the pattern will take in its next iteration, but we can grasp its general outline. An example:

ISIS surprised the intelligence community. The group stormed out of its Syrian base to quickly capture a third of Iraq. The complex systems concept of evolutionary adaptation would have been helpful in predicting its rise. Ecosystems offer their inhabitants a series of niches. Tall trees, for instance, offer abundant food for animals that can adapt by evolving longer necks. The Middle East landscape offered such a niche to Islamist militants: the “tall trees” were Sunni towns chafing under majority-Shia rule. The “long neck” was openly taking territory instead of engaging in stealth terrorist attacks. A successful territory-taker would enjoy a feedback loop: the first conquest would attract followers, enabling yet more conquests and attracting more followers. The key question to ask was, “Which militant group is most likely to adopt the territory-taking adaptation?” Posing it would have suggested a different surveillance and intelligence approach, one with a niche and its implied strategy in mind.
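
As a rough illustration of that feedback loop, consider a toy simulation in Python. Every parameter is hypothetical; what matters is the shape of the dynamic: a slow start followed by runaway growth once conquests and recruitment begin feeding each other.

    # Toy model of the conquest/recruitment feedback loop described above.
    # All numbers are invented for illustration, not calibrated to anything.
    followers = 1_000            # hypothetical initial strength
    towns = 1                    # the first visible territorial conquest

    for month in range(1, 13):
        new_towns = max(1, followers // 1_000)   # capacity grows with manpower
        towns += new_towns
        followers += 800 * new_towns             # each conquest draws recruits
        print(f"month {month:2d}: towns={towns:4d}  followers={followers:,}")

    # The exact numbers are unknowable in advance; the recurrent pattern
    # (gradual accumulation tipping into explosive growth) is not.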

In short, an ecological approach, one that recognized a complex system pattern, would have improved monitoring of groups like ISIS. This is just one example, but there are many more, in fields ranging from finance to public policy. In applying these patterns, the connection between cause and effect will not be obvious. The job of complex systems thinking is to uncover the non-obvious connections. This is a lot harder, but not nearly as hard as dealing with the next surprise.

Diego Espinosa
