In his best-selling book Moneyball, Michael Lewis tells the true story of how the Oakland A’s general manager Billy Beane built a competitive team on a very limited budget by defying the conventional wisdom of baseball experts.
Instead, Beane relied upon an unconventional, data-driven approach to scouting and evaluating players. Despite perennially meagre budgets, in the nineteen years Beane has used data analytics to build teams, the A’s have made the playoffs an impressive nine times.
Beane’s innovative approach has become a game-changer, as many of baseball’s most successful clubs today employ the key principle that serves as the foundation for building winning teams: Trust the evidence of data over the opinions of experts.
The story of Moneyball is intriguing because it calls into question the judgments of experts. After all, if the knowledge of experts in a relatively simple industry such as baseball could be significantly improved by an unconventional data analytical approach, could the same be true for more complex human activities? It was a question that many, if not most, of us had not considered before Lewis popularized this story.
After publication, Lewis was surprised to learn that the ideas in his book weren’t as original as he thought. A chiding review in the New Republic pointed out that Lewis did not seem to be aware of the deeper reason for the inherent flaws in expert judgments: the dynamics of thinking in the human mind.
The workings of human thinking, and how they can produce flawed and sometimes irrational expert judgments, had been described years earlier by a pair of psychologists: Daniel Kahneman, whose groundbreaking work earned a Nobel Prize, and his longtime collaborator Amos Tversky.
This review would inspire Lewis to delve into the story of the work of these two psychologists, which would become the subject of his subsequent book, The Undoing Project.
Dual Thinking Modes
Kahneman and Tversky discovered that the human brain is a paradox. While it is capable of producing highly developed analytical and creative intelligence, it is also prone to make apparently senseless errors. Why is this so?
The answer, according to the psychologists, is that people are nowhere near as rational as they think and are highly susceptible to unconscious biases that influence decision-making to a far greater extent than we realize.
Kahneman and Tversky discovered that people engage in two different thinking modes in their day-to-day lives. They refer to these ways of thinking by the nondescript names System 1 and System 2. System 1 is fast thinking, which operates automatically with little or no effort. It is highly proficient at identifying causal connections between events, sometimes even when there is no empirical basis for the connection.
System 2, on the other hand, is slow thinking and involves deliberate attention to understanding details and the complex web of relationships among various components. Whereas System 1 is inherently intuitive, deterministic, and undoubting, System 2 is rational, probabilistic and highly aware of uncertainty and doubt. Needless to say, these two ways of thinking are contextually very different.
Both System 1 and System 2 thinking are distinctly human capabilities that have provided humanity with an immense evolutionary advantage. We are capable of developing complex intellectual structures such as mathematics, physics, and music via applications of System 2, and, thanks to System 1, humans have the unique capability to make judgments and decisions quickly from limited available information.
However, in employing these two capabilities, Kahneman and Tversky found that, while we may perceive ourselves as predominantly rational System 2 thinkers, the reality is that most human judgments and decisions are based upon the more intuitive System 1.
System 1 thinking, with its inherent mental biases, is what likely guided the judgments of traditional baseball scouts. Similarly, despite their claims that they are following the science, mental biases may be informing the thinking of the public health experts who are influencing the shaping of public policy in the current Covid-19 pandemic.
And if this is so, it raises the question of whether the guidance provided by the experts is another case of humans making confident judgments and decisions from limited available information.
What We Know and Don’t Know
For insight into this question, let’s begin with the data we know and the data we don’t know. We know the number of confirmed cases among those who qualified for one of the limited available tests. We also know the number of Covid-19 hospital admissions, the number of those admissions in intensive care units (ICUs), and the number of deaths associated with the coronavirus.
We are also aware of key trends regarding who is most vulnerable to the invisible enemy. While people of all ages can be infected with Covid-19, those with an underlying health condition, such as diabetes, hypertension, asthma, heart disease, or obesity, are most at risk.
Elderly people who have a chronic illness are at particularly high risk. A very important trend that has emerged is that rarely does anyone die from the virus alone; the vast majority of coronavirus deaths involve comorbidities. This may explain why the young, who have far fewer chronic conditions, are generally not at high risk when infected with the virus.
While this information is very helpful in informing us about the behavior of the infection once it is confirmed, there is a critical data gap that prevents us from accurately gauging the extent of the actual threat of Covid-19 upon the general population. And this missing piece of data is perhaps the most important number needed to shape public policy for effectively combating the invisible enemy: the actual number of people infected.
This number is critical because it is the foundational denominator necessary for calculating accurate rates. Without it, we do not know the true infection rate, the asymptomatic incidence rate, the mild symptom rate, the severe symptom rate, or, most importantly, the true death rate.
In a recent New York Times editorial, professor Louis Kaplow of Harvard University urged decision-makers to take the most important step in filling this data gap: Perform large-scale national testing on a stratified random sample of the U.S. population.
Until we know the actual infection rate, according to Kaplow, “We are flying blind in the fight against Covid-19.” Kaplow makes the point that, “Random population testing is the key to unlocking the mysteries surrounding Covid-19.” It is also the key to enabling a rational System 2 solution to a very complex problem.
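The estimate Kaplow calls for can be sketched in a few lines. In stratified random sampling, a random subset of each population stratum is tested, and each stratum's positivity is weighted by its share of the population. All strata, sample sizes, and positivity figures below are hypothetical, purely to illustrate the mechanics:

```python
# Hypothetical stratified random sample: test a random subset of each
# age stratum, then weight each stratum's positivity by its share of
# the population to estimate the overall infection rate.
strata = {
    # stratum: (population share, number tested, number positive)
    "0-17":  (0.22, 2_000, 30),
    "18-49": (0.42, 2_000, 80),
    "50-64": (0.19, 2_000, 60),
    "65+":   (0.17, 2_000, 50),
}

# Weighted sum of per-stratum positivity rates.
estimate = sum(
    share * (positive / tested)
    for share, tested, positive in strata.values()
)

print(f"Estimated population infection rate: {estimate:.2%}")
```

Because each stratum is sampled at random, the weighted estimate approximates the true population infection rate, the missing denominator the article describes, without testing everyone.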
As long as we make no effort to know the actual infection rate and continue to monitor only the biased confirmed case rate, we are indeed flying blind—a clear sign our decision-makers are informed by System 1.
Without the true denominator, we don’t know whether the death rate is 2.0 per cent or 0.2 per cent, which could make a big difference in how we make policy decisions. In the absence of this key data, there is understandably a natural bias on the part of the public health experts and government leaders to err on the safe side by promoting one-size-fits-all mandates.
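The arithmetic behind that tenfold difference is simple division, and the figures below are hypothetical, chosen only to reproduce the 2.0 versus 0.2 per cent contrast in the text:

```python
# Hypothetical figures: the same number of observed deaths, divided by
# two different denominators.
deaths = 20_000

# Case 1: only confirmed cases are counted (limited testing).
confirmed_cases = 1_000_000
naive_death_rate = deaths / confirmed_cases   # 2.0%

# Case 2: random-sample testing reveals ten times as many actual
# infections, most of them mild or asymptomatic.
actual_infections = 10_000_000
true_death_rate = deaths / actual_infections  # 0.2%

print(f"Death rate using confirmed cases:    {naive_death_rate:.1%}")
print(f"Death rate using actual infections:  {true_death_rate:.1%}")
```

The numerator never changes; only the denominator does, which is why the true number of infections is the single most consequential missing statistic.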
Unfortunately, one of the consequences of mandates is that they create a clash between medical safety and economic safety. But what if we could find a solution that accomplished both? This would likely require a different way of thinking and, almost certainly, a different strategy for combating the invisible enemy.
The New Rules of Engagement
The need to think differently was a lesson learned in a very different arena when General Stanley McChrystal was engaged with a different kind of enemy. Upon taking command of the Joint Special Operations Task Force in Iraq in 2003, McChrystal realized he had a perplexing problem. Despite having the most highly trained and efficient military force in the world and a sizable advantage in numbers and equipment, his troops could not match Al Qaeda’s speed and adaptability.
While the general’s army was organized into a highly efficient centralized hierarchy of thousands of disciplined soldiers, the terrorists were a decentralized network of local units that could move quickly, strike brutally, and hide in plain sight by blending into the local population.
McChrystal quickly recognized that, if his army was going to subdue the military’s version of an invisible enemy, he would have to solve its adaptability problem. Because efficient practices are often barriers to adaptability, the general knew becoming more agile would mean abandoning centuries of conventional wisdom and adopting new rules of engagement. These new rules would be guided by a key principle that emerged as an observation from McChrystal’s leadership team: It takes a network to defeat a network.
In considering how to transform his troops into an adaptable network, McChrystal recognized that he would need to radically change his mental model about how organizations work. The hierarchical model McChrystal was accustomed to assumes organizations are most effective when they are designed to leverage the individual intelligence of the elite few at the top.
Thus, according to McChrystal, hierarchical leaders are trained to think like chess masters using their individual talents and skills to exert control over their opponents. In hierarchies, the power of an organization is a derivative of the skilful exercise of judgment by those in charge.
The network model reflects a very different set of organizing principles that assumes social systems work like organic ecosystems. In networks, power is a derivative of the strength of the connections among all the participants.
Unlike hierarchies that leverage the intelligence of the elite few, effective networks have the capacity to leverage the collective intelligence of the many and create a shared understanding across the organization. This shared understanding provides the wherewithal for participants to autonomously adapt to unpredictable circumstances.
Transforming his organization into a superior network meant that McChrystal had to become a different kind of leader. In his words, he needed to stop playing chess and become a gardener. He needed to shift his focus from moving pieces on a board to shaping an ecosystem.
He accomplished this by structuring his troops into fractals of relatively autonomous teams, with select members from each team participating on intersecting teams to form a highly connected network that enabled a transparent shared understanding across the troops. The leveraging of this shared understanding greatly increased the capacity of the army to add the advantages of speed and adaptability to their existing resource superiority in prevailing over the enemy.
Slipping Into Default Modes
One of the most striking developments of the coronavirus pandemic is how quickly democratic societies morphed, however unwittingly, into authoritarian states. The idea that our entire lives could be completely disrupted by an invisible enemy was something that most of us never saw coming.
While the infectious disease specialists had been warning us for decades that the sudden emergence of a pandemic was inevitable, their admonishments fell on deaf ears. So, it was not surprising that, when the world was suddenly overwhelmed by the exponential growth of deaths from this novel coronavirus, we immediately turned to the public health experts to navigate us through this unwelcome pandemic.
The experts’ guidance was swift and clear. To slow the spread of the virus we needed to put in place a mass application of social distancing. Because human-to-human contact is what drives the exponential growth of infections, the best way to slow the spread, according to the experts, was to drastically reduce human density.
Accordingly, government leaders around the world closed stadiums, arenas, theatres, restaurants, bars, gyms, churches, synagogues, and temples. They instituted population lockdowns to drastically reduce contact among people. In the U.S., the governors of the various states assumed almost dictatorial powers as they issued stay-at-home orders, closed down businesses, put people out of work, and limited basic civil liberties without any regard for constitutional due process rights.
Perhaps what is most amazing—and some might say most alarming—is how readily the vast majority of us have accepted this sudden shift to a form of medical martial law as our governors, guided by a public health elite, thoroughly defaulted to the tools and practices of command-and-control management.
When a whole nation is overtaken by the fear of the unknown, its people and its leaders can easily slip into the default modes of System 1 thinking and its affiliate, command-and-control management. That’s what seems to have happened in the U.S. in response to the Covid-19 pandemic as democratic principles have been supplanted by the dictates of command-and-control authorities.
Unfortunately, despite their good intentions and their vast experience, by slipping into the two default modes, our government leaders and the public health experts have run the risk of doing us more harm than good. Unless every trace of the virus is eradicated, command-and-control strategies are ultimately as ineffective against Covid-19 as General McChrystal found they were against Al Qaeda. Without effective treatments, measures aimed at controlling people are likely no more than temporary stop-gap measures.
As governors and public health experts continue in their valiant efforts to fight the invisible enemy, it might be wise for them to consider following the example of General McChrystal and stop playing chess and become gardeners.
Without treatments or a vaccine, they will never be able to control the virus because Covid-19 is an exponential network. The continued use of a command-and-control management strategy against Covid-19 is likely to fail because it really does take a network to defeat a network.
Framing and Decision Making
McChrystal was able to change the way he exercised leadership because he had the wherewithal to reframe his thinking to adapt to the reality on the ground rather than to refashion reality to fit a prescribed mental model. Likewise, because Billy Beane accepted that he could not change the reality of his available budget, he chose to reframe his thinking to create a game-changing solution for building a baseball team.
For both of these leaders, the ability to think differently made all the difference because, as Kahneman and Tversky discovered in their groundbreaking research, how we frame a situation heavily influences how we decide between alternative courses of action.
The two psychologists applied the label of “framing effects” to what they described as the unjustified influences of formulation on beliefs and preferences. Kahneman and Tversky noticed in their experiments that people did not choose between things; rather, they chose between descriptions of things. Thus, by simply changing the framing (the description of a situation) they could cause people to completely flip their attitude on how to respond to the situation.
For example, in an experiment conducted at the Harvard Medical School, Tversky divided physicians into two groups. Each group was given statistics about the five-year survival rates for the outcomes of two treatments for lung cancer: surgery and radiation.
While the five-year survival rates were clearly higher for those who received surgery, in the short term surgery was riskier than radiation. The two groups were then given two different descriptions of the short-term outcomes and were asked to choose the preferred treatment.
The first group was given the survival rate: The one-month survival rate is 90%. The second group was given the corresponding 10% mortality rate. Although these two descriptions are logically equivalent, 84% of physicians in the first group chose surgery, while the second group was split 50/50 between the two options.
If the preferences were completely rational, the physicians would make the same choice regardless of how the descriptions were framed. However, System 1 thinking is not rational and can be swayed by emotional words. Thus, while 90% survival sounds promising, 10% mortality is shocking. This experiment showed that physicians were just as vulnerable to the framing effect as hospital patients and business school graduates. As Kahneman observed, “Medical training is, evidently, no defence against the power of framing.”
This troubled Kahneman who noted, “It is somewhat worrying that the officials who make decisions that affect everyone’s health can be swayed by such a superficial manipulation—but we must get used to the idea that even important decisions are influenced, if not governed, by System 1.”
Reframing the Coronavirus
When the coronavirus first erupted into our lives, it was immediately framed as a public health crisis, which is why the guiding expertise for handling the pandemic came from the public health community.
But what if we had reframed the problem differently and treated Covid-19 not as a public health crisis, but rather as a social system crisis? This reframing would have almost certainly given greater voice to advisors who were not public health experts.
If the primary advisors were a diversity of contributors that included intensive care medical professionals, social psychologists, sociologists, economists, mental health professionals, small and large business managers, and legal scholars as well as public health experts, there would have been a greater opportunity to formulate a holistic solution to address the many concurrent dimensions of this social system crisis. They almost certainly would have insisted upon performing the random sample that Kaplow suggested.
It is likely that a diversified group would have focused on balancing three goals. The first goal, of course, would be to minimize the number of deaths from Covid-19. A second goal would be to minimize the number of unintended deaths that might result from actions taken to curb the virus. And, finally, a third goal would almost certainly be to maintain social and economic cohesion to the fullest extent possible.
If we had entrusted the crafting of the strategy to solve what is certainly the greatest social system crisis of our lifetimes to a diversified group of contributors, we would have likely not defaulted to System 1 thinking and command-and-control management. We would have been able to avoid the hazards of experts by enabling System 2 thinking to understand all aspects of a complex social problem.
More importantly, our leaders would have had the opportunity to become gardeners rather than chess masters by putting aside one-size-fits-all simplistic mandates and formulating a more intelligent solution that trusts the evidence of data over the opinions of experts and understands that it really does take a network to defeat a network.