Here are the cognitive biases I encounter most frequently in my line of work, or that have the greatest impact on it even if I don’t bump into them that often.
Cognitive biases are sometimes hyped to the point of meaninglessness in the business media and blogosphere. Veteran marketing researchers will note that Daniel Kahneman and Amos Tversky, two academics frequently mentioned in connection with the term and with behavioral economics, began publishing on this topic in the early ‘70s. Moreover, those raised in Western nations may remember when “rationalization” and “cognitive dissonance” were colloquial terms for many of these same biases.
In short, much of this is old news. Anyone who has taken a psychology course knows that humans are not computers, and even a superficial knowledge of human history makes it clear that logic and evidence have never been the favorite tools of decision-makers. How many of the biases listed below would have come as a surprise to James Madison, William Shakespeare, Julius Caesar, or Alexander the Great?
I believe it’s good to be reminded of the flaws in our reasoning. However, we should bear in mind that many of these biases are not inherently irrational but, rather, are decision heuristics that are part of our evolutionary heritage. In a sense, they’ve stood the test of time. Even mammoth business decisions in the modern world are often made largely on the basis of gut feel, individual or collective, though they may appear “scientific” after the fact through the clever use of financial accounting…and because of cognitive biases.
Anchoring: The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).
Automation bias: The tendency to depend excessively on automated systems, which can lead to erroneous automated information overriding correct decisions.
Availability heuristic: The tendency to overestimate the likelihood of events with greater “availability” in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.
Bandwagon effect: The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.
Belief bias: An effect where someone’s evaluation of the logical strength of an argument is biased by the believability of the conclusion.
Confirmation bias: The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.
Courtesy bias: The tendency to give an opinion that is more socially correct than one’s true opinion, so as to avoid offending anyone.
Dunning-Kruger effect: The tendency for unskilled individuals to overestimate their own ability and the tendency for experts to underestimate their own ability.
Experimenter’s bias: The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.
Focusing effect: The tendency to place too much importance on one aspect of an event.
Framing effect: Drawing different conclusions from the same information, depending on how that information is presented.
Functional fixedness: Limits a person to using an object only in the way it is traditionally used.
Hindsight bias: Sometimes called the “I-knew-it-all-along” effect, the tendency to see past events as being predictable at the time those events happened.
Illusory correlation: Inaccurately perceiving a relationship between two unrelated events.
Insensitivity to sample size: The tendency to expect too little variation in small samples.
Law of the instrument: An over-reliance on a familiar tool or method, ignoring or under-valuing alternative approaches. “If all you have is a hammer, everything looks like a nail.”
Mere exposure effect: The tendency to express undue liking for things merely because of familiarity with them.
Neglect of probability: The tendency to completely disregard probability when making a decision under uncertainty.
Observer-expectancy effect: When a researcher expects a given result and therefore unconsciously manipulates an experiment or misinterprets data in order to find it.
Optimism bias: The tendency to be over-optimistic, overestimating favorable and pleasing outcomes.
Outcome bias: The tendency to judge a decision by its eventual outcome instead of based on the quality of the decision at the time it was made.
Planning fallacy: The tendency to underestimate task-completion times.
Pro-innovation bias: The tendency toward excessive optimism about an invention or innovation’s usefulness throughout society, while often failing to identify its limitations and weaknesses.
Recency illusion: The illusion that a word or language usage is a recent innovation when it is in fact long-established.
Selective perception: The tendency for expectations to affect perception.
Social desirability bias: The tendency to over-report socially desirable characteristics or behaviors in oneself and under-report socially undesirable characteristics or behaviors.
Status quo bias: The tendency to prefer that things stay relatively the same.
Stereotyping: Expecting a member of a group to have certain characteristics without having actual information about that individual.
Subjective validation: The perception that something is true because a subject’s belief demands it to be true; also the tendency to perceive connections between unrelated coincidences.
Survivorship bias: Concentrating on the people or things that “survived” some process and inadvertently overlooking those that didn’t because of their lack of visibility.
Third-person effect: The belief that mass-communicated media messages have a greater effect on others than on oneself.
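One of the biases above, insensitivity to sample size, is easy to see for yourself in a quick simulation. The sketch below (in Python; the function name, thresholds, and sample sizes are my own choices for illustration, not from any source) flips a fair coin and measures how often a sample looks lopsided, for small versus large samples:

```python
import random

random.seed(42)

def extreme_share(sample_size, trials=10_000, threshold=0.6):
    """Fraction of samples of a fair coin whose observed proportion
    of heads exceeds the threshold."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        if heads / sample_size > threshold:
            extreme += 1
    return extreme / trials

# Small samples routinely look lopsided; large samples almost never do.
small = extreme_share(10)
large = extreme_share(1000)
print(f"share of samples with >60% heads: n=10 -> {small:.3f}, n=1000 -> {large:.3f}")
```

Roughly one in six samples of 10 flips shows more than 60% heads, while samples of 1,000 flips essentially never do. That is exactly the variation we tend to “under-expect” when judging small samples, survey cells, or pilot tests.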
Any of these biases may rear their heads at any time, and they need not be related to consumer behavior or marketing itself to make their presence felt. Decision-makers are human. Understanding these biases, as well as being aware of our own cognitive shortcomings, can help us respond effectively or, with a bit of experience, take precautions that reduce their impact.
There are many other biases listed in the Wikipedia entry I linked in the opening paragraph that may be more pertinent to what you do. There are also the best-selling books Thinking, Fast and Slow by Daniel Kahneman and Predictably Irrational by Dan Ariely, and many more books and articles by leading scholars. Psychologist Gerd Gigerenzer takes a somewhat different tack but is also an excellent source on how people make decisions.
David McCaughan and I had a very enlightening discussion with Bri Williams, who has authored several books about Behavioral Economics, on our audio podcast series MR Realities. If you’d like to listen, click on Where Behavioral Economics Fits in the Customer Insight Landscape. Terry Grapentine also has written extensively about decision making and joined Dave and me for a fascinating discussion, Thinking Mistakes Marketers Make.
I hope you’ve found this interesting and helpful!