16 Critical Cognitive Biases (Plus Key Academic Research)

November 12, 2021  |  By: Irrational Labs

Wikipedia lists over 200 common cognitive biases (aka psychologies). In our behavioral science training courses, we explain how leveraging cognitive biases can help you solve your most pressing personal and professional challenges. For a CliffsNotes version, our team has selected the following “Sweet Sixteen” as a high-value list worth understanding.

The 16 Critical Cognitive Biases (Plus Key Academic Research)

Perceived Costs and Benefits
1. Present Bias
2. Incentives
3. Reward Substitution
4. Goal Gradients

Attention and Effort
5. Cognitive Overload
6. Limited Attention
7. Status Quo Bias

Cognitive Heuristics
8. Mental Models
9. Optimism Bias
10. Availability Bias

Risk and Uncertainty
11. Loss Aversion
12. Endowment Effect

Choice Architecture
13. Information Aversion
14. Anchoring Effect
15. Decoy Effect

Norms and Influence
16. Social Norms

NOTE: Some of the studies below are readily available online (Pro Tip: search for the article title and add “PDF” to your search). Others may be behind a paywall or available only through libraries.

Perceived Costs and Benefits

Present Bias

Explanation: We live in the here and now, so it makes sense that we’d value the present over the future. This tendency, called present bias, is often modeled as hyperbolic discounting. It explains why people put an unrealistically high value on what happens in the present and an unrealistically low value on the future. We often choose smaller, immediate rewards even at the expense of larger, more significant rewards that arrive later. As a result, people tend to have a hard time delaying gratification: it’s difficult to choose something that’s good for our future selves at the expense of what feels good right now.

Example: A very common example of present bias is the preference to sleep in for just a few extra minutes instead of waking up to exercise, even if we’ve just had a good night’s sleep. Even though the physical and mental benefits of exercise should, rationally, outweigh the minute benefits of an extra thirty minutes of snoozing, it’s often too hard for us to convince our present (sleepy) selves.
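To make the discounting idea concrete, here is a minimal sketch of hyperbolic versus exponential discounting. The function names and parameter values (the discount rate k and the daily discount factor) are illustrative assumptions, not figures from any cited study; the point is simply that a hyperbolic discounter’s preference can flip as both options move further into the future.

```python
# Minimal sketch of present bias via hyperbolic discounting.
# Parameter values are illustrative, not taken from any cited study.

def exponential_value(amount, delay_days, daily_factor=0.999):
    """Standard exponential discounting: value shrinks by a constant factor per day."""
    return amount * (daily_factor ** delay_days)

def hyperbolic_value(amount, delay_days, k=0.05):
    """Hyperbolic discounting: V = A / (1 + k * D). Near-term delays sting the most."""
    return amount / (1 + k * delay_days)

# Choice: $50 right now vs. $100 in 30 days.
print(hyperbolic_value(50, 0), hyperbolic_value(100, 30))    # 50.0 vs 40.0 -> grab the $50 now
print(exponential_value(50, 0), exponential_value(100, 30))  # 50.0 vs ~97.0 -> an exponential discounter waits

# Push both options a year into the future and the hyperbolic preference reverses:
print(hyperbolic_value(50, 365), hyperbolic_value(100, 395))  # ~2.6 vs ~4.8 -> now waiting looks better
```

That preference reversal (patient about far-off tradeoffs, impatient about near-term ones) is the signature of present bias.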

Incentives

Explanation: Incentives are rewards used to motivate behavior. Incentives may be either extrinsic (a physical or monetary reward) or intrinsic (an internal goal). In contrast to traditional economic theory, behavioral economics posits that small extrinsic and intrinsic incentives can have a disproportionate impact on behavior, especially in the face of present bias. Incentives can take the form of temptation bundling (described below under reward substitution); extrinsic rewards, including direct monetary incentives and lotteries; or intrinsic rewards, such as those that leverage one’s sense of pride or mastery. Incentives can be an effective way to jumpstart behavior change and are often most effective when paired with long-term behavior change strategies.

Example: Imagine a behavioral designer starting a program to encourage people to stop smoking. The designer might use extrinsic incentives, such as offering people a small amount of money for each day they don’t smoke or entering them into a lottery with a chance of winning prize money. They might also leverage loss aversion in the incentive program: offering people a set sum of money upon completion of the program but deducting money each time a person smokes. Or they might use intrinsic rewards: giving people an opportunity to track their progress against desired milestones and encouraging them to reflect on that progress.

Reward Substitution

Explanation: Reward substitution is a behavioral mechanism used to spur action—people can be motivated to work toward a distant, long-term goal by being rewarded for an associated, immediate behavior. Reward substitution is an effective mechanism for overcoming present bias; by experiencing an immediate reward, we are encouraged to take actions which might feel painful or costly in the moment but have future benefits. 

Example: One form of reward substitution is known as temptation bundling. For example, in order to motivate ourselves to complete a presently unpleasant task (like running on a treadmill), we can reward ourselves with something fun (like watching Netflix while running on the treadmill). 

Key Academic Research:

  • Milkman, K. L., Minson, J. A., & Volpp, K. G. (2014). Holding the Hunger Games Hostage at the Gym: An Evaluation of Temptation Bundling. Management Science, 60(2), 283–299. https://doi.org/10.1287/mnsc.2013.1784
  • Kirgios, E.L., Mandel, G.H., Park, Y., Milkman, K.L., Gromet, D.M., Kay, J.S., & Duckworth, A.L. (2020). Teaching temptation bundling to boost exercise: A field experiment. Organizational Behavior and Human Decision Processes, 161, 20-35. https://doi.org/10.1016/j.obhdp.2020.09.003
  • Chance, Z., Gorlin, M., & Dhar, R.K. (2014). Why Choosing Healthy Foods is Hard, and How to Help: Presenting the 4Ps Framework for Behavior Change. Customer Needs and Solutions, 1, 253-262. https://doi.org/10.1007/s40547-014-0025-9

Goal Gradients

Explanation: People work harder and accelerate their behavior as they get closer to reaching a goal. They also favor seeing goals broken down into small steps, because small steps provide a sense of tangible progress, and goal gradients allow for near misses, where getting close still provides some pleasure. Endowing progress toward a goal (for example, by showing customers that they have already completed a first step) can motivate further action. Using goal gradients in design is one way of providing intrinsic incentives.

Example: Language-learning apps often use goal gradients by showing learners how many lessons they’ve completed so far and how many they have left to go.
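To illustrate the endowed-progress idea mentioned above, here is a minimal sketch. The card designs, numbers, and the perceived_progress helper are hypothetical, chosen only to show how pre-crediting progress changes the fraction-complete framing even when the remaining effort is identical; the loyalty-card field study by Nunes & Drèze cited below tested this kind of manipulation.

```python
# Minimal sketch of endowed progress: identical remaining effort, different perceived progress.
# The card designs and numbers are hypothetical, for illustration only.

def perceived_progress(stamps_earned, stamps_needed):
    """Fraction of the goal that feels already completed."""
    return stamps_earned / stamps_needed

# Card A: 10 stamps required, none pre-filled.
# Card B: 12 stamps required, 2 pre-filled "for free" at signup.
# Both cards need 10 more purchases, but they feel different at the start:
print(perceived_progress(0, 10))   # 0.00  -> "I haven't even started"
print(perceived_progress(2, 12))   # ~0.17 -> "I'm already a sixth of the way there"
```

Goal-gradient effects then compound this head start: as the remaining fraction shrinks, completion rates tend to accelerate.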

Key Academic Research:

  • Hull, C. L. (1932). The goal-gradient hypothesis and maze learning. Psychological Review, 39(1), 25–43. https://doi.org/10.1037/h0072640
  • Cryder, C. E., Loewenstein, G., & Seltman, H. (2013). Goal gradient in helping behavior. Journal of Experimental Social Psychology, 49(6), 1078–1083. https://doi.org/10.1016/j.jesp.2013.07.003
  • Kivetz, R., Urminsky, O., & Zheng, Y. (2006). The goal-gradient hypothesis resurrected: Purchase acceleration, illusionary goal progress, and customer retention. Journal of Marketing Research, 43(1), 39–58. https://doi.org/10.1509/jmkr.43.1.39
  • Nunes, J. C., & Drèze, X. (2006). The endowed progress effect: How artificial advancement increases effort. Journal of Consumer Research, 32(4), 504–512. https://doi.org/10.1086/500480

Attention and Effort

Cognitive Overload

Explanation: People only have a limited amount of attention and mental bandwidth to devote to decision-making at any given moment. When given too many options or when faced with a complex or uncertain choice, we face what is known as cognitive overload. In these situations, we often procrastinate or opt out of taking action. We’re hard-wired to make decisions in ways that limit the amount of cognitive effort required—to take the easiest option available to us, which is often to take no action at all. Cognitive overload can occur at any time when we are faced with multiple, complex choices but is especially prevalent in contexts of stress, scarcity, and uncertainty.

Example: One example of cognitive overload we’ve seen in our own work is in the experience of students who are eligible for federal financial aid for higher education. Current and aspiring students are often stressed and overwhelmed by the multitude of decisions they must make at the start of each term, from what classes to take to where to live. Applying for federal aid via the FAFSA (Free Application for Federal Student Aid) requires not only filling out complex forms but also making an active decision about whether one is eligible for aid and whether to apply at all. Often, due to cognitive overload, students who are eligible for significant financial aid benefits simply don’t fill out the required forms.

Limited Attention

Explanation: Our attentional resources are extremely limited. Related to cognitive overload, people can only focus on a limited number of things at a time, which means important details can be missed. Limited attention can lead to tunneling: when our focus is on a specific task or concern, we develop “tunnel vision” and neglect details beyond what we are focused on, or anything “outside the tunnel.” In a famous example, study participants asked to count the number of times a ball is passed in a short video often fail to notice the person who walks through the scene in a gorilla costume.

Example: There are numerous examples of limited attention in everyday life. People are famously bad at multitasking: drivers who are using their phones are more likely to get into car accidents, and doctors who are focused on a particular symptom reported by a patient may miss other important signs and symptoms. Our limited attention also prevents us from completing intended behaviors simply because we have forgotten; simple, timely reminders have been effective at driving numerous behaviors, from medication adherence to savings deposits.

Status Quo Bias

Explanation: People have a natural bias toward the present state of affairs: among a set of options, we prefer to stick with the status quo or choose whatever happens most automatically. We avoid taking action whenever possible and tend to go with the default option.

Example: Status quo bias can be seen in all kinds of situations. One common example is global organ donation rates. While the majority of people in most North American and European countries approve of voluntary organ donation following fatal accidents, actual donation rates vary drastically by country. Research has shown that a primary driver of these differences is whether countries offer “opt-in” or “opt-out” organ donation programs. People who have to take even minimal steps to manually opt in to an organ donation program often neglect to do so, even though they are just as likely to express willingness to donate as people in opt-out countries.

Cognitive Heuristics

Mental Models

Explanation: A mental model describes a person’s conceptualization of how something works, or, in other words, their understanding of the surrounding world. We hold mental models of products, services, processes, institutions, and even other people. The mental models that we each hold shape our expectations, actions and social behaviors.

Example: One classic example of a mental model shaping behavior comes from our understanding of disease in the 1800s. At the time, people held the mental model that disease was caused by foul-smelling air, a reasonable theory given that death and decay smelled pretty horrible. With this mental model in mind, people attempted to avoid bad smells, rotting vegetation, and people they deemed sickly; they also attempted to remove disease by dumping waste (including human waste) into rivers, thus removing the bad smell. This mental model was devastatingly incorrect, however, and the waste dumping increased the spread of waterborne diseases, including cholera.

Creating a mental model can be an effective way to set people’s expectations for how they should relate to a product, experience, or opportunity. Product managers should be deliberate about the mental model of their product that they establish on day one of a user’s experience.

Key Academic Research:

  • Johnson-Laird P. N. (2010). Mental models and human reasoning. Proceedings of the National Academy of Sciences of the United States of America, 107(43), 18243–18250. https://doi.org/10.1073/pnas.1012933107
  • Mathieu, J. E., Heffner, T. S., Goodwin, G. F., Salas, E., & Cannon-Bowers, J. A. (2000). The influence of shared mental models on team process and performance. Journal of Applied Psychology, 85(2), 273–283. https://doi.org/10.1037/0021-9010.85.2.273
  • Vink, J., Edvardsson, B., Wetter-Edman, K., & Tronvoll, B. (2019). Reshaping mental models – enabling innovation through service design. Journal of Service Management, 30, 75-104.
  • Liu, S., & Lin, H. (2015). Exploring Undergraduate Students’ Mental Models of the Environment: Are They Related to Environmental Affect and Behavior? The Journal of Environmental Education, 46(1), 23–40. https://doi.org/10.1080/00958964.2014.953021

Optimism Bias

Explanation: On the whole, people display a remarkable optimism bias—we overestimate the likelihood of positive events and underestimate the likelihood of negative events. This optimism is a form of overconfidence; we assume a high likelihood of success in the job market and in romantic relationships, and we fail to anticipate the possibility of negative outcomes like being diagnosed with cancer, getting in a car accident, or suffering a divorce.

Example: A common manifestation of optimism bias is people’s tendency to neglect preventive action against negative life events, for example failing to buy health or home insurance or failing to take preventive health measures.

Availability Bias

Explanation: Things that come to mind more easily (that are more “available”) are perceived as occurring more frequently. Things become more available in our memories if they are salient, familiar, repeated, or recent. This cognitive heuristic is known as availability bias, and it can skew our perception of risk: we overestimate the risk of events that have occurred recently or that stand out as especially salient.

Example: In one specific example of availability bias, known as the hurricane effect, people tend to purchase comprehensive home or flood insurance immediately after a hurricane, when the incident is fresh in their minds. If another hurricane does not occur within the first year of the policy, people often overreact to this perceived lack of risk and dramatically reduce their coverage.

Key Academic Research:

  • Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. https://doi.org/10.1016/0010-0285(73)90033-9
  • Pachur, T., Hertwig, R., & Steinmann, F. (2012). How do people judge risks: Availability heuristic, affect heuristic, or both? Journal of Experimental Psychology: Applied, 18(3), 314–330. https://doi.org/10.1037/a0028279
  • Kliger, D., & Kudryavtsev, A.A. (2010). The Availability Heuristic and Investors’ Reaction to Company-Specific Events. Journal of Behavioral Finance, 11, 50 – 65.

Risk and Uncertainty

Loss Aversion

Explanation: Loss aversion refers to the fact that people are more sensitive to the prospect of a loss than to a gain of equal value. As a result, we react more strongly to losses than to gains, and we work harder to prevent losses than to achieve equivalent gains. Framing decisions in terms of what one stands to lose can therefore encourage people to take action.

Example: Loss aversion means we feel losses more intensely than equivalent gains; losing a $100 bill feels much worse than finding one feels good. Behavioral interventions to encourage medication adherence have leveraged loss aversion effectively. One solution asks people to make a deposit that they forfeit if they do not adhere to their medication; this has proven more effective than offering an equivalent reward.
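For readers who want the idea in quantitative form, below is a minimal sketch of a prospect-theory-style value function in the spirit of the Kahneman and Tversky work cited below. The curvature parameter alpha and loss-aversion coefficient lambda are commonly quoted estimates used here purely for illustration; real estimates vary across people and studies.

```python
# Minimal sketch of a prospect-theory-style value function.
# alpha (diminishing sensitivity) and lam (loss aversion) are illustrative estimates.

def subjective_value(x, alpha=0.88, lam=2.25):
    """v(x) = x**alpha for gains; v(x) = -lam * (-x)**alpha for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

print(subjective_value(100))    # ~57.5   -> the pleasure of finding $100
print(subjective_value(-100))   # ~-129.4 -> the pain of losing $100, roughly twice as intense
```

The asymmetry around zero is what the deposit-contract design exploits: money already handed over is coded as a potential loss, which motivates adherence more than the same amount framed as a reward.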

Key Academic Research:

  • Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47, 263-291. https://doi.org/10.2307/1914185
  • Wang, M., Rieger, M. O., & Hens, T. (2017). The impact of culture on loss aversion. Journal of Behavioral Decision Making, 30(2), 270-281. https://doi.org/10.1002/bdm.1941
  • Moshinsky, A., & Bar-Hillel, M. (2010). Loss aversion and status quo label bias. Social Cognition, 28(2), 191–204. https://doi.org/10.1521/soco.2010.28.2.191
  • Riegel, B., Stephens-Shields, A., Jaskowiak-Barr, A., Daus, M., & Kimmel, S. E. (2020). A behavioral economics-based telehealth intervention to improve aspirin adherence following hospitalization for acute coronary syndrome. Pharmacoepidemiology and Drug Safety, 29(5), 513–517. https://doi.org/10.1002/pds.4988

Endowment Effect

Explanation: The endowment effect describes people’s tendency to overvalue what they already own, regardless of market value, compared with alternatives. When asked to sell an item, a person focuses heavily on the value of owning it, and thus on the cost of surrendering it, while buyers focus primarily on the opportunity cost of acquiring it.

Example: In a classic experiment of the endowment effect, two groups of students were given equally priced items (coffee mugs or bars of chocolate) and were offered the opportunity to exchange these items. The vast majority of students in each group opted to keep their own item. A third (control) group of students was allowed to select which item they’d like, with approximately 50% of this group preferring each item, as one would expect. The endowment effect is also a primary driver behind companies sending customers free items, with the option to return them—most customers never do.

Key Academic Research:

  • Kahneman, D., Knetsch, J., & Thaler, R. (2000). Experimental Tests of the Endowment Effect and the Coase Theorem. In C. Sunstein (Ed.), Behavioral Law and Economics (Cambridge Series on Judgment and Decision Making, pp. 211-231). Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9781139175197.009
  • Carmon, Z., & Ariely, D. (2000). Focusing on the forgone: How value can appear so different to buyers and sellers. Journal of Consumer Research, 27(3), 360–370. https://doi.org/10.1086/317590
  • Tom, G., Lopez, S., & Demir, K.D. (2006). A comparison of the effect of retail purchase and direct marketing on the endowment effect. Psychology & Marketing, 23, 1-10. http://doi.org/10.1002/MAR.20107

Choice Architecture

Information Aversion

Explanation: People sometimes avoid potentially useful information that is costless to acquire out of fear that the information will be negative. We often choose to remain ignorant because we’re afraid of disappointment or bad news. Information aversion is sometimes called the ostrich effect, since this tendency to willfully turn away from helpful information evokes the image of an ostrich sticking its head in the sand.

Example: Information aversion is seen in numerous scenarios, but common examples include people’s active avoidance of medical tests or physical check-ups due to fear of a poor diagnosis, as well as people’s avoidance of financial information that will confirm their currently poor financial status. We’re much less likely to check our credit scores when we know that we won’t like what we see.

Key Academic Research:

  • Karlsson, N., Loewenstein, G., & Seppi, D. (2009). The ostrich effect: Selective attention to information. Journal of Risk and Uncertainty, 38(2), 95–115. https://doi.org/10.1007/s11166-009-9060-6
  • Webb, T. L., Chang, B. P., & Benn, Y. (2013). ‘The ostrich problem’: Motivated avoidance or rejection of information about goal progress. Social and Personality Psychology Compass, 7(11), 794-807. https://doi.org/10.1111/spc3.12071
  • Chang, B. P., Webb, T. L., Benn, Y., & Stride, C. B. (2017). Which Factors Are Associated with Monitoring Goal Progress?. Frontiers in psychology, 8, 434. https://doi.org/10.3389/fpsyg.2017.00434
  • Galai, D., & Sade, O. (2005). The ‘Ostrich effect’ and the relationship between the liquidity and the yields of financial assets. SSRN Electronic Journal, 79(5), 2741-2759. https://doi.org/10.2139/ssrn.666163
  • Carlson, E. N. (2013). Overcoming the barriers to self-knowledge. Perspectives on Psychological Science, 8(2), 173-186. https://doi.org/10.1177/1745691612462584

Anchoring Effect

Explanation: Our decisions and judgments are often based not on subjective quality but on comparison to a reference point, a concept known as relativity. A common bias related to relativity is the anchoring effect. When making decisions, we tend to anchor on (rely too heavily on) a single piece of information, usually the first piece of information we encounter or the one we received most recently. We then interpret newer information relative to that anchor rather than objectively. The effect holds even when the anchor is irrelevant to the decision being made: in one well-known study, researchers asked people to write down the last two digits of their social security number and then state their willingness to pay for various items. People with higher digits reported a higher willingness to pay.

Example: Our willingness to pay for various items is frequently shaped by an initial anchor price. On charitable giving platforms, nonprofits often suggest a few donation options; the higher the minimum suggested donation, the more we are generally willing to give. Judges are also subject to anchoring bias: research has shown that the sentences they hand down are directly influenced, even if subconsciously, by the sentences initially proposed by prosecutors.

Key Academic Research:

  • Furnham, A., & Boo, H. C. (2011). A literature review of the anchoring effect. The Journal of Socio-Economics, 40(1), 35–42. https://doi.org/10.1016/j.socec.2010.10.008
  • Simonson, I., & Drolet, A.L. (2003). Anchoring Effects on Consumers’ Willingness-to-Pay and Willingness-to-Accept. Journal of Consumer Research, 31, 681-690.
  • Englich, B., & Mussweiler, T. (2001). Sentencing under uncertainty: Anchoring effects in the courtroom. Journal of Applied Social Psychology, 31(7), 1535–1551. https://doi.org/10.1111/j.1559-1816.2001.tb02687.x

Decoy Effect

Explanation: The decoy effect is a behavioral principle often used by marketers to influence consumer choice. Consumers tend to update their choices based on the presence of a third, less attractive (or more expensive) “decoy” option. The decoy is typically only partially inferior to one option (the competitor) but clearly inferior to the target option, which makes the target look better by comparison. The decoy effect often prompts us to select an option that is more expensive or more comprehensive than what we might have chosen if presented with only two options.

Example: Imagine you are in a coffee shop. A small coffee costs $2.50 and a large costs $5.00. In this scenario, the large coffee seems expensive (twice the price of the small!). But if the coffee shop were to add a medium option for $4.00, the large would suddenly seem like a good deal (only a dollar more than the medium!).

Key Academic Research:

  • Simonson, I. (1989). Choice based on reasons: The case of attraction and compromise effects. Journal of Consumer Research, 16(2), 158. https://doi.org/10.1086/209205
  • Ariely, D., & Wallsten, T. S. (1995). Seeking subjective dominance in multidimensional space: An exploration of the asymmetric dominance effect. Organizational Behavior and Human Decision Processes, 63(3), 223–232. https://doi.org/10.1006/obhd.1995.1075
  • Bateman, I.J., Munro, A., & Poe, G.L. (2008). Decoy Effects in Choice Experiments and Contingent Valuation: Asymmetric Dominance. Land Economics, 84, 115 – 127. https://doi.org/10.3368/le.84.1.115
  • Sedikides, C., Ariely, D., & Olsen, N. (1999). Contextual and procedural determinants of partner selection: Of asymmetric dominance and prominence. Social Cognition, 17(2), 118–139. https://doi.org/10.1521/soco.1999.17.2.118

Norms and Influence

Social Norms

Explanation: One of the primary ways we learn what behavior is appropriate in a given situation is by observing others. Social norms, the unwritten rules and beliefs about which behaviors are common and acceptable within society, are a key driver of human behavior. We look to what others are doing to figure out the right way to act, and we are motivated to behave like those around us. Social norms may be either descriptive (what we believe others are doing) or prescriptive (what we believe others approve of). Our behaviors can be influenced by social proof, or demonstrations of descriptive or prescriptive norms.

Example: Examples of social norms influencing behavior are everywhere, from what we wear to how we engage on social media. Research has shown that people are more likely to litter when they see others doing so, and are more likely to reuse their towels in their hotel rooms when they are told that others like them are likely to do so. 

Key Academic Research:

  • Cialdini, R.B., Kallgren, C.A., & Reno, R.R. (1991). A Focus Theory of Normative Conduct: A Theoretical Refinement and Reevaluation of the Role of Norms in Human Behavior. Advances in Experimental Social Psychology, 24, 201-234. https://doi.org/10.1016/S0065-2601(08)60330-5
  • Hallsworth, M., List, J.A., Metcalfe, R.D., & Vlaev, I. (2014). The Behavioralist as Tax Collector: Using Natural Field Experiments to Enhance Tax Compliance. NBER Working Paper Series. https://doi.org/10.1016/j.jpubeco.2017.02.003
  • Johnstone, L., & Lindh, C.T. (2018). The sustainability‐age dilemma: A theory of (un)planned behaviour via influencers. Journal of Consumer Behaviour, 17. https://doi.org/10.1002/CB.1693
  • Torgler, B., Frey, B.S., & Wilson, C. (2009). Environmental and Pro-Social Norms: Evidence on Littering. The B.E. Journal of Economic Analysis & Policy, 9. https://doi.org/10.2202/1935-1682.1929

 


If you’d like to learn more about these biases and how they might affect your work, check out our Behavioral Economics Bootcamp.
