The Interactions of Heuristics and Biases in the Making of Decisions

Alice Newkirk

Given the sheer number of decisions the average person makes on any given day, the brain's use of shortcuts to help assess different choices makes perfect sense. It would be a waste of time and energy to run an exhaustive cost-benefit analysis to decide which brand of laundry detergent to buy, or which kind of pizza to order. As a result, people use a number of mental shortcuts, or heuristics, which provide general rules of thumb for decision making (Tversky & Kahneman, 1982). However, the same glossing over of factors that makes heuristics a convenient and quick solution for many smaller issues means that they can hinder decisions about more complicated ones (Tversky & Kahneman, 1982). Heuristics are simplifications, and while simplifications use fewer cognitive resources, they also, well, simplify. Furthermore, since people mostly use these shortcuts automatically, they can preempt analytical thinking in situations where a more logical process might yield better results. Although heuristics are useful shortcuts for everyday judgment calls, they can lead people to make hasty, sometimes incorrect decisions about more complicated issues.

An excellent case study for the flaws and complications of heuristics is the hypothetical case of Audrey, a hypochondriac whose vitamin-taking regimen is challenged by a new study linking vitamins with an increased risk of death. Audrey attributes her good health to her vitamins, and her decision-making process is further complicated by the advice of her friend, who tells her that the study is worthless and she should ignore it completely. Whether or not Audrey later goes through a more thorough reasoning process, her initial judgment will be heavily influenced by common decision-making heuristics. Audrey's case is an excellent lens through which to examine common heuristics and the problems they create because her hypochondria makes her perceive her decision as having potentially dire consequences; she has a strong emotional investment in the outcome, which has the potential to override her reasoning. Although her situation is unique, the way she uses heuristics will follow common patterns of thinking. In Audrey's case, heuristics will lead her to believe that vitamins must be either completely toxic or utterly harmless; her emotional attachment to her vitamins will give her a strong bias in favor of the second conclusion, and as a result she will reject the study entirely. Her extreme reaction will throw these common heuristics and biases into sharp relief.

From the start, Audrey will be looking at her vitamin dilemma through the lens of her emotions. The affect heuristic suggests that strong emotional reactions often take the place of more careful reasoning (Sunstein, 2002), and Audrey has plenty of reason to have strong emotional reactions. Hypochondria is a mental illness centered on an irrational fear of serious disease, and hypochondriacs are obsessed with staying healthy as a result of this fear (Medline, 2012). By challenging Audrey's beliefs, the study therefore presents her with massive emotional turmoil. Her vitamin regimen, which gives her a way to control her irrational fear of illness, is being called into question, and as a result her fear and anxiety levels are likely to be even greater than usual. Both giving up and continuing to take her vitamins are choices with massive emotional weight: giving up her vitamins means giving up a source of security, and continuing to take them means possibly continuing to expose herself to future harm.

Audrey's emotional complications will be further exacerbated by a whole category of mental shortcuts known as intuitive toxicology. Intuitive toxicology governs the ways people think about chemicals, compounds, and toxins, and includes the false notion that chemical compounds are either entirely dangerous or entirely safe: in other words, that there is no such thing as moderately dangerous, or dangerous only in excess (Sunstein, 2002). While not technically heuristics, these simplifications often erase the complexity associated with carcinogens and chemical health risks (Sunstein, 2002). Falling prey to the all-or-nothing model of risk, Audrey will be unable to think of the risk presented by the vitamins as a slight increase in the statistical probability of death. In her mind, her vitamins will be either completely harmless or dangerously toxic.

Furthermore, other effects of the affect heuristic will raise the stakes, and her emotional investment, even higher. The affect heuristic links the perception of risks and the perception of benefits: when people perceive something to be high risk, they perceive it to be low benefit, and vice versa (Sunstein, 2002). People have trouble believing that something is simultaneously risky and beneficial, especially where the risks are perceived to be very high (Sunstein, 2002). As a result of the affect heuristic, if Audrey thinks that her vitamins are high risk, she will also think that they are low benefit. For Audrey, giving up her vitamins because of the study would mean admitting not only that she has been doing something actively harmful, but also that the regimen on which she based her good health and safety had no benefits at all.

These high emotional stakes will bias Audrey toward what she wants to be true, even if her emotions play no further part in her reasoning process: seen through the lenses of the all-or-nothing fallacy and the affect heuristic, accepting the study as true would mean that her main source of safety and support was extremely dangerous and offered no benefit. As a result, she will be motivated to show that the study is completely wrong. Her emotional investment in this hypothesis will lead to a number of other biases that will further affect her reasoning process, especially since she already strongly believes vitamins are healthy. Most notably, she will be subject to the belief-bias effect and confirmation bias.

The belief-bias effect, the first of these biases, has two parts: when a conclusion is unbelievable, it is much harder for people to accept, even when the logic is sound; and when a conclusion is believable, people are much less likely to question its logic (Evans & Feeney, 2004). There are two potential explanations for these effects, both with implications for Audrey's decision-making process. The first, the Selective Scrutiny Model, suggests that people are more likely to think critically about evidence when presented with a conclusion they disagree with (Evans & Feeney, 2004). In Audrey's case, she is more likely to be skeptical of the evidence provided by the study because she disagrees with its findings. The second, the Misinterpreted Necessity Model, suggests that people rely on prior beliefs to guide their judgments when the evidence is unclear (Evans & Feeney, 2004). This model has clear applications to Audrey's situation: when presented with the conflicting evidence provided by her friend and by the study, she is likely to fall back on her prior belief that vitamins are healthy and harmless. Both of these models predict that Audrey will be far more skeptical of the study's findings, and far more accepting of evidence supporting her original beliefs.

Not only will Audrey be far more accepting of evidence supporting her preferred hypothesis, she will also actively seek out evidence that validates her beliefs, as confirmation bias suggests. Confirmation bias leads people to seek out information that confirms their hypotheses rather than information that might refute them (Evans & Feeney, 2004). Once Audrey has settled on a hypothesis, in this case the one suggested by her previous beliefs and emotional reaction, she will look for pieces of evidence that support it, instead of searching for conflicting evidence and revising her theory accordingly. As a result of the belief-bias effect and confirmation bias, Audrey will actively search for information that supports her belief in vitamins, accept it more easily than she would other information, and scrutinize conflicting evidence more aggressively.

Audrey will be able to find plenty of support for her hypothesis through other heuristics and biases. A variety of heuristics and biases can take the place of empirical evidence in decision making (Tversky & Kahneman, 1982); these heuristics, and their resulting biases, will provide Audrey with 'evidence' in favor of her all-natural vitamin regimen. This evidence might not stand up to critical, unbiased analysis, but since belief bias and confirmation bias mean she is looking for evidence that confirms her hypothesis and not scrutinizing confirming evidence too carefully, her shortcuts will have a strong effect on her decision making. The first of these biases is another facet of intuitive toxicology. A number of specific biases come into play when people think about chemical risks, and one of these concerns the benevolence of nature (Sunstein, 2002). Chemicals produced in nature are not inherently safer than manufactured ones: arsenic, for example, is a natural chemical, and it is definitely not harmless. But as a rule of thumb, people tend to instinctively assume that natural compounds are somehow healthier and more benevolent than compounds which are man-made (Sunstein, 2002). This has clear implications for Audrey's all-natural vitamin regimen: since nature is fundamentally benevolent according to intuitive toxicology, Audrey's natural vitamins cannot be dangerous.

Audrey will find further evidence for her hypothesis in her previous positive experience with her vitamins. The representativeness heuristic describes the ways people often misattribute causes to effects (Tversky & Kahneman, 1982). One example of this is the misconception that past experience is a good basis for forecasting the future. Even when present experience has little to no bearing on what someone is trying to predict, they are likely to use it to support their hypotheses about the future (Tversky & Kahneman, 1982). In Audrey's case, she will base her expectations of her vitamins on her past experience with them, whether or not the two are actually connected, and whether or not the effects of vitamins are even supposed to be immediate. Since she attributes her good health to them, she presumably thinks of them very positively. Furthermore, the affect heuristic applies here as well; in this case, instead of high risks being associated with low benefits, high benefits are associated with low risk. Because she has previously seen her vitamins as extremely beneficial, she will also see them as having been low risk. She will use this as confirming evidence that the study is wrong: because she has in the past experienced only positive effects from her vitamins, she will assume that vitamins have only positive effects.

Audrey's confidence in her vitamins will be further strengthened by her conversation with her friend, who provides direct evidence to confirm her hypothesis. Audrey will be subject to the effects of group polarization: when people who share an opinion discuss it, the opinion of the entire group is likely to shift further toward the extreme, since members both have their beliefs confirmed and may be exposed to the views of more radical members (Sunstein, 2002). Audrey is already motivated to prove the study wrong, already believes in the healthiness of vitamins, and already has 'evidence' supporting these claims as a result of intuitive toxicology and the representativeness heuristic; her friend's rejection of the study will confirm her beliefs and polarize them even further. As a result, Audrey is likely to have her beliefs about vitamins confirmed and strengthened, and to feel confident rejecting the results of the study completely.

Her previous positive associations with vitamins will help mitigate some of the potential negative effects of heuristics as well. Specifically, she will be less susceptible to alarmist bias, the increased fear and urgency surrounding alarmingly vivid threats (Sunstein, 2002). Although the 'risk of death' mentioned by the study sounds very dangerous, it is also extremely vague. Death by vitamin does not have the urgency or vivid imagery of a plane crash or a terrorist attack. The threat of death will also be lessened by the availability heuristic, a mental shortcut for estimating the size or probability of something by how readily examples come to mind: estimating the number of five-letter words ending in -ing by thinking of a few examples, for instance (Tversky & Kahneman, 1982). Audrey will not be able to think of examples of people who have died of vitamin overdoses, because that sort of thing neither makes the news nor is particularly graphic, so her estimation of the threat will be severely diminished. Conversely, she will be able to think of a great many positive instances associated with vitamins, since she has used them for a long time and attributes her good health to them. As a result, she is likely to underestimate the severity of the negative consequences of her vitamin regimen and overestimate its positive effects. The fear and anxiety such threats might otherwise provoke will be muted, and these heuristics will therefore have a much smaller effect on her reasoning process.

One of the other biases of intuitive toxicology seems, at first, to work against Audrey's hypothesis. Laypeople often assume that it is possible and desirable for a chemical to have absolutely no associated risk, which trained toxicologists know to be untrue (Sunstein, 2002). At first glance, this is a strike against Audrey's vitamins: they cannot be healthy or worthwhile if they have any associated risk at all, and the study suggests that they do. However, this fallacy's interactions with a number of other biases negate its effect. First, since Audrey is more critical of things she finds unbelievable as a result of the belief-bias effect, she is more likely to subject the zero-risk fallacy to critical examination; she is therefore more likely to dismiss it as illogical than she is any of her other assumptions. Second, if she does not examine it critically, its interaction with the all-or-nothing fallacy will actually strengthen her notions about the safety of her vitamins. If her vitamins have any associated risk, then by the all-or-nothing fallacy they must be dangerously toxic, a hypothesis she is eager to reject. If, on the other hand, they are completely healthy, the other option presented by the all-or-nothing fallacy, then they must have no associated risk, because the zero-risk fallacy holds that zero risk is both optimal and attainable. The zero-risk fallacy initially seems to counter Audrey's theories about risk, but as a result of her emotional investment and the biases driving her reasoning process, it will actually strengthen her argument.

Audrey's emotional reaction to the information presented by the study will dominate her initial thought process and, along with a number of general heuristics, guide her reasoning. Her mental polarization of the dilemma and her emotional investment in proving her original beliefs correct will lead her to instinctively reject the study in its entirety. However, her reasoning process does not have to end there, should she so choose. Heuristics are fundamentally shortcuts for reasoning, and people are perfectly capable of taking the long route to reach a better result. But whether or not Audrey decides to analyze the potential effects of her vitamins more critically, her beliefs and biases will shape the ways she initially thinks about her situation. Audrey's particular biases may be exacerbated by her intense situation, but they are analogues of biases common to everyone. While our instincts can provide easy guidance in simple decisions, where they accurately represent what is going on, in multifaceted issues like Audrey's vitamin dilemma they can often lead us astray. By knowing when these heuristics may be working against us rather than for us, we can choose when to engage in deeper critical thinking and learn to overcome our own biases.

Bibliography 

Evans, J., & Feeney, A. (2004). The role of prior belief in reasoning. In J. P. Leighton & R. J. Sternberg (Eds.), The nature of reasoning (pp. 78-102). Cambridge, UK: Cambridge University Press.

Sunstein, C. R. (2002). Risk and reason: Safety, law, and the environment (Ch. 2: Thinking about risks, pp. 28-58). Cambridge, UK: Cambridge University Press.

Tversky, A., & Kahneman, D. (1982). Judgment under uncertainty: Heuristics and biases. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases (pp. 3-20). Cambridge, UK: Cambridge University Press.