Heuristics, Fallacies and Biases ©

How our decision making in dispute resolution is affected by human nature.

By Dave Finch

 

Abstract: A discussion of several important heuristics that often come into play during decision making and may pose obstacles to dispute resolution through negotiation, and whose effects can be mitigated with the aid of a discerning mediator. Framing, anchoring, availability, the small-numbers fallacy and loss aversion are discussed in detail.

 

A great deal of observation, testing and study by psychologists and others has given persuasive support to the proposition that, as a matter of human nature, people credit themselves with high levels of predictive ability, control over future events and certainty in their conclusions, and that these often unrealistic notions can powerfully influence the judgments they make. At times when we believe we are applying rational analysis to a problem, we are in fact doing no such thing but are in thrall to our heuristics, including biases and fallacies, any of which can lead us in errant directions, especially when placed in the service of an emotionally charged contention.

 

Heuristics are rules of thumb, “educated guesses”, or judgments grounded in intuitive notions. They often rest on previous and validating experience, but in some cases are a natural part of our evolved way of thinking. Heuristics are often at work when we refer to something as common sense. Often helpful, they give us tactics or tools for making quick and efficient use of information. But in circumstances in which we lack subject matter expertise, or in which motivations and interests are complex, competing or confused, as in the decision making necessary to evaluation, risk assessment and dispute resolution, they can and often do get in the way of sound judgment. Fallacies and biases may also be part of our heuristic arsenal, and almost by their very nature they are likely to be at work at some less than conscious level where they escape our notice.

Take for example what is generally referred to as the representative bias. Consider the following:

 

Alison studied archeology at UC Berkeley and excelled. While there she was active on campus in demonstrations and marches supporting civil rights and social justice. Which of the following two statements about Alison do you find more likely to be true?

1. She is a librarian.

2. She is a librarian and usually votes Democratic.

 

In similar experiments repeated many times over, most people chose the second option, though it is wrong. It is wrong because it involves the less probable concurrence of two features; option one is more likely because it does not require a coincident factor. Various experiments have borne out the representative bias, meaning that we are more influenced by context, and by the observation that a particular description is representative of the information derived from that context, than we are by the purely rational acceptance of probability. You may have thought of Alison as a young woman, representative of university students. You may have thought her to have leanings toward the policy preferences that characterize Democrats, representative of the causes she advocated. But notice: you were given no information about her age or her life experiences since leaving college. So the second choice is doubly faulty, for its lack of supporting information and for its requirement that two facts coincide, as against the single fact required by the first choice.
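The arithmetic behind this "conjunction fallacy" can be sketched in a few lines. The probabilities below are purely illustrative assumptions, not data from the experiments; the point is that the conclusion holds whatever numbers are plugged in:

```python
# Conjunction rule: P(A and B) = P(A) * P(B given A), which can never
# exceed P(A), because P(B given A) is at most 1.

p_librarian = 0.10            # assumed: P(Alison is a librarian)
p_dem_given_librarian = 0.70  # assumed: P(votes Democratic, given librarian)

p_option_1 = p_librarian                           # "She is a librarian."
p_option_2 = p_librarian * p_dem_given_librarian   # "...and votes Democratic."

# Option 2 is necessarily the less probable statement.
print(p_option_1, p_option_2)
```

However strongly the description of Alison suggests Democratic leanings, the compound statement cannot be more probable than its first component alone.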

 

One of the advantages of the mediation process in dispute resolution is that the practice of patience, encouraged by the mediator, fosters a better chance for these error producing influences to be overcome, either through recognition of them for what they are, or the assimilation of essential information that was previously either unknown, misinterpreted or largely ignored.

 

FRAMING EFFECT

 

The consideration of framing as an influence in judgment making is a mainstay of mediation training and so is familiar to most mediators. Framing is what we do when we pose a problem or situation in a particular way. If we pose an issue in a negative way, the “frame” will have a different effect from the one produced when the situation is framed in a positive way, and the consequences for decision making can be substantial. Consider the following:

 

You are a contagious disease expert at the U.S. Centers for Disease Control, and you have been told that the United States is preparing for the outbreak of an unusual Asian disease that is expected to kill 600 people if nothing is done. Your team of experts has presented you with two programs which they confidently predict will have the following results in combating the disease:

Program A: 200 people will be saved.

Program B: There is a one-third probability that 600 people will be saved, and a two-thirds probability that no people will be saved.

 

Seventy-two percent of the subjects presented with this scenario chose Program A. But now consider another set of choices for the same scenario.

 

Program C: 400 people will die.

Program D: There is a one-third probability that nobody will die, and a two-thirds probability that 600 people will die.

 

On a close look, you will see that the two sets of programs are identical in practical effect. Yet when the A-B choices were restructured as the C and D choices, seventy-eight percent chose not C, but D! What do A and D have in common to produce this result? Both are positively framed in terms of saving people. The Program B and C outcomes are negatively framed, i.e., in terms of a lot of people dying. The brain tends to recoil from the negative picture.[1]
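The equivalence of the four programs is easy to verify. A short calculation of expected survivors under each, using the scenario's own figure of 600 people at risk (exact fractions are used to avoid rounding):

```python
from fractions import Fraction

AT_RISK = 600
p = Fraction(1, 3)  # the one-third probability in Programs B and D

ev_a = 200                                   # Program A: 200 saved for certain
ev_b = p * 600 + (1 - p) * 0                 # Program B: gamble framed as saving
ev_c = AT_RISK - 400                         # Program C: "400 die" leaves 200 alive
ev_d = p * (AT_RISK - 0) + (1 - p) * (AT_RISK - 600)  # Program D: gamble framed as dying

# All four programs yield the same 200 expected survivors.
print(ev_a, ev_b, ev_c, ev_d)
```

Only the frame differs; the expected outcome is identical in every case.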

 

Here is another example of how framing works:

 

You are in a store that offers a computer that meets your needs for $600. Your friend reminds you that down the street, about a 10 minute walk away, you can buy the same computer for $525. Do you make the short trip to save $75?

You are in a store that offers a wide screen TV that meets your needs for $4,200. Your friend reminds you that down the street, about a 10 minute walk away, you can buy the same TV for $4,125. Do you make the short trip to save $75?

 

In similar studies most said they would make the trip in the first instance, but not in the second. The psychologists explain that this is because of our tendency to engage in a kind of framing called “mental accounting”. Mental accounting operates in various ways, but in this instance it accounts for the magnitude of the “savings” not in relation to what is in our pocketbook, but in relation to the size of the transaction. In the example above, the difference in frames is between $75 being a large saving in relation to $600, but an insufficient motivator for a ten minute walk in relation to $4,200.
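A minimal sketch of the arithmetic at work: the absolute saving is identical in both cases, but the "mental account" evaluates it as a fraction of the transaction size:

```python
def relative_saving(price, discounted_price):
    """Return the saving as a fraction of the original purchase price."""
    return (price - discounted_price) / price

computer = relative_saving(600, 525)   # $75 off a $600 computer
tv = relative_saving(4200, 4125)       # the same $75 off a $4,200 TV

print(f"{computer:.1%}")  # the saving feels large relative to $600
print(f"{tv:.1%}")        # the identical $75 feels negligible relative to $4,200
```

The computer discount is 12.5% of the price; the TV discount is under 2%, though either way the walk puts the same $75 in your pocket.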

 

Homemakers notoriously drive across town, at considerable transportation expense, to buy groceries at a lower price without considering that the savings are likely to be less than the cost of achieving them. There the focus seems to be on how much will be spent on groceries in the context of a relatively small amount of money on hand. Small savings seem larger than they would if considered in a larger context that included transportation costs.

The mediator is often able to re-frame an issue so as to deflect the effect of a negative frame. The party is allowed time to describe the problem as he sees it, while the mediator listens in a focused, patient, and participatory way. The mediator then restates what was said by way of seeking confirmation that the party’s statement was understood. In the restatement, the mediator will often change the framing from negative to positive, or vice versa, in an attempt to dislodge the party’s mind set. Here’s an example:

 

Party A has a claim against party B, and there is a strong chance that if party A were to sue on the claim, party B would have to pay more than $10,000. Party B feels he has a good defense, even though he realizes there is a risk he will have to pay the claim. He argues that $10K is a lot of money; too much to pay to settle this claim. He is likely to say “I can’t afford that.” B in fact has a net worth of $500,000 and a steady income. The mediator decides to try reframing and does so as follows: “B, while it is in your interest to settle this matter and avoid the risks and expenses of a lawsuit, you also have an interest in maintaining your financial condition, is that right? And as you see it, $10,000 would take away too much of your savings and worth to justify the insurance against the risks and expenses of litigation.” While B may nod in agreement with the reframe, he is now thinking about his overall financial condition, and not just about $10K in relation to A’s claim. The mediator will likely have other opportunities to deflect B’s attention from the basis of the claim and direct it toward the question of whether paying $10K is a substantial consideration in relation to his overall financial ability. In time B will consider whether the expenses and risks of litigation are more threatening to his financial wellbeing than a mere $10K payout. Once his frame is expanded, B can more rationally assess the settlement options.

ANCHORING EFFECT

 

A well-studied heuristic is the brain’s tendency to latch on to a known quantity so that subsequent deliberations have a reference point. This strong tendency was illustrated in the following experiment:

 

Subjects were asked to state the last four digits of their Social Security numbers. They were then asked to estimate the number of physicians in New York City. Those with higher Social Security digits consistently gave higher estimates of the number of physicians.

 

In another experiment, Group 1 was asked to estimate whether the percentage of African nations that are members of the U.N. is more or less than 45 percent; an exact number was required. Group 2 was asked the same question, but with the number 65 used instead of 45. Group 1, answering with the 45 reference point, gave lower estimates than Group 2, whose reference point was 65.

 

In both experiments completely irrelevant numbers were instrumental in the thinking of the test subjects as reference points that steered the subjects’ estimating to a different level than would have been the case absent such numbers. It is unlikely the test subjects were even aware that they were so influenced.

 

The mediator, alert to anchoring effects, can watch for situations in which one or both parties are estimating a quantity (time, distance, dollars, etc.) in the context of an anchor that has no real relevance. For example, in a neck strain case, the claimant has a chiropractor bill of $5,000 and is now well. Her demand to settle is $15,000, because she has always heard that insurance companies use three times the medical expense to determine the value of a claim. Any figure substantially smaller than $15,000 will seem to her to be unfair. The mediator might ask the claimant to think about the reliability of her sources of this information and whether she has any data to support this supposition. The mediator might also ask: “What if you were to learn that, statistically, juries in this jurisdiction have been awarding medical expenses plus a fixed sum for pain and suffering damages? Would the $15,000 figure still necessarily represent to you the amount needed for a fair deal?” Time permitting, the mediator might also assist in finding a way for her to get more reliable information.

 

The defendant in this case will have much to say as well, and will point out that statistics show juries are reluctant to award much for pain in cases of this type, that they are unsure of the reliability of the claim, and that they will typically limit the pain damages to a small add-on figure such as a few hundred dollars. Offering $6,000 to settle sets an additional anchor that causes plaintiff’s anchor to begin to wobble. This case will settle in the range of $7,500 to $12,500.

 

Anchoring works the other way too. A claimant can set a high anchor, so long as the demand is accompanied by information suggesting a rational, as opposed to a rumored, basis for the figure.

 

AVAILABILITY FALLACY

 

In the example of the neck strain claimant above, her idea that multiplying her medical expense by three automatically yields the value of her claim came to her from what she had heard others say. She had not stopped to evaluate the reliability of her sources. Judgments and decisions are often powerfully influenced by information that may be irrelevant or untrustworthy, but is readily available in the sense that it occurs to us promptly in relation to the issue under consideration. Studies have shown that humans give more weight to information that comes easily to mind, i.e., that is “available”, to use the psychologists’ term.

Before the 1976 elections, one group was asked to imagine Gerald Ford winning, while a second group was asked to imagine Jimmy Carter winning. All subjects were then asked to estimate each candidate’s chances of winning. Those who had imagined Ford winning estimated his chances the highest; those who had imagined Carter winning estimated his chances the highest. The earlier imagined outcome was irrelevant to the process of making estimates, yet it influenced the estimates made.[2] Availability is similar to anchoring, and the two phenomena probably overlap to some extent.

The heuristic of availability frequently relates to problems of negotiation in dispute resolution. Having imagined a favorable outcome should the dispute materialize into a court battle, and likely having focused greater attention on all the “reasons” why that favorable outcome should occur, the disputant will persistently overestimate his chances of achieving such an outcome. Every time he thinks about the alternatives to settlement, the most available information is the imagery in his own mind, though that imagery may be based upon an assessment of the data that is not at all objective. The mediator can help to mitigate this availability fallacy by asking the disputant to imagine the unthinkable: an entirely negative outcome, and whether such a disaster is within the realm of possibility. People usually respond: “I guess anything is possible.” To fortify this alternative image, the mediator asks about facts which the disputant’s opponent might use to bring about that disaster. By understanding the availability fallacy and how it works, the mediator is better able to pose the questions that will help the disputant escape its influence.

Perhaps the acute availability of certain information obscures a wider perspective, especially when that information is shared among several people. This may partially explain the erratic behavior of casino gamblers.[3]

 

THE LAW OF SMALL NUMBERS

 

This heuristic might better be called the “small number extrapolation fallacy”, because it refers to our tendency to rely on inadequate data to extrapolate, or estimate by extension, toward a conclusion that seems to be supported by the data. Here’s an illustration:

 

You are told that of the two bags before you, one contains marbles of which 2/3 are red and 1/3 are white, and the other contains an equal number of marbles in the opposite proportion of white to red. You are to decide which bag holds mostly red marbles and which mostly white. You are allowed to extract five marbles from bag one, and thirty marbles from bag two. You make your draw from bag one, and four of the five are red. Out of bag two you extract twenty reds and ten whites. Which bag holds mostly reds and which mostly whites?

 

If you surmised that bag one is the mostly-red bag, you are in the majority, but wrong. The likely reason you chose bag one is that 80% of its random draw was red, versus only 67% from the other bag, and you reasoned that this signified a greater proportion of reds in bag one. The problem with this analysis is that your five marble sample was much smaller than your thirty marble sample. Statistically, it is far more likely that bag two is the one with the red majority.
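The statistics can be made concrete with a likelihood comparison. For a bag that is either 2/3 red or 1/3 red, the evidence in favor of "2/3 red" works out to 2 raised to the power of (reds drawn minus whites drawn), so the larger sample speaks far more loudly. A sketch (treating each draw as independent with fixed probability is a simplifying assumption for illustration):

```python
from math import comb

def likelihood(reds, whites, p_red):
    """Binomial probability of drawing `reds` red and `whites` white
    marbles when each draw is red with probability p_red."""
    n = reds + whites
    return comb(n, reds) * p_red**reds * (1 - p_red)**whites

def ratio_favoring_red_bag(reds, whites):
    """Likelihood ratio P(draw | bag is 2/3 red) / P(draw | bag is 1/3 red).
    The binomial coefficients cancel, leaving 2**(reds - whites)."""
    return likelihood(reds, whites, 2/3) / likelihood(reds, whites, 1/3)

print(round(ratio_favoring_red_bag(4, 1)))    # bag one's draw: odds of 8 to 1
print(round(ratio_favoring_red_bag(20, 10)))  # bag two's draw: odds of 1024 to 1
```

The 4-of-5 draw favors "mostly red" by only 8 to 1, while the 20-of-30 draw favors it by 1024 to 1, so bag two is overwhelmingly the more likely candidate for the red-majority bag.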

This tendency to make predictions and decisions based on extrapolation from inadequate data is a common obstacle in dispute negotiation and mediation. Consider a plaintiff who suffered a whiplash and was given a prognosis of complete recovery in a few months, based upon the large numbers of accident victims who recover at that rate, but whose friend had a similar injury two years before and continues to suffer. She is likely to resist the prognosis, despite obvious differences between her age and anatomy and her friend’s. Insurance companies negotiate such claims with the confidence born of vastly larger amounts of data, and statistically they are more often right in their predictions of jury verdicts than plaintiffs’ lawyers, because the latter are more apt to rely on their comparatively few individual past experiences. This problem can sometimes be overcome by extensive search of the “sheets” that provide data on similar cases. However, differences between cases are not easily discerned from available data. Thus, limited data that favor one’s preference, say a plaintiff’s preference for a high award, are highly influential, thanks to the humanly natural confirmation bias, i.e., our well-known tendency to overemphasize any data that confirm the answer we hope for.

The mediator aware of this small-numbers fallacy will diplomatically inquire of the party relying on limited data whether she believes it is sufficient in quantity to match the data on the opposing side, not by way of coercing her to accede to the opposition proposal, but by way of inviting her to reassess a position that may rest on inadequate data. In most cases there is a region in which no one can say with certainty that a particular number is too high or too low; above or below a probable verdict; unrealistic or spot on. But the mediator can guide both parties toward that zone of possibilities where both sides are likely to take refuge in settlement.

 

LOSS AVERSION

Another heuristic that bedevils conflict negotiation is the loss aversion bias tested and documented by the cognitive psychologists Daniel Kahneman and Amos Tversky.[4]

Their testing showed that most of us weight the undesirability of a loss about twice as heavily as the desirability of an equivalent gain, and that people tend to seek risk in order to avoid a certain or highly probable loss. The consequence of this human tendency in any dispute negotiation process, including mediation, is that, faced with a demand for the payment of money or the giving up of a thing, the respondent will tend to look for ways to avoid meeting the demand and will consider risky behaviors such as proceeding to a court trial. Later research has suggested that a powerful preference for the status quo is a sufficient explanation for the loss aversion phenomenon. Whether we call it loss aversion or status quo preference, there is an element of human nature that will seek risk in order to avoid a negotiated compromise.
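Kahneman and Tversky's value function gives this asymmetry a concrete shape. A sketch using parameter values they estimated in later work (curvature of about 0.88 and a loss-aversion coefficient of about 2.25 come from their 1992 follow-up studies, not from the paper cited here, and are used only for illustration):

```python
ALPHA = 0.88   # diminishing sensitivity: doubling an amount less than doubles its impact
LAMBDA = 2.25  # loss-aversion coefficient: losses weighted over twice as heavily

def subjective_value(x):
    """Prospect-theory value of a gain (x > 0) or loss (x < 0),
    measured relative to the status quo rather than total wealth."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** ALPHA)

gain = subjective_value(10_000)    # the pleasure of gaining $10K
loss = subjective_value(-10_000)   # the pain of paying out $10K

print(round(abs(loss) / gain, 2))  # the loss looms more than twice as large
```

On these assumptions, the prospect of paying a $10K settlement hurts roughly 2.25 times as much as receiving $10K would please, which is why a respondent like B will gamble on trial rather than accept a certain loss.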

The problem of loss aversion may grow larger as the dispute wears on, especially if legal and other costs are being incurred in the process. The “sue me” response to a demand may be the snowball that grows as it rolls downhill. In a California case, the Court of Appeal commented on the fact that hundreds of thousands of dollars were expended in legal fees and costs in a case where the difference in positions just before suit, at the time mediation was offered and declined by one of the parties, was a mere $18,000. And because it was the prevailing party who had declined mediation, by the express terms of the contract sued upon that party was ineligible for an award of attorney fees, and so lost many multiples of the $18,000 difference in positions.[5]

The above case is an illustration of the “sunk cost” aggravation of loss aversion. Having spent thousands of dollars on the litigation processes, each party becomes increasingly reluctant to consider meeting the demands of the other. Both continue to drive the costs up and up, making settlement possibilities more and more unattractive. Each party here is demonstrating a preference for the risks that lie ahead over the certain loss, now grown larger than before, that will be realized by ending the process.

 

There are other heuristics deserving the attention of negotiators and mediators, and a great deal of literature from which a more detailed understanding can be acquired. Here are some of them:

 

Reactive devaluation, the tendency to react negatively to another’s proposal on the assumption that it must serve the proposer’s interest and work against the best interest of the one to whom it is proposed.

Risk aversion, a dislike of uncertainty, but a heuristic that will give way more often than not in the face of a certain loss.

Cognitive dissonance, a mental state in which a person “believes” two opposing or inconsistent ideas, or states a belief that is belied by apparent realities.

Overconfidence. This seems to be a pervasive human trait and is thought to account for the finding, in repeated surveys, that 90% of Swedish people believed they were better than average drivers.

 

Through experience, observation and active listening, mediators and negotiators can discern these human tendencies, which can then be neutralized by redirecting the influenced party’s thoughts toward more reliable sources of information.

 

All rights reserved. © 2008

 


 

 


[1] Michael Shermer, “The Mind of the Market”, Henry Holt, 2008.

[2] Ibid., p. 78.

[3] An excerpt from Darrell Huff’s book “How to Take a Chance”: Huff tells the story of a run on black in a Monte Carlo casino in 1913: black came up a record twenty-six times in succession. Except for the question of the house limit, if a player had made a one-louis ($4) bet when the run started and pyramided for precisely the length of the run on black, he could have taken away 268 million dollars. What actually happened was a near-panicky rush to bet on red, beginning about the time black had come up a phenomenal fifteen times; players doubled and tripled their stakes, believing that there was not a chance in a million of another repeat. In the end the unusual run enriched the Casino by some millions of francs.

[4] Kahneman and Tversky, “Prospect Theory: An Analysis of Decision under Risk”, Econometrica, Vol. 47 (1979).

[5] Frei v. Davey (2004) 124 Cal.App.4th 1506.