Thinking About Thinking: Implicit Biases in Litigation

The Bencher—March/April 2019

By Kelly Canavan, Esquire

Last year, my Inn of Court team presented on how cognitive biases affect so many elements of legal work. But as with so many interesting presentations, we did not get to dive as deeply into several aspects as I had wished. Specifically, I wish we could have engaged in a longer discussion of how cognitive biases present challenges in the courtroom and along the litigation road. Hopefully we whetted a few appetites for further research. And I will try to whet some more. Based on the most current work in cognitive psychology, I think small shifts in how we conceptualize decision-making may make big differences in litigation tactics.

What we know about ‘knowing’

Most people conceptualize decision-making as a process in which information determines a result. That is, we expect that if we give someone the proper and necessary information, that person will come to the correct conclusion. Under this so-called “information-deficit model,” we believe that people make wrong decisions because they lack enough correct information or have misperceptions about reality.

In the world of litigation, the information-deficit model often shows up in evidence issues. Attorneys work to get everything into the record that supports their position, on the theory that if the decision-makers have all the evidence in front of them, the right answer will be clear. And when the decision-makers get it wrong, we attorneys question what document, picture, or video would have changed the result.

But this probably is not how people actually make decisions. Many people whom we might consider “misinformed” have likely already encountered correct information. Instead of reevaluating their position on the topic, however, they simply dismiss the new information. This is because people frequently use mental shortcuts to make decisions quickly and with as little processing effort as possible, a tendency described by the theory of “low-information rationality.”

Under the theory of low-information rationality, fact-finders rely heavily on heuristics, which are shortcuts that save mental energy. Heuristics include things such as first impressions, “gut instincts,” and tribal signaling. Tribal signaling has shown up in the news a lot lately: apparently offhand comments (or memes, or reposts from certain news sources) signal allegiance to one group or another through the use of in-group lingo.

We have heard of a lot of these heuristics: We know to play to people’s risk aversion. We anchor a damage award to artificially inflate or deflate a reasonable value. We object to evidence of prior bad acts for fear that those acts will appear representative of our client’s general character.

Naive realism, the expectation that others will interpret the same facts the same way you do, is difficult to overcome. Not only is it hard to comprehend why someone cannot see what is so clear to you, but discussion and debate can make people dig in their heels. In other words, our beliefs are driven by subconscious psychology but justified by salient information. And misperceptions, unlike mere ignorance, are usually held with a high degree of certainty. People often consider themselves well-informed about the fact in question even if they have, consciously or unconsciously, engaged in as little processing as possible.

Through years of research, we know that these heuristics and their resulting biases function in the legal environment just as they do in the rest of the world. We know that jurors tend to reach conclusions early in a case. We also know that their final verdicts tend to reflect those early judgments. We know that, when questioned, people can provide justifications for their positions. But we also know that those justifications are often created post hoc.

I speak here of jurors, but nothing suggests that judges, mediators, or opposing counsel are any less susceptible. Nor are we ourselves immune. Simply being knowledgeable about cognitive biases can increase our confidence in our own imperviousness, which, ironically, can make us more vulnerable to their unseen effects. Of course, with repeated exposure and evaluation, lawyers do become better at handling certain biases. For instance, research indicates that attorneys are less likely to fall prey to anchoring biases because we actively defend against anchoring tactics so frequently.

What we might not know

While people do not spend a lot of energy making decisions, they do spend a lot of time justifying those positions once they have taken them. The reason these first impressions do not change, I suspect, is rooted in the interplay of a number of cognitive biases. Many people have heard of “confirmation bias”: people tend to seek out and better remember information that supports beliefs they already hold. Lesser known is the other side of that coin: people also tend to avoid information they fear would discredit those beliefs. Disconfirmation bias leads people to spend significantly more time and thought actively arguing against contrary evidence. Couple that with “active forgetting” and you have a high mountain to climb to get someone to the other side. Active forgetting differs from passive forgetting, in which information simply fails, by accident, to be encoded into memory. Active forgetting is purposeful in the sense that the brain makes a calculated decision to delete information, even if we are not conscious of that decision.

More concerning for those trying to change minds is how people test their theories. Not surprisingly, when people test their theories, they tend to seek out tests that provide results consistent with their hypotheses. That is, instead of looking for counterexamples, they seek to prove their theories. Finding evidence to support a belief increases the confidence with which someone holds that belief. Further, people tend to interpret weak or inconclusive results as more positive than they are. Of course, the flip side occurs as well: people tend to discount and undervalue disconfirming evidence. So it takes more disconfirming evidence to shift someone’s view than it takes confirming evidence to make the person believe he or she is right. And, of course, we already know that people actively avoid disconfirming evidence.

When people receive contradictory evidence, their established beliefs tend to get stronger, and their confidence in the correctness of their positions increases. In short, it is really hard to shake people out of a belief they have already settled on. People are resistant to change. They are motivated to seek out and remember confirming information and to avoid and ignore disconfirming information. They prefer to see themselves as internally and externally consistent, “steady and true.”

This is not just stubbornness. Brain scans taken while people process information counter to their beliefs show that the parts of the brain responsible for defense against physical attack light up. In other words, the brain processes a verbal attack on a closely held belief much as it processes an approaching knife-wielding attacker. Seeing just the face of a political opponent can generate real feelings of physical threat. The degree to which such defense mechanisms kick in depends on several factors, but the personal significance of the challenged belief appears to be crucial. Specifically, beliefs that relate to one’s social identity are likely to be more difficult to change.

What we should do

Trial lawyers tend to view their job as one of convincing a fact-finder of the correctness of their client’s position. But perhaps instead they should see their job as justifying in the mind of the fact-finder a position the fact-finder has already taken.

The good news is that biases do not have to be hindrances. They can be tools we use to our advantage. Biases are useful predictors of behavior and judgments. It is worth keeping in mind that quick decisions should not be written off simply because they are made hastily. The correct decision is not always, or even frequently, the most reasoned decision. The most reasoned decision is just the most justified decision. And because the justifications and rationalizations come, whether we realize it or not, after the decision has already been made, spending more time and energy dispelling them probably does not produce a better result. Strongly held beliefs stem not from facts but from personal values developed through social norms, and single facts are easily dismissed when other facts are available to take their place. Think about what social norms might buttress a fact-finder’s position and play up those norms instead.

When dealing with stereotypes or tribal alliances, be aware that your audience is not processing the argument; they are processing the signal. Weaken their tribal identity, humanize your client (whether it be a person or not), and cite sources “friendly” to the audience’s tribe. For instance, conservatives have supported national parks when the issue is framed as one of national pride. Liberals have supported the armed services when reminded of the educational opportunities they provide. This tactic of refocusing the analysis is especially useful when discussing a hot-button issue. Your audience does not have to give up their self-identity to agree with you. Explain how your position is a win for your audience and upholds the status quo.

As I mentioned, simply being made aware of our biases is not always enough to overcome them. It is a good first step, though. From there, we need to encourage ourselves and others to see things from a different point of view. When asked, we can almost always think of a reason why someone would hold a belief different from our own. But we rarely do so without prodding. This is important to keep in mind when arguing against someone who just seems thick-skulled. He is probably not stupid; his brain is just on the defensive. And when people are presented with their own idea under the guise that it came from someone else, most can identify the logical fallacies in the argument. The reasoning skills are there; they just have to be drawn out.

The best way to draw out those skills is to focus on the core facts instead of wasting time rehashing speculation. Provide gap-filling causal arguments that create a complete and compelling narrative. If your client wasn’t at fault, identify who was: provide the protagonist and antagonist, and suggest an ending fitting for those characters. And remind your audience that it’s all right to change their minds.

Being aware of the biases of others, as well as our own, makes our predictions more accurate and can help us manage clients’ biases and expectations.

Kelly Canavan, Esquire, is a staff attorney at the Supreme Court of Texas. She is a member of the Robert W. Calvert American Inn of Court in Austin, Texas.

© 2019 Kelly Canavan, Esquire. This article was originally published in the March/April 2019 issue of The Bencher, a bi-monthly publication of the American Inns of Court. This article, in full or in part, may not be copied, reprinted, distributed, or stored electronically in any form without the written consent of the American Inns of Court.