Hurry up and slow down
Avoiding errors in your legal judgment
Poor judgment is an enduring source of work for lawyers. Clients, witnesses, lawyers, experts, even judges: no one is immune from mistakes. Because people are limited to interpreting information from their own viewpoint, human decision making and information processing are often biased and faulty. We take “shortcuts” and generate conclusions that are not completely accurate.
We are all conditioned by living in our society. Individuals are incessantly bombarded with information. We do not and cannot take the time to carefully evaluate each piece of information. People need to process information quickly to protect themselves from disadvantage or harm. It is adaptive for humans to rely on instinctive, automatic behavior that keeps them out of harm’s way. We often go with our gut with a bias toward preferences or likes.
We know that our minds use mental shortcuts to manage the complexities of daily life. As we automatically process choices, we are subject to mistakes which may lead to poor decisions. These mental shortcuts (called “heuristics”) follow known patterns or biases. When we act intuitively, immediately, reactively or automatically, we invite bias, which may lead us to diverge from expressed intentions.
Bias can be described as a particular inclination or tendency in one direction. It fosters personal and sometimes unreasoned judgment because it operates as a force field, blocking information from coming in if it does not align with existing biases. It typically operates unconsciously, thereby leaving its influence hidden from our own introspection.
Behavioral psychology categorizes our thinking as intuitive or “fast” and deliberative or “slow.” Fast thinking reflexively accomplishes most of the routine tasks of life, e.g., driving home, identifying threats, recognizing friends. It is an unconscious (intuitive) process we can’t turn off and don’t realize we are doing. Decisions believed to be the result of deliberation often stem from educated guesses, rules of thumb, and pattern recognition. Automatically, effortlessly, and rapidly, fast thinking rushes to judgment with heuristics.
Slow thinking
Slow thinking is deliberate and effortful. It is not automatic. It is used for intentional and mindful analysis of complex problems. Despite this high-level capability, however, the deliberative mind can be lazy and default to shortcuts offered by our intuitive brain. We want answers! Although heuristics are vital to our navigation of daily life and tasks, they can get in the way when a more deliberative, slow-thinking process is required, such as analyzing a client’s legal problems. Because we do not realize when an unconscious bias is acting upon us, we have a blind spot about our own objectivity.
Lawyers are continually processing information in a case, trying to distill complexity into a single narrative thread. We attempt to construct a story from which to make sense of known events and issues. As we do so, even with years of experience, we tend to rely on intuition (fast thinking) and neglect deliberation (slow thinking). So does opposing counsel. We trust our gut when we need a more thoughtful approach.
Lawyers are not always allowed sufficient time to deliberate. Time pressures are correlated with less accurate decisions. Intense emotions are linked to less systematic thinking. People are more likely to make mistakes when acting impulsively. Stress leads us to consider alternatives less methodically. When we are under duress, these conditions can cause us to rely on intuition to our detriment. Quick decisions “from the gut” have a demonstrated record of greater error.
Heuristics are a feature of intuition, employed automatically and unconsciously to get us to an answer. It is easier to detect when others have failed to slow down and objectively evaluate than to introspectively identify our own resort to fast thinking. By learning to recognize how and when the intuitive brain might supersede the deliberative process, we will have a better chance to analyze when the client’s imperfect judgment led them astray, and steer them back in the right direction. Identifying a few key heuristics in this article may provide a filter for evaluating our own objectivity, reducing the blind spot that cloaks the unconscious bias at work.
Behavioral psychologist and Nobel laureate Daniel Kahneman provides a window into the science of decision making and condenses decades of research in his book, Thinking, Fast and Slow (2011, Farrar, Straus and Giroux, ISBN 9780374533557). Although human irrationality is Kahneman’s great theme, much of his book focuses on unconscious errors of reasoning which distort our judgment of the world. He objectifies and labels common heuristic shortcuts, which is an effective way to identify everyday intuitive tendencies.
Heuristics and the distortion of judgment
We start any evaluative process with a mental model, a spontaneous impression of complex facts. As a product of intuition, this model is likely less nuanced and more coherent than the intricacy of actual reality. With this framework in place, new facts and evidence are viewed with a preference for achieving unity with the existing template. The initial mental model we form tends to narrow our focus when investigating a topic. Because we have achieved logic in narrative form, we may favor facts and evidence that are congruent with our model.
The natural desire for coherence can merge with the intuitive mental model to suppress divergent details and contradictions. As the intuitive mind constructs a rational story from the limited evidence at hand, our decision-making process may be satisfied with the illusion of simple elegance, overlooking or ignoring the complexity of the issues. Information not fitting within the model is more easily dismissed or explained away. Facts that align are favored and adopted. We confidently conclude the new facts fit our theory or can be safely discarded.
Absent a deliberative effort, the tendency to think quickly with shortcuts and intuition can lead us astray. Here, we will review a few of the more prevalent heuristics and provide examples of the wrong turns these shortcuts can produce. Perhaps armed with this knowledge, we can make better case (and life) decisions.
Substitution
Faced with a difficult or complex question, the intuitive mind will take shortcuts to avoid a tedious deliberation or a deeper evaluation. An example of this tendency is found in our reliance on uncomplicated narratives to explain the world. We tend to develop stories that have apparent explanatory power, and then overuse those few simple narratives to explain our complex, messy reality.
It is in our nature when confronted by a complex problem to find a simple answer. If we substitute a simple question for a more difficult inquiry, we also substitute a simple reaction in place of a more time-consuming thought process. The substitution bias quickly provides an apparently useful answer to a multifaceted question by answering a different, but much less complex question.
But what do you think about my case?
When asked by a client to predict how or when a case might conclude, we tend to replace the complex problem of weighing many possible variables with the straightforward question, “How do I feel about the case right now?” We have a strong bias to extrapolate current trends into the future, because that question is relatively effortless to answer, while considering all the factors that could disrupt current trends and change the case outcome is demanding. This dilution of the original problem produces an intuitive but flawed response to the question.
Substituting easy questions for hard ones is just part of a more general tendency toward oversimplification, allowing the intuitive mind to answer quickly. Our brain is fundamentally organized to prioritize efficiency, making quick judgments that are generally accurate. This can result in a tradeoff of more speed but less precision. Reducing problems to manageable analogies is not a bad strategy overall, as long as we don’t confuse our simplistic analogies for reliable facts and can recognize when we’ve taken a misleading shortcut.
The tendency to unconsciously resort to substitutions for quick answers can lead us intuitively to an available memory. It happens when we are asked to make a quick case evaluation. A client needs to know about the probable outcome of a current case, and the availability heuristic favors easily retrieved memory from a recent case or experience to render a forecast. We overestimate the likelihood of events or outcomes based on that which has greater “availability” in memory. Things we remember most clearly, the facts, players and timing from another memorable case or experience, are adopted in lieu of a formal analysis because they are available and easily accessed.
The availability shortcut
An example of the availability shortcut might occur when we are posed the following question: “How common is an eclipse of the sun?” Pausing for a moment, it is obvious that this presents a complex determination. We would have to survey objective astronomical information in order to really answer, and perhaps even consider digging up historical data. This is too much work, so we intuitively substitute a far simpler question: How easy is it for me to think of an example of this phenomenon? If an example readily comes to mind, we conclude the phenomenon is common. If we can’t think of an example, we conclude it is rare.
The availability shortcut is reasonable in many circumstances and is an indicator of familiarity. (I am only able to recall witnessing a few eclipse events in my life. They must be fairly rare!) Analogies often work by this mechanism. When asked a question about a complex case, the availability heuristic finds a parallel to a known prior case, and then answers the question by reference to that case. The problem isn’t that we use analogies or heuristics to inform our decision making. That is perfectly reasonable and often effective. The problem comes from substituting our analogies and heuristics for analytical thinking about the multi-layered question we are confronting.
Failing to recognize when effortlessly available information doesn’t match the current case complexities means that our judgment will sometimes be both wrong and unreliable. The risk of an error in judgment is greater when the task is more complicated and the time allotted to make a decision is limited. If the client asks for a quick assessment, it may be the right moment to hurry up and slow down, explaining to the client that you will call back after a little contemplation.
Overconfidence and the optimism bias
Decision makers must have the ability to reach conclusions on important questions. Arriving at an answer requires confidence, a trait shared by most lawyers. In the process of decision making, overconfidence is a trap. Frequently called upon to make rapid and decisive judgments, lawyers commonly fall prey to intuition while favoring coherence with their case narrative.
Unfounded confidence causes us to overestimate the probability of a positive outcome and underestimate the risks of moving forward. Confidence is highly prized and many would rather pretend to be knowledgeable or skilled than risk appearing inadequate. Because intelligence isn’t the same thing as learning and developing a specific skill, smart people can be lulled by their own confidence into believing they are highly competent in a specialty area. When we lack the ability to accurately examine ourselves objectively, we are unable to recognize our lack of competence in a new or complex area.
The tendency to overconfidence may exist because gaining a small amount of knowledge in a field about which one was previously ignorant can make people feel as though they are suddenly virtual experts. Only after continuing to explore a topic will they realize how extensive it is and how much they still have to master. Some individuals may seem highly competent due to their apparent confidence. They are often driven by a desire for status or the need to appear smarter than others around them. By acknowledging uncertainty and remaining open to other views, we are able to resist the overconfident mind. In a profession where precision counts, intuition must acknowledge when deliberation is required.
Consistent with the tendency to overestimate the probability of a positive outcome is the optimism bias, which causes us to underestimate the challenge. (“No problem, I’ve got this.”) Overconfidence allows strategic thinkers to overestimate the probability of a positive outcome. If you cannot imagine you might be wrong, you will never force the deliberative mind to raise questions or re-evaluate the foundation of your theory. While an optimistic outlook can be healthy, decisions affecting a case should be founded upon objective facts. As usual, slow thinking wins the day.
The anchoring bias
The anchoring bias occurs when exposure to an initial piece of information influences our perception of the information that follows. That first contact can affect our decision making and set the tone for how we process everything that comes after.
When the mental model or theory of the case becomes an anchor, the fast-thinking brain filters later-acquired information within that framework. Anchors are great for strengthening the foundation of the intuitive mental model, but we may neglect to properly evaluate subsequent inconsistencies. Objectivity may erode. When anchored, our intuition may only notice and respond to facts which endorse the pre-existing model.
In a business setting, planning for future development can be led astray by anchoring. Most projects start with a projection of how executives believe the work will be realized. Market research, financial analysis and professional judgment lead to the decision to proceed. Business plans tend to accentuate the positive, making the case for the project, and this can skew later reviews toward overoptimism. Leaders become anchored to original cost estimates and don’t adjust for possible problems or delays. Over-optimistic forecasts have the greatest probability of disappointment.
Anchoring is a known factor in the residential real estate market. We anchor when we allow a number to randomly attract our attention while dealing with money or number-oriented issues. When we try to make estimates or predictions, we usually begin with an initial value or starting point and adjust from there as more facts become available. Anchoring bias limits our recognition of the necessary adjustments that are required, leading to biased results. Our tendency is toward the original anchor.
Data suggests that judges considering a criminal sentence may be anchored by a prosecutor asking for a very long sentence, so that the resulting ruling is much longer than average for the crime. A high demand for settlement in a civil case may tend to establish a higher range of value, even in the mind of the opposing party. Failing to recognize that your fast-thinking brain is out ahead of you may lead to greater susceptibility to anchoring in any situation. By challenging our own assumptions and taking a balanced view, we can avoid cognitive bias.
Confirmation bias
People have an unconscious tendency to process information by looking for and interpreting data that “confirms” their existing beliefs, while setting aside what does not. This intuitive approach to information unintentionally affects our decision making, causing us to marginalize or overlook data that is inconsistent with our preconceptions and opinions.
Confirmation bias has been known since ancient times, and was described by the classical Greek historian Thucydides, in his text, The History of the Peloponnesian War. He wrote: “It is a habit of mankind to entrust to careless hope what they long for and to use sovereign reason to thrust aside what they do not want.”
We are especially likely to filter information to support our own views when an issue is highly important or self-relevant. Once we develop a personal opinion about an issue, we have difficulty managing information in a normal, unbiased manner. Our tendency is to look for facts that support our beliefs. If people are emotionally distant from an issue, they are better able to rationally process and weigh new information, giving equal consideration to multiple viewpoints.
We are susceptible to the bias because it is an efficient way to process information. It allows us to rule out parts of the mountains of information and narrow our focus. People like to feel good about themselves, and discovering that a belief they highly value may be wrong is disconcerting. They want to feel intelligent, and information suggesting they have erred reveals a blind spot or something important they missed. In the absence of detached objectivity, confirmation bias promotes the unwitting dismissal of new facts that contradict their working opinions or model of reality.
Once a judge or individual juror forms an opinion, unconscious confirmation bias may interfere with their ability to process any volume of new, contrary information that emerges during a trial. Evaluating evidence takes time and energy, and the brain looks for shortcuts to make the process more efficient. The juror may selectively avoid all challenging or contradictory information. They may be more likely to remember information that is consistent with opinions they already hold. And this intuitive filtering will take place unconsciously, without intention.
Incorporating conflicting information and forming new explanations or opinions takes time and effort, and staying on the path of least resistance is often the easier route. Being aware of confirmation bias is a significant hedge against allowing it to occur. If we can understand our potential tendency to give more weight to information that supports our existing beliefs, our objectivity is strengthened. Since the bias is most likely to occur early in the decision-making process, it is helpful to diversify the sources of information we bring to it.
Engaging in debate or asking a colleague to play “devil’s advocate” is an excellent way to reveal flaws in thinking. Searching for information that disconfirms our theory is at the heart and soul of scientific (and legal) research – the exact opposite of the confirmation bias.
The halo effect
Perceptions of merit or worth can carry long-lasting effects. We assume a graduate from a top university will be an excellent employee or a hard worker. Earning that diploma takes a lot of effort and is an impressive achievement. Encountering someone with such credentials may well leave you with a favorable impression, notwithstanding personality quirks or odd behavior. Nearly everything they do will be filtered through the “aura” of their prestigious alma mater. At work, their performance may be viewed as better than it actually is and supervisors may give them higher marks than they deserve in evaluations.
The halo effect can be difficult to counteract. Intuition can associate physical attractiveness or the recommendation of a trusted source with merit. In advertising, the reputation of a particular company may provide a boost to all of its products with consumers. Endorsement by a popular celebrity can launch a product, despite the fact it is no different from others in the marketplace.
If a witness is viewed favorably, we may tend to evaluate everything they say as credible and supportive. The halo effect may lead us to make assumptions and overlook or ignore small inconsistencies that, taken together, could ultimately undo their credibility. When judgment based on one feature of a person or thing affects the overall impression, an implicit, unconscious bias is at work.
Hindsight bias
People have been saying, “I told you so,” for as long as humans have been talking. We like to think the world is knowable and stable, and that we can predict what will happen next. When the future turns out differently because of unpredictable events, our reaction is often to say, “I knew it all along.”
Hindsight bias is an unconscious cognitive tendency to overestimate, after the fact, our ability to have predicted an outcome. People say they foresaw something when they did not. Hindsight bias distorts our memory of what we actually knew before the event occurred. When we think we “got it right” based on a false or distorted memory, we confidently entertain the notion that we can predict the outcome of future events. Stockbrokers, sporting event gamblers and any winner who attributes their success to skill while ignoring the role of luck are all victims of the bias.
In a shifting and unstable world, the idea that we knew it all along helps to restore coherence. Once we look back to learn and understand the reasons and causes for an event, the hindsight bias updates our memory of what we knew at the time.
In the early 2000s it was common for investors to say the tech bubble was going to burst. They had no idea when it was going to happen, and nearly everyone kept their investments in equities, which continued to climb. After the market crashed, everyone could recite the problems the market faced and explain how they knew it was going to happen. Just like nearly everyone else, they sustained heavy losses. In the aftermath, many believed they understood why it happened. Hindsight bias overcame the reality they had lost nearly half their portfolios by not anticipating the precipitous drop. Telling themselves they knew the market would crash fails to address why they left their assets exposed to the severe downturn.
It is in the subsequent examination of why an enterprise failed that the shortcut is taken. Even though the last case ended poorly, because we tell ourselves we understand why, hindsight bias leads us to confidently and optimistically take the next one. Instead of acknowledging the future is uncertain, the intuitive brain prefers the stability of a predictable world in which we understand why things happen.
An essential part of making good decisions in our personal and professional lives is having realistic assessments of the future. If we fail to learn from our experiences, our forecasts are often misplaced and wrong. Since we don’t reflect on why we missed the unanticipated factors in an outcome, we never understand why our past predictions went wrong. We have an unconscious tendency to rely only on evidence that is consistent with our existing beliefs, setting aside what is not. This approach unintentionally causes us to minimize or overlook data that was available at the time of the decision but is inconsistent with our current beliefs about why we acted as we did. We miss the cues that led to the actual result, the ones we failed to notice at the time, and gain no insight from our miscues.
To anticipate and avoid hindsight bias, we can ponder the two or three different outcomes that could have been predicted before the event occurred. This will help remind us how difficult and unpredictable the decision really was at the time. Even more helpful is to locate a written record, diary or electronic correspondence discussing the factors that were actually considered when a decision was being made. Much as we might like to say we predicted an outcome, our judgment about taking future steps will be more solidly based on an accurate understanding of past failures.
Cultivating a slow-thinking brain
Understanding that bias exists in everyone and cannot always be anticipated or avoided does not cure our tendency. Deliberative and thoughtful reflection is not always available due to the constraints of time and place. Stress, distraction and fatigue can present hurdles to careful thinking. As we hurry through our day, we often allow our intuition to run the show, and occasionally we pay a price. Sometimes it is best to hurry up and slow down. There are strategies to encourage slow, deliberate thinking, many of which are outlined in the recent book, Noise, by Daniel Kahneman, Olivier Sibony and Cass Sunstein (2021, William Collins, ISBN 978-0-00-830899-5).
Giving ourselves permission to slow down is the key. Allowing ample time for complex, subjective, multifaceted decisions should be our rule, because sufficient time improves accuracy. Remind yourself to be careful, instead of jumping to conclusions or relying on intuition. If we can recognize when we are in a mixed emotional state, stressed or cognitively depleted, we can delay important decisions until we have returned to our baseline.
Accuracy can be increased by providing more time for tasks that are open to bias. The ability and willingness to be careful and alert is required. We may even want to give ourselves an instruction to slow down and take care. For important decisions you regularly make, use a checklist to help guide decision making. Explicitly noticing the potential for bias is the best way to counter it. Deliberation should come before decisiveness.
For important decisions, examining the issue with an “outside view” should identify an appropriate reference class or group of similar cases for the topic being studied. If the problem at hand can be put in a larger category and becomes part of a reference class, it allows for comparisons and categorizations, paving the way for a statistical (objective) analysis. Rather than rely on one previous case as a likely roadmap, think about five or ten similar cases that you or your firm have handled. The comparisons will provide data from a class of cases that encourages an objective, outside view.
When the decision maker is able to consider average results from a reference class, statistical outcomes can provide a basis upon which to make a meaningful decision or prediction for a new case. A purposeful and objective analysis brought to your case will encourage the slow-thinking brain to make adjustments as new information is brought to light, instead of the fast-thinking brain’s intuitive discounting of new facts not supportive of the initial mental model.
Breaking up complex problems
Another deliberative practice to encourage slow thinking is to decompose complex problems into smaller questions or independent steps before reaching a conclusion. Decomposition requires creating several smaller questions, which Kahneman labels “mediating assessments,” each of which can be evaluated individually and independently of the whole. The process begins with identifying the complex issues of the case, followed by an evaluation of each using a grading or rating standard based on percentages. Appraising each mediating assessment independently will trigger the conscious reflection of the slow-thinking brain.
Each assessment answers just one question after the problem is decomposed from the whole. Once the separate assessments are scored, the decision makers can ask how each answer argues for or against the question. In a group, each assessor should be careful to refrain from general comments about the overall issue and avoid summary narratives or comprehensive conclusions.
Consider the best practices for your decision process and design guardrails to avoid bias. Absent structure and consideration in the decision-making process, our judgment will yield less predictive value than it should. As discussed above, if we hurry and fail to allocate sufficient time and mental resources to the evaluation, various cognitive tendencies will naturally and automatically come into play. By welcoming contradictions, and committing to updating, reconsidering and constantly improving our decision-making process, cognitive errors from reliance on intuition and mental shortcuts can be minimized and better decisions will follow.
Understanding that fast, intuitive thinking combines a sense of cognitive ease with illusions of truth, we can begin to resist gut decisions for important questions and avoid the errors that may follow. We can pause to look more deeply at the facts and evidence. Slow, deliberative thinking is occasionally tedious, but typically yields the best results. It is likely your clients and your practice will benefit from the extra effort.
Ernie Long
Ernie Long has over 30 years of experience in alternative dispute resolution, serving as a mediator, arbitrator, and settlement conference judge pro tem in most California venues. In the last 12 years he has mediated well over 2,000 disputes involving a broad range of civil litigation, including personal injury and wrongful death, business litigation, professional malpractice, probate, real estate, employment disputes, property damage, habitability, farming and ranching claims, partnership disputes, and public entity liability. He has been A-V Rated by Martindale-Hubbell since 2003, and is a member of the Sacramento Valley Chapter of ABOTA. ErnestALongADR.com.
Copyright © 2025 by the author.
For reprint permission, contact the publisher: www.plaintiffmagazine.com