Wisconsin Lawyer
Vol. 78, No. 12, December 2005
Improving the Odds of Success:
Quantitative Methodology in Law Practice
Quantitative methodology has many applications for lawyers in all
practice areas - from determining whether to accept a particular client,
to weighing the value of a case, to determining strategy in terms of
possible outcomes, to proving facts, and more. Many business, medical, and
other sophisticated clients already use these methodologies. Now you can
"speak" their language to improve your odds of success.
by Ramesh C. Sachdeva & Daniel D. Blinka
Most attorneys did not select law school
because of strong mathematics backgrounds, yet quantitative methodology
(methods to measure data) is becoming increasingly important in diverse
areas of law practice. For decades graduate and business schools have
featured, if not required, quantitative methodology in the social
sciences, business, finance, and economics. Yet overall, the legal
profession and the nation's law schools have lagged behind, even though
many clients fluently "speak" the language of quantitative methodology
and an increasing number of legal problems beg for its application.
This article surveys some of the current uses of quantitative
methodology in law and projects several potential applications while
providing a basic introduction to the subject. From the legal
perspective, quantitative analysis is much more than a species of expert
evidence at trial. And from a mathematics perspective, it is far more
than statistics. Although its content and form certainly differ from the
analogical reasoning that is the mainstay of modern legal education,
quantitative analysis ultimately complements and enhances traditional
legal analysis. We will begin by examining how quantitative
methodologies have been used thus far in Wisconsin case law and, in the
second half of the article, explore their potential for use in analyzing
a variety of client issues.
What are the practical applications of quantitative methodology for
lawyers? First, and most familiar, a wide variety of quantitative
methodologies, particularly statistical analyses, are used to prove such
"facts" as employment discrimination, negligence, product defects, and
causation. Second, a lawyer who understands these methodologies is
better able to communicate not only with experts retained to testify or
to provide advice but also with clients who regularly employ
quantitative analyses. For example, decision tree analysis facilitates
an objective evaluation of risks, outcomes, and expense whether the
client is contemplating litigation or a pending transaction. And
although its name connotes entertainment, game theory provides
objective, seriously sophisticated tools that enhance an understanding
of issues in complex litigation (or transactions) and promote rational
settlement decisions. The remainder of this article explores the uses of
quantitative methodology in existing law and outlines future
applications.
State and federal case law reflects the fits and starts by which
lawyers have struggled to employ quantitative methodologies in a host of
settings. In criminal cases one finds statistical analyses being used to
identify (or exclude) suspects using DNA testing or, in a more troubling
case, to calculate the "odds" that three young children died of natural
causes while entrusted to the care of an allegedly homicidal
babysitter.1 Despite its key role in some
criminal prosecutions, quantitative methodology's fullest potential is
most likely to emerge in civil litigation, which features a breathtaking
diversity of issues, greater resources of (some) parties, and
availability of extensive discovery mechanisms. Its promise as a potent
source of proof is evident in a brief survey of cases.
Daniel D. Blinka, U.W. 1978 cum laude,
Ph.D. (American History), is a professor of law at Marquette University
Law School, where he teaches and writes in the areas of evidence, trial
practice, history, and criminal law, and coteaches a course in
quantitative methodology with Dr. Sachdeva. Blinka is the coauthor of
the Wisconsin Supreme Court and Court of Appeals Digests that appear
monthly in Wisconsin Lawyer.
Dr. Ramesh C. Sachdeva, Marquette 2003 cum
laude, is vice president for quality and outcomes at the Children's
Hospital of Wisconsin. He has been a practicing faculty physician for 11
years and is board certified in pediatrics and pediatric critical care
medicine. He is the Marquette University Law School Boden Research
Fellow and an adjunct law professor, teaching courses in quantitative
methodology and health care fraud. He has a Ph.D. (Epidemiology) from
the University of Texas School of Public Health, is the recipient of
many national grants, and has received funding from the National
Institutes of Health for outcomes research on game theory applications
in health care. He is a frequent author and lecturer at many
national and international forums.
Comparative Risk Evidence
Several cases have raised issues regarding the admissibility of
comparative risk evidence in tort actions. Although seemingly pointed in
different directions, the cases underscore the importance of carefully
scrutinizing the relevance and helpfulness at trial of quantitative
methodologies.
Bittner v. American Honda2
involved a plaintiff who sustained serious injury when his Honda
all-terrain vehicle (ATV) rolled over. Honda introduced a staggering
variety of comparative risk evidence in an effort to prove the ATV's
relative safety. The Wisconsin Supreme Court upheld the admissibility of
statistical evidence showing the risk of injury associated with the use
of other recreational vehicles, including snowmobiles, trailbikes, and
minibikes - that is, similar products. Reversible error occurred,
however, when Honda's expert "compared the risk of injury and death
associated with ATVs to the risk of injury and death associated with
products and activities including skiing, bicycle riding, scuba diving,
football, and passenger automobiles."3
Although the court did not confront head-on the admissibility of
quantitative methodologies, it concluded that risk analysis of
"dissimilar products and activities" was unfair, inappropriate, and
should have been excluded. In short, the Bittner court applied
traditional standards governing "similar accident" evidence to the
quantitative risk assessment developed by Honda's expert, resulting in
an uneasy blend of new methodology and well-worn doctrine.
Johnson v. Kokemoor,4 decided
one year later, upheld the use of comparative risk evidence and
illustrates how quantitative "thinking" in other professions and
practices affects the law. Johnson sued her doctor, Kokemoor, for
failing to inform her adequately of the risks associated with surgery to
remove a brain aneurysm. A jury returned a verdict in Johnson's favor,
finding that "a reasonable person in the plaintiff's position would have
refused to consent to surgery by the defendant if she had been fully
informed of its attendant risks and advantages."5 On appeal Kokemoor argued that the circuit court
erred by admitting evidence of his limited experience in performing this
type of operation, which he had failed to disclose fully, and a
comparison of the "morbidity and mortality rates for this type of
surgery among experienced surgeons and inexperienced surgeons like
himself[.]"6 Specifically, Kokemoor told
Johnson that the "risks associated with her surgery were comparable to
the risks attending a tonsillectomy, appendectomy or gall bladder
operation." To underscore the point, he placed the "risk of death or
serious impairment associated with her surgery at two [2] percent." Yet,
although Kokemoor pegged the risk of a bad outcome at 2 percent, the
very medical studies he had relied on "reported morbidity and mortality
rates of fifteen [15] percent" for even the most accomplished surgeons,
and other evidence fixed the rate at 30 percent when the surgery was
performed by a doctor of Kokemoor's limited experience.7
The Wisconsin Supreme Court, in a decision written by Chief Justice
Abrahamson, upheld the admissibility of evidence concerning Kokemoor's
limited experience and the relative risks of morbidity and mortality.
Cautioning that informed consent cases are necessarily "fact-driven and
context-specific," the court stopped short of "always requir[ing]
physicians to give patients comparative risk evidence in statistical
terms to obtain informed consent." Nonetheless, "[t]he fundamental issue
in an informed consent case is less a question of how a physician
chooses to explain the panoply of treatment options and risks necessary
to a patient's informed consent than a question of assessing whether a
patient has been advised that such options and risks exists."8 Kokemoor himself had "elected to explain the risks
confronting the plaintiff in statistical terms" because, as he
explained, "numbers giv[e] some perspective to the framework of the very
real, immediate, threat that is involved with this condition." And
having elected to present the risks in statistical terms, Kokemoor could
not complain when the plaintiff demonstrated that Kokemoor had
"dramatically understated" those risks by also using statistical
evidence.9
Bittner and Kokemoor teach that as powerful as
quantitative methodology may be as an analytic tool outside the
courtroom, when offered as proof at trial it must conform to the
gatekeeping standards of evidentiary rules. Bittner effectively
holds that comparative risk evidence about dissimilar products is unfair
and confusing, yet invites its use in situations in which similarity can
be established. Put differently, the supreme court broadly approved the
quantification of "other accident (acts)"-type evidence.
Kokemoor, the more compelling of the cases, illustrates that
when parties themselves have elected to rely on quantitative assessments
in the underlying event (here, the risk of surgery), discussion at trial
of the quantitative assessments is also unavoidable, although such
evidence must be channeled through the gates regulating expert testimony
and relevancy. And in order to meet these standards, the lawyer must
have a firm understanding of the underlying methodologies.
Toxic Torts and Discrimination Cases
Quantitative analysis often plays a central role in litigation over
environmental hazards and toxic torts. At the federal level, the U.S.
Supreme Court's much-discussed, path-breaking decision in Daubert v.
Merrell Dow Pharmaceuticals involved the admissibility of
epidemiological evidence to prove that a drug had caused the birth
defects at issue.10 When direct evidence of
a causal link between a product (for example, a drug or toxin) and an
injury (for example, neurological damage) is missing, epidemiology (the
quantitative analysis of disease patterns in human populations) is often
used to bridge the gap with the use of statistical analyses, such as the
"standardized mortality ratio" (SMR).11 In
assessing the sufficiency of epidemiological evidence to establish
causation, courts closely scrutinize the strength and consistency of the
purported association along with its "coherence" (the elimination of
other factors), statistical analyses that usually demand expert
consultation.12
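The intuition behind the courts' focus on ratios above 2.0 (see note 11) is a simple piece of arithmetic: if the exposed group experiences more than twice the expected number of cases, more than half of the observed cases are statistically attributable to the exposure. The sketch below uses hypothetical counts and ignores complications such as confounding and confidence intervals; it is an illustration of the arithmetic, not a template for actual expert analysis.

```python
# Hypothetical counts illustrating why an SMR above 2.0 is treated as
# crossing the "more likely than not" line: when observed cases are more
# than double the expected background cases, the fraction of cases
# attributable to the exposure exceeds one-half.

observed_cases = 30   # cases observed in the exposed population (made-up number)
expected_cases = 12   # cases predicted from background rates (made-up number)

smr = observed_cases / expected_cases       # standardized mortality ratio = 2.5
attributable_fraction = (smr - 1) / smr     # share of cases attributable to exposure

print(f"SMR = {smr:.2f}")
print(f"Attributable fraction = {attributable_fraction:.0%}")  # 60%, i.e. more likely than not
```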
Cases featuring comparative risk analysis or epidemiological proof
present seemingly exotic species of litigation, but employment
discrimination cases vividly illustrate a body of legal doctrine that
has embraced (or, depending on one's view, been co-opted by)
quantitative methodology, particularly statistical analyses.
Quantitative analysis is the norm, not the rare exception, in many types
of employment cases and often opens the way for resolution at the
summary judgment stage. In recent litigation involving claims of racial
discrimination against United Parcel Service (UPS), the Eighth Circuit
upheld summary judgment in favor of UPS largely because of the
shortcomings of the plaintiffs' statistical evidence.13 For example, the court held that when the
plaintiffs relied solely on regression analyses to prove discrimination
in salary, those analyses must "show a gross statistical disparity and
this must be a proper case - a case in which the gross disparity can
give rise to a reasonable inference that paying blacks less because they
are black is UPS's standard operating procedure." The plaintiffs'
regressions fell short of the mark because they failed to adequately
consider past pay and performance as well as adequately account for
other explanatory variables.14
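The court's insistence that a regression adequately account for other explanatory variables reflects a basic statistical pitfall: omitting a legitimate determinant of pay that happens to be correlated with the protected characteristic can make a lawful pay structure look discriminatory. The following sketch uses simulated data that have nothing to do with the actual UPS litigation; the variable names and numbers are invented for this example.

```python
# A purely hypothetical illustration of omitted-variable bias in a salary
# regression. In the simulated data, pay depends only on experience, but
# experience happens to differ across groups, so a regression that omits
# experience shows a large (spurious) group effect.

import numpy as np

rng = np.random.default_rng(0)
n = 2_000

group = rng.integers(0, 2, n)                       # 1 = protected class (hypothetical)
experience = rng.normal(10 - 2 * group, 3, n)       # experience distributed differently by group
salary = 40_000 + 3_000 * experience + rng.normal(0, 5_000, n)   # pay depends on experience only

def ols_coef(X, y):
    """Ordinary least squares coefficients, with an intercept column added."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

naive = ols_coef(group.reshape(-1, 1), salary)                  # experience omitted
full = ols_coef(np.column_stack([group, experience]), salary)   # experience included

print(f"Group coefficient, experience omitted:  {naive[1]:,.0f}")   # large negative 'gap'
print(f"Group coefficient, experience included: {full[1]:,.0f}")    # near zero
```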
The examples could be easily extended. The point is simply that
quantitative analyses are a potentially powerful source of evidence that
is supported by case law. Yet beyond the case law and problems of proof,
quantitative analysis promises to assist lawyers and clients in more
tangible ways. In the next section of the article, we survey several
well-established quantitative methods and suggest how they may assist
lawyers.
Decision Trees and Game Theory
As discussed earlier in this article, the application of quantitative
methods in law is more than evidence and statistics. Quantitative
methods affect the development of optimal strategies and facilitate
decision making in the face of uncertainty.
Modern American businesses have shown renewed interest in applying the
writings of ancient military philosophers to the development of
contemporary organizational strategy and tactics.15 This new direction is
based on the premise that although "[i]ntuition plays an important role
in decision making, ... it can be dangerously unreliable in
complicated situations."16 It has been
argued that decisions that deal with complex problems involving
evaluation of many choices and consequences exceed the ability of a
human mind to process such choices.17 This
results in the decision maker erroneously relying on simplified choices
and relying on intuition.18 It has been
further argued that relying on one's intuition is unreliable in complex
situations.19 Therefore, it is important
that the "intuitive capabilities" of the decision maker be supplemented
with "computational decision-support tool[s]."20
Like business people who must make strategic decisions and choices in
the face of uncertainty, lawyers also make strategic decisions and
choices with limited information and with an element of uncertainty.
Decision tree analysis can be used to facilitate a variety of day-to-day
decision making by lawyers both in the litigation and transactional
areas. Decision tree analysis has been applied to objectively assess the
risks and costs associated with a case, to estimate contingency fees,
and to assist with the development of litigation strategy.21
An example of a simple decision tree to facilitate decision analysis
is illustrated in Figure
1. As shown in this figure, the decision maker has to make a choice
between two options - Choice 1 or 2 at the decision node (represented in
the decision tree as a square). Each choice is associated with one of
two likely outcomes (Outcomes I - IV), with associated probabilities
(p). Probability is the likelihood of an event occurring, with 1.0
representing 100 percent certainty of the event occurring and 0.0
representing 0 percent chance of that event occurring. A chance node
(represented in the decision tree as a circle) marks a point at which the
outcome is determined by chance rather than by the decision maker; the two
branches stemming from the chance node are called chance
branches. Finally, the terminal node (represented in the decision tree
as a triangle) measures the value (or payoff) of a final outcome. All
the possible outcomes from a chance node must be illustrated in the
tree, and accordingly, the sum of the probabilities for each set of
chance branches must be 1.0. There are several decision analytic
software programs available for constructing decision trees. Data 3.5,
developed by TreeAge Software Inc., was used to develop the decision
tree in Figure 1. The software allows rolling back the tree - that is,
computing the expected value, which is the average payoff that could be
expected in repeated decisions in similar fact scenarios.
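To make the rollback mechanics concrete, the sketch below shows one way the computation could be expressed in a few lines of Python. The tree mirrors the generic layout described above (a decision node, two chance nodes, four terminal outcomes), but the probabilities and payoffs are illustrative values invented for this example, not the actual Figure 1 numbers, and the code is not a description of how any particular commercial package works.

```python
# A minimal decision-tree rollback: decision nodes (squares) take the best
# branch, chance nodes (circles) take the probability-weighted average of
# their branches, and terminal nodes (triangles) hold known payoffs.

def rollback(node):
    """Compute a node's expected value by rolling back the tree."""
    kind = node["type"]
    if kind == "terminal":                     # payoff is known
        return node["value"]
    if kind == "chance":                       # probability-weighted average
        assert abs(sum(p for p, _ in node["branches"]) - 1.0) < 1e-9, \
            "probabilities on a chance node must sum to 1.0"
        return sum(p * rollback(child) for p, child in node["branches"])
    if kind == "decision":                     # take the best available choice
        return max(rollback(child) for child in node["choices"])
    raise ValueError(f"unknown node type: {kind}")

# Figure 1-style structure with made-up probabilities and payoffs.
tree = {
    "type": "decision",
    "choices": [
        {"type": "chance", "branches": [                  # Choice 1
            (0.6, {"type": "terminal", "value": 100}),    # Outcome I
            (0.4, {"type": "terminal", "value": 20}),     # Outcome II
        ]},
        {"type": "chance", "branches": [                  # Choice 2
            (0.3, {"type": "terminal", "value": 150}),    # Outcome III
            (0.7, {"type": "terminal", "value": 10}),     # Outcome IV
        ]},
    ],
}

print(rollback(tree))  # expected value of the best choice: 68.0
```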
Decision analysis provides a systematic framework to facilitate
decision making when one is dealing with uncertain factors.22 The decision analytic approach is explicit,
quantitative, and prescriptive.23 The
process forces the decision maker to analyze a complex problem in small
components and then to analyze the full problem in a meaningful
manner.24 Furthermore, because the process
is quantitative, the decision maker has to explicitly quantify the value
attributed to choices.25
Figure
2 illustrates a hypothetical scenario in which a lawyer has to
recommend to a client whether or not to settle a case. As illustrated in
Figure 2, the decision involves two choices - litigate or settle. In
this hypothetical, the plaintiff can accept the $500,000 settlement
offer right away or litigate the case, which will likely take one more
year before the litigation is complete. It is estimated that juries in
this jurisdiction have awarded $1 million for similar cases and the
likelihood of winning the case in a jury trial is 70 percent. Based on
previous experience with the opposing party, it is predicted that there
is almost an 80 percent chance of the party appealing the decision if
the jury awards $1 million. Further, based on expert opinion, the chance
of winning an appeal is about 60 percent. It is further estimated that
trial costs of $100,000 will likely be incurred.
Figure
3 shows the results from rolling back the decision tree.
The expected value of settling the case ($500,000) is slightly higher
than the expected value of litigating it ($476,000). Once the trial costs
are factored in, the advantage of settling becomes even greater. Also,
because the net present value (NPV) of compensation received in the
future is less than the value of the same compensation received today,
the strategy to settle is the preferred approach.
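For readers who want to check the arithmetic, the rollback behind these figures can be reproduced in a few lines. The sketch below encodes the hypothetical as described in the text and assumes, as the tree implies, that losing at trial or on appeal yields nothing; the discount rate used for the NPV line is an invented, illustrative figure, and the code is an illustration of the method rather than a reproduction of the published Figures 2 and 3.

```python
# Rolling back the settle-or-litigate hypothetical (all amounts in dollars).

SETTLEMENT   = 500_000
JURY_AWARD   = 1_000_000
P_WIN_TRIAL  = 0.70    # chance of a plaintiff's verdict
P_APPEAL     = 0.80    # chance the defendant appeals a $1 million verdict
P_WIN_APPEAL = 0.60    # chance the verdict survives the appeal
TRIAL_COSTS  = 100_000
DISCOUNT_RATE = 0.05   # assumed annual discount rate for the one-year delay (illustrative)

# Value of a plaintiff's verdict after accounting for the appeal risk
# (a loss at trial or on appeal is assumed to yield nothing).
value_if_win = (1 - P_APPEAL) * JURY_AWARD + P_APPEAL * P_WIN_APPEAL * JURY_AWARD

ev_litigate = P_WIN_TRIAL * value_if_win               # = 476,000
ev_litigate_net = ev_litigate - TRIAL_COSTS            # = 376,000
npv_litigate = ev_litigate_net / (1 + DISCOUNT_RATE)   # award arrives about a year later

print(f"Settle now:            ${SETTLEMENT:,.0f}")
print(f"Litigate (gross EV):   ${ev_litigate:,.0f}")
print(f"Litigate (net of costs): ${ev_litigate_net:,.0f}")
print(f"Litigate (NPV):        ${npv_litigate:,.0f}")
```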
Decision tree analysis enhances intuitive decisions in a number of
ways. First, the visual nature of the decision tree analysis can be used
effectively to facilitate discussions with clients to evaluate options.
Second, by explicitly requiring the assessment of probabilities, the
lawyer and client are forced to assess the risks associated with
individual components within a decision. Third, by changing the
probabilities within the decision tree and conducting a sensitivity
analysis (as sketched below), the lawyer can efficiently evaluate the
impact of a change in risk within
a specific component of a larger decision. Finally, a decision tree
analysis in complex scenarios with multiple options can be invaluable to
assist intuitive decision making.
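The sensitivity analysis mentioned above can be illustrated with the same hypothetical: hold the other assumptions fixed, vary the probability of a plaintiff's verdict, and observe where litigating begins to outperform the $500,000 offer. The probabilities and amounts remain the invented figures from the example above.

```python
# Sensitivity analysis on the probability of winning at trial, using the
# same hypothetical tree inferred from the text (illustrative numbers only).

SETTLEMENT   = 500_000
JURY_AWARD   = 1_000_000
P_APPEAL     = 0.80
P_WIN_APPEAL = 0.60
TRIAL_COSTS  = 100_000

def ev_litigate(p_win_trial):
    """Expected value of litigating, net of trial costs."""
    value_if_win = (1 - P_APPEAL) * JURY_AWARD + P_APPEAL * P_WIN_APPEAL * JURY_AWARD
    return p_win_trial * value_if_win - TRIAL_COSTS

for p in (0.5, 0.6, 0.7, 0.8, 0.9, 1.0):
    better = "litigate" if ev_litigate(p) > SETTLEMENT else "settle"
    print(f"p(win at trial) = {p:.1f} -> EV(litigate) = ${ev_litigate(p):,.0f}  ({better})")
```

Under these assumptions the break-even point falls at roughly an 88 percent chance of a plaintiff's verdict; below that, the settlement remains the better expected outcome.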
Decision analysis is not limited to decision tree analysis.
Management science or operational research (OR) is a distinct scientific
field that aims to facilitate decision making using a scientific,
logical, and rational paradigm. Stemming from concepts emerging in game
theory, OR gained increasing popularity during World War II, when
mathematical approaches within OR were successfully used to identify
optimal options and strategic choices.25
OR has been used successfully in a variety of service and
manufacturing industries. Traditionally, OR included hard (more
quantitative) techniques such as system dynamics, queuing theory, and
simulation.26 Recently, there has been a
shift toward using soft (more qualitative) OR, such as cognitive mapping
and soft system methodology, to better assist with problem formulation
and mapping the relationships between various constructs.27 For example, a combination of system dynamics
and cognitive mapping was successfully used for "decision support"
during the litigation and settlement of a claim, with a potential value
of 3.5 billion French francs, arising from the alleged disruption and
delay in constructing the "channel tunnel link between England and
France."28 (Please see the accompanying
sidebar, "Definitions of Operational Research (OR) Terms.")
A combination of hard and soft OR methods has been shown to enhance
the decision maker's ability to use results emerging from statistical
and outcomes analyses, producing greater understanding and acceptance of
the findings of such decision support models even among individuals who
are not familiar with the underlying scientific methods.29 In other
words, decision analytic models can be understood and adopted without
specialized knowledge of the scientific and mathematical fields.30
Quantitative methods are increasingly being used in health care. As
illustrated in Figure
4, the Institute of Medicine identified six dimensions of health care
quality. A significant challenge remains, however, in how to quantify and
measure these dimensions, an issue that is a major focus of many
national initiatives. At the Children's
Hospital of Wisconsin in Milwaukee, these six dimensions have been
operationalized (implemented in specific clinical settings) to
quantitatively measure the quality and outcomes of health care as
illustrated in Figure 4 (these findings have been presented at several
national meetings across the U.S. during 2004-05). The important aspect
of this quantitative application is that it allows a link to the
Plan-Do-Study-Act Quality Improvement cycle, which has widespread
acceptance in hospitals. Indeed, the cycle has been adopted by the
Institute for Healthcare Improvement because it measures, among other
things, improvement in health care quantitatively over time.
Quantitative methods will undoubtedly find new applications in a
variety of previously unexplored areas within the law. To take one final
example, a recent, provocative article contends that computer models can
be used to map U.S. Supreme Court opinions over time with the objective
of identifying a "main core" of precedents that dominate certain legal
issues. Such mapping, some people speculate, may one day yield Ronald
Dworkin's "Hercules," an ideal judge with perfect knowledge of every
decided case.31
Conclusion
Case law recognizes the potential of quantitative methodology as a
means of proof at trial, yet the potential remains largely untapped.
Once lawyers familiarize themselves with this type of evidence, its
usefulness and power will be more fully appreciated. Yet quantitative
methodology is not confined to litigation. Game theory and decision tree
analysis are ways of thinking commonly used by other professionals in
their day-to-day practice of medicine or business. For that very
reason, practicing lawyers will find these methods an enormously
effective means of
communicating with clients about complex transactions or litigation
strategies.
1State v. Peters, 192 Wis.
2d 674, 534 N.W.2d 867 (Ct. App. 1995) (court admitted population
statistics involving a DNA match); State v. Pankow, 144 Wis. 2d
23, 422 N.W.2d 913 (Ct. App. 1988) (court allowed evidence that
quantified "odds" of three children dying of natural causes while in the
care of an unrelated person).
2Bittner v. American Honda
Motor Co., 194 Wis. 2d 122, 533 N.W.2d 476 (1995).
3Id. at 150-51.
4Johnson v. Kokemoor, 199
Wis. 2d 615, 545 N.W.2d 495 (1996).
5Id. at 620-21.
6Id. at 621.
7Id. at 644.
8Id. at 646-47.
9Id. at 647.
10Daubert v. Merrell
Dow Pharm. Inc., 509 U.S. 579 (1993).
11Phillip Good, Applying
Statistics in the Courtroom: A New Approach for Attorneys and Expert
Witnesses 97-98 (2001). According to Good, a standardized mortality
ratio (SMR) of 1.0 is the expected incidence of disease regardless of
exposure to the suspected toxin; an SMR of 2.0 means that the toxin was
as likely as not to have caused the damage, and an SMR greater than 2.0
means that the toxin was more likely than not the cause.
12Id. at 100-03.
13Morgan v. United Parcel
Serv. of Am., Inc., 380 F.3d 459 (8th Cir. 2004), cert.
denied, 125 S. Ct. 1933 (2005).
14Id. at 469-70.
15See Sun Tzu, Art
of War 18 (Ralph D. Sawyer trans., Westview Press 1994).
16See Eric Bonabeau,
Don't Trust Your Gut, Harv. Bus. Rev., May 2003, at 116.
17Id. at 121.
18Id.
19Id.
20Id.
21See Marc B. Victor,
Articles Authored by Marc B. Victor, Esq., available at
http://www.litigationrisk.com (last visited Sept. 12, 2005) (indicating
the use of decision tree analysis for evaluating legal risks and costs,
minimizing guesswork in contingency fee proposals, choosing an optimal
fee arrangement, determining how much a case is worth, and assisting
with litigation strategy).
22See, e.g.,
Milton C. Weinstein et al., Clinical Decision Analysis 3 (W.B.
Saunders 1980).
23Id.
24Id.
25See, e.g.,
Fran Ackermann, Colin Eden, & Terry Williams, Modeling for
Litigation: Mixing Qualitative and Quantitative Approaches,
Interfaces, 27:2, 48 (1997).
26See id. at 49.
27Id.
28Id. at 49-50.
29See Ramesh C.
Sachdeva, Mixing Operational Research Methodologies to Achieve
Organizational Change - A Study of the Pediatric Intensive Care
Unit (2005) (unpublished Doctor of Business Administration (D.B.A.)
thesis, University of Strathclyde, Glasgow, U.K.) (on file with the
University of Strathclyde Library).
30See id.
31See Statistical
Modeling: The Wisdom of Hercules - Using Computer Models to Identify
American Jurisprudence, The Economist, Aug. 27, 2005, at 65.