We have assembled a set of influential and seminal papers relevant to the quantification of the weight of evidence. We invite suggestions to expand this list. Please send your suggestions to tc.forensicweightofevidence [at] nist.gov.
Background
Parker (1966) and Evett (1977) are among the early papers on statistical methods for quantifying the weight of forensic evidence. Their methods test the hypothesis that two sets of glass fragments come from the same source against the hypothesis that they come from different sources. A classic paper by Lindley (1977) gives a Bayesian viewpoint on the ratio of the probability of the evidence under the same-source hypothesis to its probability under the different-source hypothesis. This ratio, which also equals the ratio of the posterior odds to the prior odds, is referred to as a Bayes factor, or likelihood ratio. The likelihood ratio provides a quantitative approach to weighing the evidence and arriving at the posterior odds of the same source versus different sources; if the prior odds can be assumed to be 1, the likelihood ratio equals the posterior odds. It weighs the probability of the evidence under the prosecution's viewpoint against its probability under the defense's viewpoint. The likelihood ratio has gained popularity since Lindley's paper. As a valuable tool, it has been applied to matching glass fragments by Aitken and Lucy (2004), who account for multivariate elemental ratios within the fragments, and it has since been extended to source-matching and classification in other forensic disciplines such as hair, fiber, DNA profiling, handwriting, and fingerprinting. Peabody (1983) and Aitken (1986) discuss the use of the likelihood ratio in discriminating animal hair samples with parametric and nonparametric estimation techniques. In Evett et al. (1987), the numerator and denominator are calculated under the hypotheses that a fiber sample comes from the same source as a control sample or from a different source. Berry (1991) and Berry et al. (1992) investigate the likelihood ratio of guilt versus innocence by incorporating DNA information.
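The relationships described above can be written compactly in the odds form of Bayes' theorem. Writing $H_p$ and $H_d$ for the prosecution (same-source) and defense (different-source) hypotheses and $E$ for the evidence (notation chosen here for illustration):

```latex
\[
\underbrace{\frac{\Pr(H_p \mid E)}{\Pr(H_d \mid E)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}}_{\text{likelihood ratio}}
\;\times\;
\underbrace{\frac{\Pr(H_p)}{\Pr(H_d)}}_{\text{prior odds}}
\]
```

When the prior odds equal 1, the middle factor is all that remains, which is why the likelihood ratio then coincides with the posterior odds.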
Weir (2007) presents recent advances in likelihood ratio estimation for DNA profiling in forensics and discusses potentially complex statistical issues related to families and populations. Jackson et al. (2006) present a general view of how to use prior odds, the likelihood ratio, and the resulting posterior odds in court proceedings.
Bozza (2008) constructs the likelihood ratio using multivariate continuous variables related to handwriting. Saunders et al. (2011) build classifiers based on categorical types of handwriting, and Hepler et al. (2012) develop a method to calculate the likelihood ratio from similarity scores of handwritten documents. Fingerprinting for individualization to a specific suspect has been found to be more complex than DNA profiling (Stoney, 1991). Neumann et al. (2012) develop a likelihood ratio approach in fingerprinting based on multivariate minutiae attributes; the method mainly aims at identifying fingerprints from the same finger. Neumann et al. (2011) consider a similar approach for ten fingers.
General
Trial By Mathematics: Precision and Ritual in the Legal Process
Laurence H. Tribe
Harvard Law Review, Vol. 84, No. 6, April 1971
Lay Understanding of Forensic Statistics: Evaluation of Random Match Probabilities, Likelihood Ratios, and Verbal Equivalents
William C. Thompson and Eryn J. Newman
Law and Human Behavior, Vol. 39, No. 4, Aug 2015, 332–349
Royal Statistical Society Guidelines
The guides look at communicating and interpreting statistical evidence in the administration of criminal justice. They are intended to assist judges, lawyers, forensic scientists and other expert witnesses in coping with the demands of modern criminal litigation.
The Nature of Forensic Science Opinion – A Possible Framework to Guide Thinking and Practice in Investigation and in Court Proceedings
Graham Jackson, Christophe Champod, Ian W Evett
Advance Forensic Science, Dundee, UK
Fingerprint Source Book
U.S. Department of Justice Office of Justice Programs
Statistical Methods
Equal prior probabilities: Can one do any better?
A. Biedermann, F. Taroni, P. Garbolino
Forensic Science International 172 (2007) 85–93
Traditional conclusions in footwear examinations versus the use of the Bayesian approach and likelihood ratio: a review of a recent UK appellate court decision
William J. Bodziak
Law, Probability and Risk (2012) 11, 279–287
Inadequacies of posterior probabilities for the assessment of scientific evidence
Taroni F.
Law, Probability and Risk (2005) 4, 89–114
Measuring the validity and reliability of forensic likelihood-ratio systems
Geoffrey Stewart Morrison
Science and Justice 51 (2011) 91–98
The likelihood-ratio framework and forensic evidence in court: a response to R v T
Geoffrey Stewart Morrison
Forensic strength of evidence statements should preferably be likelihood ratios calculated using relevant data, quantitative measurements, and statistical models – a response to Lennard (2013) Fingerprint identification: how far have we come?
Geoffrey Stewart Morrison and Reinoud D. Stoel
Australian Journal of Forensic Sciences, 2014, Vol. 46, No. 3, 282–292
Statistics in forensic science
James M. Curran
Assessing uncertainty in DNA evidence caused by sampling effects
J.M. Curran, J. S. Buckleton, C.M. Triggs, B.S. Weir
Confidence Interval of the Likelihood Ratio Associated with Mixed Stain DNA Evidence
Gary W. Beecham, Ph.D. and Bruce S. Weir, Ph.D.
J Forensic Sci, January 2011, Vol. 56, No. S1, doi:10.1111/j.1556-4029.2010.01600.x
Statistical Interpretation
Extending the Confusion About Bayes
Bernard Robertson, G. A. Vignaux and Charles E. H. Berger
Uncertainty and LR: to integrate or not to integrate, that's the question
M.J. Sjerps, I. Alberink, A. Bolck, R. Stoel, P. Vergeer and J.H. van Zanten
Law, Probability and Risk (Oxford Journals)
Dismissal of the illusion of uncertainty in the assessment of a likelihood ratio
Franco Taroni, Silvia Bozza, Alex Biedermann, and Colin Aitken
Law, Probability and Risk (Oxford Journals)
Forensics without uniqueness, conclusions without individualization: the new epistemology of forensic identification
Simon A. Cole
Law, Probability and Risk (2009) 8, 233–255
Discussion paper: Hard cases make bad law—reactions to R v T
William C. Thompson
Law, Probability and Risk (2012) 11, 347–359
Rejoinder
Franco Taroni, Silvia Bozza, Alex Biedermann, and Colin Aitken
Law, Probability and Risk 2016 15: 31-34
The Role of the Subjectivist Position in the Probabilization of Forensic Science
Alex Biedermann
Journal of Forensic Science and Medicine, Vol. 1, Issue 2, July 2015, pp. 140–148