Basically, the book expands on work by the NAS in their 2009 report on the science (or lack thereof) behind forensics commonly used in criminal courtrooms, creating a reference manual for judges that combines the latest scientific assessments with an analysis of the relevant case law governing each technique discussed. Very helpful, and enlightening.
This 1,000-page tome addresses myriad aspects of scientific evidence used in courtrooms, but I thought I'd start with a discussion of the section on "Firearms Identification Evidence," or "ballistics," which has been used as an identifier in court dating back to the 1920s. "In 1923, the Illinois Supreme Court wrote that positive identification of a bullet was not only impossible but 'preposterous.' Seven years later, however, that court did an about-face and became one of the first courts in the country to admit firearms identification evidence. The technique subsequently gained widespread judicial acceptance and was not seriously challenged until recently." (Citations omitted.)
The 2009 NAS report found that "Sufficient studies have not been done to understand the reliability and repeatability of the methods" for matching fired bullets or cartridges with the originating weapon, but the studies that have been done certainly give one pause. Tests in 1978 by the Crime Laboratory Proficiency Testing Program found a 5.3% error rate in one case and a 13.6% error rate in another. Experts evaluating those errors called them "particularly grave in nature." A third test by the same group found a 28.2% error rate.
Later proficiency testing produced lower error rates, but "Questions have arisen concerning the significance of these tests." For starters, only firearms experts in accredited labs participated in the testing, and the studies weren't "blind": participants knew they were being tested. Some proficiency testing even reported zero errors, but in 2006, the US Supreme Court observed, "One could read these results to mean the technique is foolproof, but the results might instead indicate that the test was somewhat elementary."
Then, "In 2008, NAS published a report on computer imaging of bullets" that commented on the subject of identification, concluding that "Additional general research on the uniqueness and reproducibility of firearms-related toolmarks would have to be done if the basic premise of firearms identification is to be put on a more solid scientific footing." That report cautioned:
Conclusions drawn in firearms identification should not be made to imply the presence of a firm statistical basis when none has been demonstrated. Specifically, ... examiners tend to cast their assessments in bold absolutes, commonly asserting that a match can be made "to the exclusion of all other firearms in the world." Such comments cloak an inherently subjective assessment of a match with an extreme probability statement that has no firm grounding and unrealistically implies an error rate of zero. (emphasis in original)

In 1993, the US Supreme Court issued its pivotal Daubert ruling, which prescribed new evidentiary standards for scientific evidence, but it took years for those standards to be rigorously applied to ballistics evidence. "This changed in 2005 in United States v. Green, where the court ruled that the expert could describe only the ways in which the casings were similar but not that the casings came from a specific weapon." A 2008 case held that an expert could not testify that a bullet matched a weapon to a "reasonable scientific certainty," and was only permitted to say that it was "more likely than not" that a bullet came from a particular weapon.
As with other comparative forensic techniques, from fingerprints to bitemarks to microscopic hair examination, what ballistics experts are really saying is, essentially, "After looking at them closely, I think these two things look alike." It strikes this writer that it's quite a big leap from "reasonable scientific certainty" down to "more likely than not." Basically, it's the leap from "beyond a reasonable doubt" to having substantial doubt. I wonder how many past convictions hinged on testimony where experts used phrases like "reasonable scientific certainty" or "to the exclusion of all other firearms in the world"? And I wonder how many times those experts were simply wrong?