These types of problems have led to scandals at dozens of crime labs across the nation, resulting in full or partial closures, reorganizations, investigations or firings at city or county labs in Baltimore; Boston; Chicago; Colorado Springs, Colorado; Dallas; Detroit; Erie County, New York; Houston; Los Angeles; Monroe County, New York; Oklahoma City; San Antonio, Texas; San Diego; San Francisco; San Joaquin County, California; New York City; Nashville, Tennessee; and Tucson, Arizona, as well as at state-run crime labs in Illinois, Montana, Maryland, New Jersey, New York, Oregon, Pennsylvania, Virginia, Washington, North Carolina, West Virginia and Wisconsin, plus the federally run FBI and U.S. Army crime labs. Forensic “expert” scandals have also been reported in the United Kingdom.

It's not that most of the questioned techniques have been invalidated; it's more that expert witnesses overstated their reliability and high error rates were routinely minimized. If, to use an example from last week's Forensic Science Commission Roundtable, the error rate of microscopic hair comparisons runs around 11 percent, courts have historically been too ready to accept expert testimony declaring a forensic "match." Examiners may be right 89% of the time, but used in volume the technique can lead, and has led, to false convictions, including cases later overturned by DNA evidence. Most judges and attorneys, PLN showed, don't understand enough about statistics to comprehend, much less judge, error-rate issues.
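To see why an 11 percent error rate matters more than it sounds, here's a rough Bayes' rule sketch. The donor-pool size and the way the error rate is split into sensitivity and false-positive rate are hypothetical, chosen purely for illustration, not taken from any real case or study:

```python
# Illustrative only: hypothetical numbers, not from any real case.
# Shows why a technique that's "right 89% of the time" can still
# leave a reported "match" far from proof of identity.

def posterior_match_probability(prior, sensitivity, false_positive_rate):
    """Bayes' rule: P(suspect is the true source | examiner reports a match)."""
    p_match = sensitivity * prior + false_positive_rate * (1 - prior)
    return (sensitivity * prior) / p_match

# Suppose the true source is one of 1,000 plausible donors (prior = 0.001),
# the examiner correctly declares a match 89% of the time, and falsely
# declares one 11% of the time (treating the roundtable's figure loosely).
p = posterior_match_probability(prior=0.001, sensitivity=0.89,
                                false_positive_rate=0.11)
print(f"P(true source | match) = {p:.4f}")  # about 0.008, not 0.89
```

The point of the sketch: the 89% figure describes the test, not the suspect. Once a realistic pool of alternative donors enters the picture, the probability that a reported match identifies the true source collapses, which is exactly the confusion (the "prosecutor's fallacy") that overstated courtroom testimony exploits.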
The origins of such problems include unqualified or incompetent lab workers, personnel using false academic credentials, lab contamination that causes false test results, employees falsifying test results to “help the prosecution,” and lab examiners committing perjury. Contributing to these problems is a lack of qualification standards and industry-wide training requirements for lab workers.
One might think that such scandals are caused by a few bad apples in the crime lab barrel, which is the spin typically adopted by the labs themselves. That problem could be fixed by hiring qualified personnel, training them properly and providing adequate oversight. But at least the forensic science that underpins crime lab testing is sound and valid, right? In many cases, wrong.
A 2009 report by the National Academy of Sciences, the most prestigious scientific organization in the United States, revealed that much of the “science” used in crime labs lacks any form of peer review or validation – fundamental requirements for sound science. Such questionable forensic methods include long-established and accepted techniques such as fingerprint comparison, hair and fiber analysis, and bullet matching.
The PLN story walks through these topics in great detail. It's a particularly good starting point for those who haven't deeply considered these subjects.
I have over the years had the opportunity to work for a local defense attorney, where my background in instrumentation and error analysis came into play.
The thing that is just appalling when it comes to forensics, and more to the point STEM (Science, Technology, Engineering, & Mathematics), is the near-total ignorance of the legal profession as a whole. There are a couple of Federal Judges here in East Texas with solid STEM backgrounds, but these two judges are the exceptions. I would also note that these two are Federal judges, not state judges.
I've seen enough cases tried in court to also be appalled by how often judges ignore, or simply lack, the rules of logical inference, and logic in general, in courtroom proceedings.
What Texas is in DIRE need of are judges who are very well educated and who hold the cases and proceedings before their courts to high standards of conduct and rationality.
‘The Fishing Physicist’
"Most judges and attorneys ...don't understand enough about statistics to comprehend, much less judge, error-rate issues."
Hardly surprising. Judges are unable to communicate to juries what statistical weight is the approximate equivalent of the beyond-a-reasonable-doubt standard. Some academic studies have found that standard to be in the range of 70-75% in the minds of prospective jurors. That would mean a 25-30% error rate might be expected in guilty verdicts. In comparison, a 90% accuracy rate for a particular test seems quite good for one piece of evidence.
Maybe those numbers aren't good enough. But how good should they be?
7:15, 90% accuracy may be "quite good" for one piece of evidence, but when it's portrayed as a straight-up match it misrepresents what the science can actually say.
I'm not arguing the evidence should not be used, just that its probative value shouldn't be overstated, and that the tendency to overstate forensic results has led to false convictions.
18 USC § 3500 - Demands for production of statements and reports of witnesses
"After a witness called by the United States has testified on direct examination, the court shall, on motion of the defendant, order the United States to produce any statement (as hereinafter defined) of the witness in the possession of the United States which relates to the subject matter as to which the witness has testified. If the entire contents of any such statement relate to the subject matter of the testimony of the witness, the court shall order it to be delivered directly to the defendant for his examination and use."
Attention Defense Attorneys! Get the protocols and the data underlying expert witnesses' testimony. Scrutinize the information.
Expose the charlatans and the prosecutors willing to use them!
The above is from the Jencks Act (federal).
Does Texas state law have a similar provision? That is, can documents be obtained post-testimony?