Monday, June 16, 2008

Accrediting handwriting "experts" may not prevent wrongful convictions

Some improvements mandated by the Legislature at Texas crime labs apparently are taking a while to filter down into daily practices, to judge by this notice (doc) from the Department of Public Safety, found at the Texas prosecutors' website. DPS says many prosecutors still aren't aware that only handwriting experts from accredited labs can have their forensic analysis presented in court:
Texas law enforcement agencies and prosecuting attorneys need to be aware that anyone presenting court testimony in a criminal trial regarding handwriting examination must work in a laboratory accredited by the Department of Public Safety, under the auspices of ASCLD, Inc.

The Department further requires that the laboratory be accredited by a nationally recognized accrediting body. DPS maintains a list of the accredited laboratories on its web site at www.txdps.state.tx.us. Search under “accreditation” to locate this list.
The accreditation requirement resulted from bills adopted by the Texas Legislature in 2003 and 2005 (article 38.35 of the Code of Criminal Procedure and section 411.0205 of the Government Code). Since adoption of these laws, some confusion has existed over which forensic analyses require accreditation.

Forensic analysis of questioned documents (for example, printed documents and handwritten documents) must be performed in an accredited laboratory. Examinations not performed in an accredited laboratory do not meet evidence admissibility requirements of Texas law. (emphasis in original)
Only two crime labs in Texas are accredited to evaluate "questioned documents" - the Houston PD crime lab and the DPS crime lab in Austin. And since the Houston crime lab has a limited client base (to go along with a well-earned reputation for sloppy work), for most Texas agencies the Austin DPS lab is the only real-world option. But is mere accreditation enough when the basis of the craft itself has been called into question? The NYC Innocence Project points out that:
Scientific errors, fraud or limitations were a factor in 63% of the first 86 DNA exoneration cases, according to an August 2005 analysis of the cases published in Science magazine. These forensic science mishaps include everything from lab analysts who committed fraud to expert witnesses who relied on analyses of forensic disciplines which have never been adequately validated to identify a perpetrator such as: hair, bullets, handwriting, footprints, or bite marks. Using DNA – which provides a precise identification that other methods cannot – wrongful convictions were exposed years or even decades later.
Handwriting analysis unfortunately is one of the softer comparative forensic procedures - widely hailed by law enforcement advocates but enjoying little support from independent testing of field results. According to an academic analysis published earlier this year by Paul Giannelli of Case Western Law School:
forensic labs show unacceptably high error rates in certain fields. Several types of examinations caused concern: fiber, paint, glass, and body fluid comparisons resulted in incorrect results in more than ten percent of the cases. A review of five handwriting comparison proficiency tests in 1987 showed that, at best, “[d]ocument examiners were correct 57% of the time and incorrect 43% of the time.”
Even the requirement for using accredited labs seems insufficient when the entire field of handwriting analysis has been labeled "junk science" that produces staggeringly high error rates. Giannelli found proficiency testing for examiners in these comparative forensic fields to be highly suspect:
In addition, the rigor of some voluntary proficiency tests is suspect. For example, a fingerprint examiner from New Scotland Yard testified in one case that the FBI proficiency tests were deficient: “It’s not testing their ability. It doesn’t test their expertise. I mean I’ve set these tests to trainees and advanced technicians. And if I gave my experts these tests, they’d fall about laughing.” The district court agreed, noting that “the FBI examiners got very high proficiency grades, but the tests they took did not. . . . [O]n the present record I conclude that the proficiency tests are less demanding than they should be.” The FBI’s own report acknowledged this shortcoming. Similarly, in a trial involving handwriting comparisons, the court wrote:
There were aspects of Mr. Cawley’s testimony that undermined his credibility. Mr. Cawley testified that he achieved a 100% passage rate on the proficiency tests that he took and that all of his peers always passed their proficiency tests. Mr. Cawley said that his peers always agreed with each others’ results and always got it right. Peer review in such a “Lake Woebegone” environment is not meaningful.
If proficiency programs are not rigorous, they provide only an illusion of reliability. Indeed, by bestowing an undeserved imprimatur, they are affirmatively misleading.
In this "Lake Woebegone environment" where every forensic expert supposedly is above average, "experts" never disagree, and proficiency testing fails to screen out analysts with high error rates, handwriting analysis takes on the appearance of pseudoscience - an inherently subjective rather than analytical forensic approach.

This morning I contacted Wil Young who's in charge of Quality Assurance for DPS crime labs. He told me that handwriting experts from DPS accredited labs must take a proficiency test once per year, but acknowledged that DPS and the FBI use the same accrediting group, so any deficiencies identified by the federal judge above may apply equally to DPS and HPD handwriting experts.

With a few exceptions like the federal case mentioned above, most courts have so far upheld the use of handwriting analysis when performed by accredited forensic specialists. Indeed, the accreditation requirement only applies to criminal proceedings, said Young; civil cases and federal prosecutions don't require the use of accredited handwriting examiners. But the mood of the judiciary nationally is shifting as evidence begins to mount that some forensic science is entirely subjective with no basis in peer reviewed methodologies. As the American Bar Association Journal recognized in 2005:
many previously accepted forensic techniques--including handwriting analysis, hair comparisons, fingerprint examinations, tool mark identification, bite mark evidence and comparative bullet-lead analysis--have come under sharper scrutiny from courts. ...

“All of these fields are vulnerable,” Giannelli says, “because they’ve never done the kind of empirical research that would be demanded today under Daubert.”
The justice system remains fraught with faulty forensic science that has little or no basis in experimentation or the scientific method, and by all accounts handwriting analysis falls into that category. Error rates from 10-40% are too high to tolerate in an environment where convictions must be sustained beyond a reasonable doubt.

MORE: Here's a reference to a 2002 study purporting to support use of handwriting analysis, finding a 2-5% error rate when researchers had writing samples four pages long to compare. That still seems like a high error rate in practice. Identifying a forged check is a different matter, though: "Even in laboratory settings, there is no evidence they can do it."
