I just ran across a lengthy, excellent article on the sources and frequency of error in fingerprint forensics, published last month in the online magazine MillerMcCune.com: "Bias and the Big Fingerprint Dustup" (by Sue Russell, June 18). Here's how the story opens:
In 2004, cognitive neuroscientist Itiel Dror and Dave Charlton, a veteran fingerprint examiner and doctoral candidate, chatted over coffee in a Brighton hotel suite after a gala dinner at the U.K. Fingerprint Society's conference. Charlton was upset. Months earlier, Dror had designed a study to see if fingerprint examiners' decisions on matches might unconsciously be biased by information they received about a case.
Would examiners change their opinions about prints they'd called matches five years earlier, Dror wondered, if they viewed them again in a different context?
Charlton, supervisor of a U.K. police department's fingerprint lab, editor of the Fingerprint Society's journal Fingerprint Whorld and a true believer, was certain they would not. He urged Dror not to waste his time.
But Dror was insistent: "I said, 'Indulge me! Let's do it.'" And so, five international experts were put to the test covertly, re-examining matched prints from their own old cases while armed with different — and potentially biasing — "case information." They'd agreed to be tested, but they didn't know when — or even if — test prints would cross their desks.
That night in Brighton, the results were in. For Charlton, they were a jaw-dropper.
"Not only some, but most, of the fingerprint examiners changed their minds," said Dror, who was far less surprised by the flip-flopping. As an expert in human thought processes and the hidden power of cognitive bias — an array of distortions in the way humans perceive reality — he had a decided advantage.
Fingerprints have been accepted as unassailable evidence in courts for more than 100 years, but vaunted claims of their uniqueness and infallibility still lack scientific validation. By contrast, the existence of cognitive bias and the subjective nature of human judgment have been thoroughly established by hundreds of scientific studies going back decades.
What's more, the finding was replicated with different fingerprint examiners using different fact scenarios:
In another study, [Dror's] team had six international experts each view eight latent prints that they'd each previously examined, but this time accompanied by a new, mundane context — something like, "the suspect has confessed," or, "the suspect is in custody." More expert reversals followed. Four of the six reached different conclusions. One changed his mind three times.
Indeed, this bias shows up not just in experimental settings but occasionally in the real world:
In a landmark U.S. case, Stephan Cowans of Boston became the first person to be convicted on fingerprint evidence, then — after serving six years in prison for shooting a police officer — exonerated by DNA. Two prosecution experts and Cowans' two defense experts (formerly of the same fingerprint unit) had all verified the match. After his 2004 release, Cowans revealed his earlier certainty about fingerprints by saying that on the evidence presented in court, he would have voted to convict himself.
What can be done to reduce cognitive bias among fingerprint examiners and other forensic workers? Dror advocates removing forensics from the purview of law enforcement and severely limiting the amount of contextual information given to forensic workers about individual cases:
A key National Academy of Sciences report recommendation — to move crime labs out from under law enforcement's wing and create a new national institute of forensic sciences — would surely help impartiality. If lack of funding delays that, "so be it," Dror said. "But you can't have it both ways. If there's no reform, don't say, 'I am 100 percent objective, I make no mistakes, there is no problem.'"
In the interim, some steps can be taken. When further examiners are called on to verify the work of a first, they should always examine the evidence independently without knowing the earlier results.
Efficiency, scientific validity and objectivity could also be dramatically improved for a relatively small financial outlay by establishing and enforcing "best practices" in crime labs (another NAS report recommendation). Best practices are formally documented standard operating procedures, processes and rules for how to do the work, specifically designed to make it effective and efficient and to avoid error. Having best practices that all fingerprint examiners everywhere must adhere to would be a big step forward, Dror believes, but only if they are science-based and validated by experts in cognitive neuroscience, psychology and thought processes.
Today, guidelines are provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology. However, these are only guidelines, with no mechanism of enforcement. "What is more," Dror said, "none of the current guidelines really directly and adequately deals with confirmation bias and other cognitive issues." ...
He favors the immediate implementation of the practice of withholding all nonessential crime details from forensic scientists. Detectives, investigators, lawyers, judge and jury need to know if fingerprints are related to terrorism or bicycle theft, but for fingerprint examiners counting ridge characteristics, loops, whorls and other minutiae, such context is irrelevant.
"We're not going to send a fingerprint to Interpol if somebody stole a bike," Dror said. "But we need to make sure the fact that the examiner thinks it's a terrorist or the Madrid bomber doesn't cloud their judgment too much." To Dror, it's like his personal physician needing his medical history, while the lab technician counting his white blood cells for a blood test does not.
The issue of cognitive bias comes up again and again in discussions of forensic errors. While it seems like it should be the easiest problem to fix - managing who gets what information - the reaction to such suggestions from forensic workers tends to range from defensive to openly hostile, as this excellent Miller-McCune article demonstrates. In the wake of the National Academy of Sciences report, we're going to see methodologies in many forensic fields revamped in the near future, and removing sources of cognitive bias must surely be a key component of any such reforms.
13 comments:
Though I agree with the recognition of problems within many, if not all, crime labs, I suggest that the removal of labs from the police may not be the most appropriate way to address their inherent issues. If you accept that the labs are not capable of appropriately reviewing evidence in an unbiased manner, then how does one justify a crime scene technician being a police officer, for example? That bias is also found among those officers and can be expressed in what does and doesn't get processed as evidence, the photo techniques used, controlling the crime scene, etc.
Further, if a police officer is involved in any manner, the same problems are also present, as the investigator may be investigating one of his former associates or may be attempting to address PR concerns related to his department. In other words, if you accept that there is a bias that can only be addressed by removing the labs from the police, why limit "fixing" the system to only removing the labs? Why not remove everything involved in the investigative process? The same argument applies. Again, the argument is that the police are either too biased (the main point of your blog post) or too corrupt to handle the investigation of a crime - any crime. For the use of the lab is an investigative process. In some departments, for example, street officers - crime scene techs - actually do the fingerprint processing and firearms analysis, because sending it out to an outside lab means the work will take forever, if it's done at all, due to higher-priority lab work (such as another department's homicide) demanding precedence.
What I would suggest is that the labs, as you mentioned, be held accountable to appropriate standards (including limiting access to the investigator's hypotheses), with outside audits (not by other crime lab teams, either) and a recognition of the funding requirements needed for running a professional science laboratory. They also need to be in a separate chain of command reporting directly to the command staff.
The real issue, as is almost always the case in poor policing, is a lack of competent management. That is where the focus really needs to be put. Thanks.
"the argument is that the police are either too biased (the main point of your blog post) or too corrupt to handle the investigation of a crime - any crime."
Not the point of my post at all - you're projecting your own biases, not arguing against anything I've written.
I think the distinction made about the doctor vs. the lab tech doing the white blood cell count was a good way to think about drawing the lines. Obviously police detectives need access to all the information to put the pieces together, but technical analysis of evidence needs to be segregated from that contextual knowledge.
Maybe it's possible to create some sort of Chinese wall and still keep the techs officially under law enforcement, but they shouldn't continue to receive case information when they're supposed to be making technical or scientific evaluations that don't require that data.
Grits - any toes left out there that you've missed lately? lol
Plato
I'm by no means any kind of intellectual, but I don't think you need to be one to understand these situations. Leaving the police to "police" themselves is no better than letting the inmates run the prison. By letting biased investigators run the show, you taint the whole investigation. And we all know that the validity of biased evidence is no better than using a crystal ball and calling that unbiased, scientific evidence...andy
Hmm, cognitive bias. That sounds like the problem with this liberal blog.
I hesitate to respond as I'm not really in the mood for a dose of tomfoolery; but what does "liberal" have to do with the story regarding fingerprint evaluations?
If cognitive bias is a problem in identifying fingerprints, it appears the ironclad status of fingerprint evidence is questionable. Cognitive bias can only mean fingerprint analysis is very subjective. The DNA vs. fingerprint conviction reversals prove the point! Also, I remember a case involving a lawyer in Washington state being jailed by the Feds for taking part in the Madrid bombing based on fingerprint evidence. After the lawyer spent some time in jail, it was decided the print was not a match after all. I find it interesting how things we believed were rock solid turn out to be false. New information or real-life experience shows how the beliefs we grew up with turn out to be false, or at least not so rock solid. This life seems to be a learn-as-you-go process.
You people are idiots. You come up with one example of a mismatch on fingerprints, and that makes a system bad. That system has successfully matched fingerprints for decades, connecting literally millions of fingerprints to the right person.
You are just naysaying liberal wonkers who have to bitch at something to make your life worthwhile. Why does a blog have to be directed at only taking potshots? When do you people recognize good work by law enforcement? Grow up.
"One example," 8:37? Apparently you didn't actually read the article. Also, it's a little ironic for you to engage in anonymous ad hominem name calling and then complain of others taking "potshots": Pot, meet kettle ... I think you'll find you have a lot in common.
Mark#1: Your first instinct was correct - don't feed the trolls.
The American jury system is unique. John Adams described it as "the heart and lungs of liberty." Our legal process entrusts the most difficult disputes to a group of people who are strangers to one another and strangers to the parties to the lawsuit. We submit crucial questions involving property, liberty and life itself to a random slice of the community's population. We call that group a "jury."
This is from a discussion of jury selection by Ray Moss on his web page - http://criminaldefense.homestead.com/JurySelection.html
If the courts allow both sides to try to eliminate bias on the jury, it would seem to me the least that could be done in the process is to ensure that the professionals presenting the case - police, labs and prosecutors - set aside personal bias and present the most accurate data to be considered. Since the question of requiring technicians to be present for cross-examination is still up in the air, this article raises the question of when lab results are to be taken as gospel and unquestionable.
Replicable studies that can confirm test results often bring about changes and corrections that do indeed make our system "the best" as we like to think.
Grits,
Though I (Anon 1:38) disagree that I'm projecting (and was not claiming you were using corruption as a basis of your argument - just that others tend to use it in a similar context), I do concur, though I obviously failed to express it appropriately, with the idea of your Chinese Wall, and do believe that labs, even those managed by police departments, need to have a stand-alone chain of command as much as possible and limited contact between investigator and tech. The difficulty is, at times, in the details. For example, as I stated originally, what happens when it's the evidence technician (a senior police officer) who is both collecting and processing fingerprint evidence? S/he may be the one actually doing the entire investigation. That cognitive bias is still in play. Happens at small departments all the time. Check fraud is a classic example: the officer doing the investigation may be the same one doing the evidence processing. Obviously this doesn't apply to DNA, etc., but some "lab" procedures are actually handled by investigating officers in some departments. With funds, you can build the Wall with more people doing specific and discrete jobs, but without the personnel, especially in a small department, it isn't going to happen. Thanks again.
Anon 8:37: No disrespect, but there has been more than one example; see Anon 7:49. As for lab screw-ups, just look at the failure of HPD's lab. It was so messed up they were looking at indicting some cops - too bad an HPD officer was serving on one of the grand juries - it's a little hard to go back to work after you've indicted your boss! There have been decades of cases where they were successful at convicting someone; we just aren't sure it was the correct someone! There have been decades of cases where eyewitnesses helped convict, only to find out afterwards that they also screwed up. Line-up procedures have been used for decades, only to find out that the presentation was biasing the answer of the witness. Sorry, but Grits is, as usual, on the mark. As far as good police work goes, ever hear of the business practice of "continuous improvement"? Since historically most departments will not change their procedures except under pressure from the outside, there is no improvement without those damn liberals.
That's a basic concept of liberalism - challenging the status quo - while defending the status quo is an inherently conservative concept.
There is a similar article in the August Popular Mechanics on forensics in general. Not only fingerprints, but most other forensics have a bad record of reliability. Also, an article in the July/August Discover magazine on memory - and how it changes - questions just about all eyewitness trial statements as unreliable. Please read these articles and comment. They are real eye-openers.
Remove everything involved in the investigative process from law enforcement. Anonymous said it. Sounds good to me.