Wednesday, December 27, 2017

Interview: Peter Neufeld, co-founder of the national Innocence Project, on prospects for state and national forensic-science reform

In the December episode of the Reasonably Suspicious podcast, we published an excerpt from an interview Grits conducted with national Innocence Project co-founder Peter Neufeld. We mainly discussed forensic-science topics, including the abolition of the National Commission on Forensic Science, of which he was a member, and controversies over DNA mixture evidence. You can listen to the full interview here.


Find a transcript of our conversation below the jump.

Transcript: National Innocence Project Co-founder Peter Neufeld interviewed by Scott Henson, Policy Director at Just Liberty

Scott Henson: I wanted to start ... in your speech to the Texas Defender Service luncheon, you had talked about Texas having, really, I guess for the nation, sort of a surprising role in some of these innocence and forensics issues, and in particular about our Forensic Science Commission and our junk-science writ. I feel like Texans almost take these things for granted now. Can you give us, from your perspective, what that looks like from New York and from the national view?

Peter Neufeld: Sure. So, nationally, during the Obama administration, there was a major effort to look at the forensic issues in particular, and a recognition that we needed a national, federal solution, because, quite honestly, if you have some piece of evidence tested in a laboratory in Houston or in Buffalo, New York, you should get the same results, just as you would in a clinical test. If you're sending in a sample of your kid's saliva to see whether or not she has some kind of disease, you'll get the same results from both labs; well, the same thing should apply in crime laboratories.

And we got a lot of good things passed and introduced: a national forensic science commission that I was appointed to by the President, a standard-setting body, and efforts to review the way FBI agents and other federal agents testify about forensic disciplines, so that it's consistent with scientific principles. That became particularly relevant after they found that in 96% of the hair cases, FBI agents gave erroneous testimony that exceeded the limits of science. All of this was moving along on four different fronts.

Then of course, we had the last election, and Jeff Sessions became the Attorney General. Last April, he abolished the commission, he abolished the effort to standardize the language with outside input from statisticians and scientists, he suspended the review of the way agents testify in all the other forensic disciplines, and basically he brought to a screeching halt any effort by the federal government to enhance the quality of forensic science in the criminal justice system.

The responsibility fell much more to the states, and in that regard, Texas is leading the country in two ways. Texas has a Forensic Science Commission, which is outstanding, which has lots of stakeholders involved. They've put petty differences aside, and they all have one thing in common: they want to see only the best forensic disciplines used in cases where life and liberty are at stake. And it's remarkable. New York had a commission before Texas, and it's an awful commission. It's a commission completely dominated by law enforcement and prosecutorial interests, and truth and principles take a back seat. So Texas should be applauded for that.

Number two, Texas got the so-called junk science statute passed, which allows people to bring a writ to throw out an old conviction that was based on what we now know to be discredited forensic science. It's very important because science moves much more rapidly than law. So many of these disciplines are disreputable. So many of these disciplines have never been validated, have never been determined to be empirically reliable, but nevertheless, if a judge lets it in, that's the end of the review.

Scott Henson: And just real quick, to expand on that, some of these disciplines are some of the most common ones used in law enforcement, from matching ballistics to fingerprinting, all these things that are basically pattern recognition, someone looking at it closely. Now that we don't have the national forensic science commission, where do we go from here? Because after the 2009 National Academy of Sciences report, the flaws in forensics had really been exposed, and the commission that you were a part of was created to say, "Okay, what do we do now that we know that all these things aren't really that scientific?" It seems like that's been left up in the air, and the path to figure out what needs to happen next has vanished, at least at the national level.

Peter Neufeld: Oh, it's certainly vanished at the national level. You had a commission which, for the very first time, had, in addition to stakeholders, half a dozen world-class scientists who were not involved in forensics but are leading physicists, chemists, biologists, and neuroscientists, all playing an active role. And of course, with the end of the commission, the Justice Department has determined that it no longer wants independent scientists to give it any input.

So you've replaced a body of 35 people, including half a dozen independent scientists, with a forensic tsar who was a deputy district attorney from the Midwest, who's not a scientist, and he and other people inside Justice will unilaterally make all the decisions about forensics for the future.

Scott Henson: Wow. Well, one guy huh? Who is this? What's his name? Yeah, it's okay if you don't know.

Peter Neufeld: Alright. I can't remember at the moment.

Scott Henson: That's fine. If it's going to be one guy, I thought we should know.

I wanted to ask you about a controversy that's been arising here in Texas, and I know in New York y'all have had a similar debate surrounding the medical examiner there, and that is around DNA mixture evidence. For years, when I was at the Innocence Project of Texas, I said it a hundred times if I said it once: DNA is the gold standard. We may almost have created this problem, in some ways, by using that phrase over and over. It turns out DNA's the gold standard for one-to-one matches. For these mixtures it's a lot more questionable and dubious, and in fact we're now finding that a lot of the mathematics underlying how these analyses were done was just flawed and is having to be revisited.

Peter Neufeld: What you have to do, just to go back and give it some context, is recognize that all pattern and impression evidence, and that includes DNA, is only probative if you have reliable statistics to back it up. In other words, it's not enough just to say that you have a DNA match. Let's just say you only saw two markers out of 13, 'cause it was too small a sample or degraded; well, the frequency of those two markers might be one in 20, but people think with DNA it's one in a billion, one in a trillion. If it's 13 markers, it is one in a trillion. So it's the statistics that count.
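For readers who want to see the arithmetic Neufeld is gesturing at spelled out, here is a minimal sketch in Python. The per-marker frequencies are made-up round numbers, not real population-genetics figures; the point is only that rarity multiplies as markers accumulate, which is why a two-marker partial profile is far less probative than a full 13-marker profile, and why the statistic, not the word "match," carries the weight.

    # A minimal sketch of the "product rule" arithmetic described above.
    # The per-marker frequencies are hypothetical round numbers chosen for
    # illustration, not real population-genetics figures.
    from math import prod

    # Assume, for illustration, each marker's observed type occurs in about 1 in 10 people.
    per_marker_frequency = [0.1] * 13

    def random_match_probability(frequencies):
        """Multiply per-marker frequencies to estimate how rare the combined profile is."""
        return prod(frequencies)

    # A degraded sample showing only two markers: about 1 in 100 under these assumptions.
    print(random_match_probability(per_marker_frequency[:2]))

    # A full 13-marker profile: about 1 in 10 trillion under the same assumptions.
    print(random_match_probability(per_marker_frequency))

The specific figures Neufeld cites (one in 20 for two markers, one in a trillion for thirteen) depend on the actual frequencies at each marker in the relevant database; the sketch just shows why the numbers fall so fast as markers are added.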

The problem with all those other pattern and impression disciplines, whether it's tool marks, ballistics, hair, or bite marks, is that there were no statistics. There were no databases. We always had a database for a single-sample DNA match; we know the frequency of that profile. The problem is, if, let's say, you have a gun that's been handled by four or five people and you just see a bunch of DNA markers, how do you know which markers belong to which person's profile? You don't. Different companies have invented, through different algorithms, statistical models, which build in a great many assumptions, to give law enforcement a sense of the probative value. The problem is that none of those algorithms have been properly validated by any kind of independent entity.

Number two, the algorithms themselves are confidential. They're trademarked, they're patented, and so defense attorneys, or even the court, can't see the underlying data. Finally, for the same reason that we needed the National Institute of Standards and Technology to look at these other forensic disciplines ... They announced three months ago that they're going to do a review of the validity of these different mixture algorithms. That'll be the very first time. They're going to start by having each of the different companies apply their algorithms to a bunch of different specimens of mixtures that'll be created by the National Institute, by NIST. Then we'll see how well they do. They're going to be scored.

Until we have that kind of scoring, we, as a country, have no reason to feel comfortable and assured that the data is reliable. Before you make decisions about life and liberty, you better be damned sure.
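To give a rough picture of the kind of calculation the mixture programs Neufeld describes are automating, here is a toy single-marker likelihood-ratio sketch in Python. The allele frequencies, the observed alleles, and the suspect genotype are all invented for illustration, and real probabilistic-genotyping software also models peak heights, allele drop-out and drop-in, and more than two contributors; this only shows the basic idea of weighing the evidence under "suspect plus one unknown" against "two unknowns."

    # A toy, single-marker illustration of the likelihood-ratio idea behind
    # probabilistic-genotyping software. All numbers are invented; real programs
    # also model peak heights, drop-out, drop-in, and more contributors.
    from itertools import combinations_with_replacement

    freq = {"A": 0.1, "B": 0.2, "C": 0.3, "D": 0.4}   # hypothetical allele frequencies
    observed = {"A", "B", "C"}                         # alleles detected in the mixed stain
    suspect = ("A", "B")                               # hypothetical suspect genotype at this marker

    def genotype_prob(g):
        """Hardy-Weinberg probability of a genotype given the allele frequencies."""
        a, b = g
        return freq[a] ** 2 if a == b else 2 * freq[a] * freq[b]

    genotypes = list(combinations_with_replacement(sorted(freq), 2))

    def explains(g1, g2):
        """Do two contributors' genotypes jointly account for exactly the observed alleles?"""
        return set(g1) | set(g2) == observed

    # P(evidence | suspect plus one unknown contributor)
    p_prosecution = sum(genotype_prob(g) for g in genotypes if explains(suspect, g))

    # P(evidence | two unknown contributors)
    p_defense = sum(genotype_prob(g1) * genotype_prob(g2)
                    for g1 in genotypes for g2 in genotypes if explains(g1, g2))

    print("Likelihood ratio:", p_prosecution / p_defense)

Different programs make different modeling assumptions at each of these steps, which is one way two of them can reach different answers on the same data, and why the independent NIST scoring exercise Neufeld mentions matters.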

Scott Henson: Right, and in the case of those companies, there was one instance, in upstate New York, where two of them actually came to opposite conclusions from the same data. Then there's an instance here in Texas ... And this gets me to my final question ... Over the summer we had a case here in Texas where several different mixture-analysis methods were used on the same data. One of them excluded the suspect, one of them said no, it's the suspect, and the judge basically let everything in and said, "Okay, we're going to treat this like competing witnesses; just believe who you want to believe." Talk to me a little about how we can get judges to improve that gatekeeping function in areas that are so technically obscure that none of us can really understand the math behind them, much less the average sitting judge.

Peter Neufeld: Judges have a gatekeeping duty in state court and federal court. They are expected to permit only evidence which is reliable to go before the jury. The problem is that the judges are scientifically illiterate. The lawyers who are advocating pro and con are scientifically illiterate. Ultimately, the twelve jurors who weigh the evidence are scientifically illiterate. So the responsibility falls on the judges to see whether or not the proponent of the evidence has demonstrated the validity and reliability of whatever it is that's being proposed for the court.

The judges have just abdicated responsibility completely across the country. I don't know if it's because it's science and they don't want to think about it, or because they have a certain cognitive bias that just doesn't want to see a defendant acquitted and so will let any of the evidence in. But there's no question that they are abdicating their legal and professional responsibility by not requiring the proponent of this evidence to demonstrate persuasively that the algorithms, or whatever they want to introduce, have been demonstrated to be reliable and valid. They haven't done it, and the judges are shirking their responsibility.

Scott Henson: Alright. Well thank you very much for talking with me. I appreciate it.

Peter Neufeld: Okay, was that basically covering it?

Scott Henson: All good.
