In the case out of the 12th Court of Appeals, crime-lab analysts originally declared that the sample positively identified the defendant. But they had used flawed mathematical methods that experts had debunked in 2015. When they corrected the math, applying a "stochastic threshold" to the data, the defendant was no longer implicated. Then prosecutors introduced results from a proprietary, third-party DNA mixture test (a black-box method using statistical models to which defense experts don't have access), and that method once again pointed to the defendant.
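For readers unfamiliar with the term, the "stochastic threshold" idea can be sketched roughly as follows (a simplified illustration with made-up numbers, not any lab's actual protocol): peaks below a validated height are too weak to rule out allele drop-out, so they are excluded from the statistical comparison rather than treated as reliable detections.

```python
# Loose illustration of a stochastic threshold. All values are hypothetical;
# real labs validate their own cutoffs per instrument and kit.

STOCHASTIC_THRESHOLD_RFU = 150  # illustrative cutoff, in relative fluorescence units

def usable_alleles(peaks):
    """Keep only peaks tall enough that allele drop-out can be ruled out.

    peaks maps allele label -> peak height in RFU.
    """
    return {allele: height for allele, height in peaks.items()
            if height >= STOCHASTIC_THRESHOLD_RFU}

mixture = {"16": 900, "17": 420, "19": 95, "22": 60}
print(usable_alleles(mixture))  # {'16': 900, '17': 420}
```

Applying a correction like this is how previously reported "matches" can evaporate: alleles that once contributed to an inclusion fall below the threshold and drop out of the calculation.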
To make matters more confusing: There are two competing companies with proprietary black-box methods for analyzing DNA mixtures, and in one case out of upstate New York they came to opposite conclusions. While DNA matches involving a single suspect are considered the gold standard of forensic evidence, when DNA from multiple people is mixed there can be much more uncertainty about the result.
A panel of forensic experts convened by the White House under the Obama Administration called DNA mixture analysis "subjective" and "not yet ... sufficiently and appropriately validated." According to an addendum to their report, "the available literature supported the validity and reliability of [probabilistic genotyping] for samples with three contributors where the person of interest comprises at least 20% of the sample," but not when there are more than three contributors or the person of interest comprises a smaller portion of the sample.
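The addendum's stated envelope can be restated as a simple check (a simplification for illustration; the report's caveats are more nuanced than two numbers):

```python
# Restates the validation envelope quoted above: probabilistic genotyping was
# supported for samples with up to three contributors where the person of
# interest comprises at least 20% of the sample. Illustrative only.

def within_validated_range(n_contributors: int, poi_fraction: float) -> bool:
    """True if a mixture falls inside the envelope the addendum described."""
    return n_contributors <= 3 and poi_fraction >= 0.20

print(within_validated_range(3, 0.25))  # inside the envelope
print(within_validated_range(4, 0.40))  # too many contributors
print(within_validated_range(2, 0.10))  # person of interest is too small a fraction
```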
The case out of the 12th Court of Appeals, however, did not reach those questions. It turned on a sufficiency-of-the-evidence challenge, not on the admissibility of any particular DNA mixture analysis method, Marzullo concluded. So Texas still lacks a formal appellate green light for admitting these disputed methods, though in cases such as this one they may get into evidence without being challenged by the defense.
See a transcript of our conversation after the jump.
Transcript: "Top Stories" segment, August 2017 Reasonably Suspicious podcast from Just Liberty. Speakers: Just Liberty Policy Director Scott Henson and Texas Defender Service Executive Director Amanda Marzullo
Alright, next up. Two years ago the Texas Forensic Science Commission launched an investigation into forensics surrounding DNA mixtures, finding that most Texas agencies, including DPS, were using invalid mathematical techniques to analyze DNA mixture evidence. Correcting the math, by applying something called a stochastic threshold to the data, resulted in many instances where previous DNA matches were suddenly called into question. Now, though, the 12th Court of Appeals out of Tyler has upheld a conviction where the prosecution used a third-party analysis of DNA mixture evidence, even though its analytical methods are secret and can't be vetted by defense experts.
In that case before the 12th Court, this black-box method reversed a negative result from the stochastic method. So DNA forensic analysts produced three different results in this case from the same data: the admittedly flawed method first identified the defendant; correcting the math of the first method meant the defendant couldn't be identified; and finally, the proprietary black-box method fingered the defendant again.
So, Mandy, what's your take on this confusing array of forensic results?
Mandy Marzullo: Well, I think, at the end of the day, this case just didn't resolve the issue. The issue on appeal was whether the state had proved its case, specifically, whether there was sufficient evidence to convict the defendant of the crime. The 12th Court of Appeals affirmed, but it took a lot of different evidence into account. When it came to the DNA evidence, the court analogized the conflicting tests to conflicting accounts from witnesses and said it's within the purview of the jury to look at the different tests in front of it and decide which one it deems credible.
Scott Henson: So, the DNA testing company, STRMIX (pronounced “Star-Mix”), was actually touting this in the press as "Texas courts say this is admissible, this is okay". Well, it was admitted, but this wasn't really a ruling on admissibility.
Mandy Marzullo: Exactly. The defense did not appeal the ruling by the trial court and we don't even know if this was challenged at the trial level that admitted this evidence or admitted this testing into evidence. That's another issue and that's something that the defense really should be litigating, is whether STRMIX or either of these-
Scott Henson: Probabilistic genotyping.
Mandy Marzullo: -probabilistic genotyping is admissible. Is it hearsay? Are there confrontation clause issues that we need to take into account? And that's just not a piece of the puzzle here.
Scott Henson: The other piece that's strange to me about the probabilistic genotyping method is that, unlike the stochastic threshold and most other mathematical methods that you use in court, the results are not replicable. They use something called a Monte Carlo analysis where they do thousands and thousands of tests and hypotheticals and come up with a result based on that, but you actually get a different result every time, so an expert analyzing the same data won't come back with the same result. I'm not sure that makes it the best tool for courts to use.
Mandy Marzullo: It gives rise to another challenge, right? Because courts, as gatekeepers, are only supposed to admit evidence that reliably bears on the question "Who did this? Is this indicative of anything?" So with a testing protocol that issues a different result every time, there are reasons to believe it isn't reliable, and that's yet another possible challenge to this.
Scott Henson: And, in this case, the two main companies doing this type of testing are called TrueAllele and STRMIX, the latter being the one used here in Texas. There was actually a case in upstate New York where the two methods came out with contradictory results, and I think a lot of this is still up in the air. The President's Council of Advisors on Science and Technology under President Obama actually came out and said that while this is promising technology, it isn't ripe yet and hasn't been validated. So I think maybe you're right: the admissibility challenge is something that, maybe, needs to happen soon.
Mandy Marzullo: We'll see.
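Scott's replicability point above can be sketched in a few lines (a toy illustration only, assuming nothing about any vendor's actual model): a Monte Carlo estimator averages random draws, so two runs on identical input come back slightly different unless the random seed is pinned.

```python
import random

# Toy Monte Carlo likelihood-ratio estimator. The "true" ratio here is an
# arbitrary 2.0 plus Gaussian sampling noise; this is a stand-in for sampling
# and scoring candidate genotype combinations, not any real product's math.

def monte_carlo_lr_estimate(n_samples, rng):
    """Average many noisy per-sample scores; converges only approximately."""
    total = 0.0
    for _ in range(n_samples):
        total += 2.0 + rng.gauss(0.0, 1.0)
    return total / n_samples

run1 = monte_carlo_lr_estimate(10_000, random.Random())
run2 = monte_carlo_lr_estimate(10_000, random.Random())
print(run1, run2)   # close, but almost never identical

# Pinning the seed restores exact replicability:
seeded1 = monte_carlo_lr_estimate(10_000, random.Random(42))
seeded2 = monte_carlo_lr_estimate(10_000, random.Random(42))
print(seeded1 == seeded2)  # True
```

Seeding makes the toy estimator replicable, but a proprietary tool may not expose or document how it seeds its draws, which is part of why an expert re-running the same data may not reproduce the reported figure exactly.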
11 comments:
Isn't it funny that the DNA profile happened to match that of the Defendant AFTER the Defendant's standard arrived at the lab? Go figure.
And what about the other 2 unknown DNA profiles also discovered on the sock? Isn't it possible that 1 of the 2 (or both) unknown DNA contributors were the actual perpetrators? Were the unknown DNA profiles run through CODIS? Or maybe these additional profiles were a result of sloppy lab contaminations? Were employee DNA profiles used for comparison to exclude lab personnel as contributors to the unknown DNA profiles? How did the investigators make the sudden jump from "no suspects" to "the Appellant did it"? The Prosecutor put on the blinders after they got their man.
"The new software can reliably consider all the available data."
Based on what empirical evidence? Was there a validation study conducted at the crime lab to qualify this statement? What if the evidence was tested a 5th, 6th, or 7th time? Would they get the same results? But again, they repeated the DNA testing over and over until they got the results they wanted, then stopped. My Magic 8-Ball can do the same things.
"[Victim] Massey was unable to provide much information about her assailant other than that he was a nice looking, clean cut black male, between five feet seven inches and six feet in height."
And yet...
"Appellant closely resembled the composite sketch created from the description Massey provided within one week of the assault."
So the victim couldn't remember exactly what the attacker looked like, but the sketch artist miraculously came up with a drawing that looks like the Defendant. Hmmm.
"We give deference to the jury’s responsibility to fairly resolve conflicts in the testimony, to weigh the evidence, and to draw reasonable inferences from basic facts to ultimate facts... The inconsistencies as to Massey’s assailant’s height and clothing were all before the jury, who were the judges of the facts, the credibility of the witnesses, and the weight to be given their testimony."
So the jury is given this incredible responsibility, but were they provided with the time and resources to fulfill this responsibility? Unlikely.
There are some seriously lazy "gatekeepers" in this trial.
I am not surprised that the prosecutors in the case in question went shopping for a test that would include the defendant in the DNA on the sock. These companies earn their money by pleasing the prosecutors. Of course the test will be positive against the defendant.
That's probably an unfair charge of "shopping," Steven, just because of the timing: these new methods were just coming online while this case was being processed. When I was at the Innocence Project of Texas and tracking this stuff more closely back in 2015, we were told to expect these sorts of contradictory results, going both directions. However, it does present that outward appearance, and the jury has NO means to judge which forensic test is superior.
@5:38, there were apparently NO gatekeepers at the trial. Or rather, the judge appears to have seen the job as holding the gate open.
When the state's lab generates an analysis report, it is common for the defense to hire its own expert to review that work and offer interpretations/opinions that may be at odds with the state's expert. In this sort of standard situation, the court will not act in a gate-keeper role to exclude the interpretation of one side. The jury will hear both interpretations, and hear the arguments made by the experts for each interpretation. And then the jury will decide what are the facts - i.e., which of the interpretations is true.
I'm not seeing much difference between this standard situation and the situation that occurred in the case under discussion. In the latter, the laboratory performed multiple analyses due to changing technologies. This seems like a reasonable expectation of its customer, which would certainly want, if possible, to go to trial with results reflecting the most current methodology. The prosecution preferred the latest analysis. The defense preferred the penultimate analysis. Each side was presumably able to argue its side. And then the jury decided the facts.
The prosecution couldn't simply suppress the earlier reports. They had a responsibility under Michael Morton to make all of the reports available to the defense. And the court could not prevent the defense from bringing the earlier report in. It was beneficial to the defense.
@10:33, I'd suggest the difference is that DNA is treated as gold-standard forensics, while in truth these DNA mixture analyses are not: they are subjective interpretations where the science isn't NEARLY as settled as a one-on-one match or a two-person mixture where the victim's DNA is known, etc. In fact, if there are more than three people in the mixture or the target doesn't make up at least 20 percent of the sample, the final method used hasn't been independently validated.
So the question becomes, until these issues are resolved, is it probative to be admitting DNA mixture evidence at all when the results are contradictory and confusing even for experts? Is it EVER a gatekeeper's role to CLOSE THE GATE when unsettled science contributes to more confusion than clarity for jurors? Or do we just let it all in and leave it to the jury to decide questions upon which the nation's top scientists can't agree, like what's the best approach to interpreting DNA mixtures?
This case didn't reach those issues of admissibility, even if clearly all sorts of stuff is being admitted. (The Rules of Evidence need a new motto in Texas: "Throw it all against the wall and see what sticks!")
On the upside, all this presages a lot of future work for our junk science writ to correct errors on the back end. No risk that thing will become an anachronism. You can watch the courts in cases like this one creating future demand for it.
GFB-
"That's probably unfair charge of "shopping"...these new methods were just coming online while this case was being processed..."
If it's not "shopping", then the Prosecution and Crime Lab should have no objection to re-analyzing all DNA data for, say, the past 20 years using the new methodology -- both adjudicated and non-adjudicated cases. If it's the next best thing to sliced bread, then put it to work. Surely there's DNA evidence sitting in the lab storage that didn't give a hit on CODIS from the first analysis. Maybe the second, third, or fourth go-around will give a usable profile. And there's probably a great number of wrongfully convicted individuals that would like to have their DNA evidence re-examined -- that is, if it exists, if they know it exists, if they know that the newer technology is more "reliable".
The problem with advances in forensic sciences is that the forensic analysts (rather, lab management) don't feel the need to re-examine old analyses because they're infallible, right? They're positive they got the correct answer the first time, right? But if a "new" technology is better/more reliable than the "old" technology, the crime labs should be obligated to reassess the previous unreliable DNA analyses. We want to get the correct perp, no?
However, I recall from a similar situation, fire science had progressed 12 years (from 1992 to 2004) but Willingham was still executed. So, I'm not placing any bets on the Prosecutors or crime labs to do the right (and scientific) thing anytime soon (unless it can result in a conviction).
Just so we are all on the same page here, laboratories cannot simply retest evidence in cases post-conviction. After a conviction, the evidence "belongs" to both the prosecution and defense, and subsequent testing will generally require a court order, or at least the written agreement of the prosecutor and the defendant's attorney.
If ordered by the court or requested by prosecution and defense, laboratories will be quite willing without objection to reexamine the evidence in a case.
Actually, while they didn't retest evidence, DPS did do recalculations on many cases to apply the stochastic threshold to prior bad calculations. I left IPOT after that process started and don't know where it's currently at, but they did begin it.
The labs I am aware of are doing the recalculations as needed for trials, or as requested (by either prosecution or defense).
In 10 years we will still be hearing about cold cases that finally go to trial because the lab removed its head from its ass long enough to re-calculate the stats properly. Undoubtedly, under public criticism the labs will blame the lawyers and courts for not requesting the re-calc, and the lawyers will blame the labs for not taking the scientific initiative to get the re-calcs done faster.
That's just too bad for rape victims. That's just too bad for the wrongly convicted.