It's understandable that they feared their study would undermine the work of practitioners analyzing DNA mixture evidence. Check this out:
Researchers from the National Institute of Standards and Technology gave the same DNA mixture to about 105 American crime laboratories and three Canadian labs and asked them to compare it with DNA from three suspects in a mock bank robbery.
The first two suspects’ DNA was part of the mixture, and most labs correctly matched their DNA to the evidence. However, 74 labs wrongly said the sample included DNA evidence from the third suspect, an “innocent person” who should have been cleared of the hypothetical felony. ...
One shocking result from the new N.I.S.T. study is that labs analyzing the same evidence calculated vastly different statistics. Among the 108 crime labs in the study, the match statistics varied over 100 trillion-fold. That’s like the difference between soda change and the United States’ gross domestic product. These statistics are important because juries use them to consider whether a DNA match is just coincidence. In other words, more than two-thirds of crime laboratories analyzing the evidence would have falsely accused an innocent person.
Equally disturbing, the authors realized their findings were explosive and so attempted to scuttle or downplay them. "If some of us had not complained publicly, it may not ever have been published," noted Greg Hampikian, the Boise State biologist whose Times op-ed highlighted the study.
Moreover, "Neither the paper’s title nor the abstract mention the shocking findings. And the paper contains an amazing number of disclaimers," including one "apparently intended to block courtroom use."
Hampikian says there have been at least five DNA exonerations based on flawed DNA mixture evidence: "our Boise State University laboratory has re-examined a few select cases and already persuaded courts to overturn a conviction in New Mexico, two in Indiana and two in Montana. We have also helped identify a new suspect in a 23-year-old murder."
MORE: From Forbes, "Framed by our own cells: How DNA evidence imprisons the innocent."
- A reluctant scoop: Changing interpretations of DNA mixtures vex legal system
- Labs must correct wrong DNA mixture analyses, learn when to recognize 'crap'
- Resources on DNA mixtures
- The challenge of notifying defendants in large-scale forensic error cases
- Courts punt on forensics surrounding DNA mixtures
- News flash: 'Touch DNA' doesn't necessarily require touching
- DNA mixture SNAFU a mess but don't expect 'deluge' of innocence claims
17 comments:
I've never understood why Federal 'scientists' are assumed to be better than State, County, or City 'scientists'. Anyone could have designed, carried out, and published that study. Very few scientific papers come out of Texas crime labs.
This is not news in Texas. We have known about the NIST MIX studies for years. The Texas FSC has discussed them and staff has presented on them at conferences, to attorneys in training, etc. It is one of the reasons Texas has engaged in the massive undertaking of recalculating mixture cases. The rest of the country, not so sure....
@Lynn Garcia-
And this is why you let the APD DNA Unit rot from the inside for an additional 6 years after the FSC was alerted that lab management didn't know squat about DNA statistics.
Your inaction, and that of the FSC, cost the taxpayers upwards of $14 million, an enormous delay in justice for several hundred defendants, an untold number of cases that never proceeded to prosecution, and at least 5 forensic analysts lost their jobs when the APD Unit was shut down in 2016.
But not all is lost since your cronies at the FSC, after shutting down the APD lab, received all the lucrative contracts to do the DNA work that was shuffled away from Austin. If no one has explained "Conflict of Interest" to you, Lynn, this is an example.
The academic institutes have known about this problem for years. The forensics community, not so sure they care...
@Lynn, I was at a lot of those FSC events and I don't recall anyone saying that 2/3 of the labs accused an innocent person in a controlled test. I reported on the critiques that were made, and they certainly presaged this possibility. But this study makes the full extent of the problem much more clear, IMO.
You wouldn't get much worse results from bite marks, this result seems really bad.
@grits, the Commission has discussed MIX13 and its implications (and continues to talk about it to this day) but there is no doubt the publication of the data obtained from both NIST MIX studies is long overdue. I first heard the data presented at an ABA annual Prescriptions for Criminal Justice Forensics conference in NY at Fordham Law School in 2014. The audience was primarily attorneys, mostly defense. And by the spring of 2015 Texas was embarking on its own internal review. NIST has presented the data via PowerPoint in various forums (you can google to see examples) but there is no replacement for publication.
One of the most respected DNA forensic scientists in the world, Peter Gill, said words to the effect that if you showed a DNA mixture to ten DNA forensic workers, you would probably end up with ten different answers. He said it some years ago, and it is quoted in Erin Murphy's book on DNA. One might have hoped that the situation would have improved with time.
"Regarding the [absence of] wearing of gloves when handling smears, while on-site, the 2008 ASCLD/LAB inspection team observed SWIFS serologists manipulating slides/smears without gloves. If as the complaint alleges, smears were routinely analyzed for DNA evidence where the possibility of contamination could become an issue, wearing gloves while handling smears would become standard laboratory practice [so, as such, SWIFS analysts don't wear gloves]....Smears [from a Sexual Assault Kit, SAK] are not processed in a way that is intended to preserve them for later DNA testing."
-Dr. Jeff Barnard, Chair, Texas Forensics Science Commission
Thank you for your work on the blog! You're doing a good job!
"PhD Assignment Help Service
Instant Assignment Help
Online Assignment Help
PhD topic selection help"
For once I agree with the ad bot
The headline for this story is ... Two thirds of crime labs falsely accused innocent people in blind study.
Are you sure? Crime labs falsely accused?
Crime labs in the United States don't accuse anyone of anything ... EVER.
It might be best in a criminal justice blog not to use a lie to trap people into reading an article.
Very disappointing.
If my only problem was criminal justice reform champions using the full breadth of the English language to title an article that clearly articulates the issue, I'd be living good... Unfortunately I've got bigger problems: crime labs falsely identified suspects 66% of the time and the results were apparently concealed.
#priorities
How dare Forbes claim that our own cells framed and imprisoned innocent people! I didn't know that cells were so vindictive or judicial! If only our cells didn't have qualified immunity.
Lighten up, Gary.
And more than one lowly lab analyst has been accused/scapegoated by lab supervisors covering up for their own incompetence. See, Cecily Hamilton.
Crime Labs (and their oversight agencies) are shameful and without accountability. Right, Lynn?
For the more educated...
https://nvlpubs.nist.gov/nistpubs/ir/2018/NIST.IR.8225-draft.pdf
In order to avoid contamination, one has to change gloves between each sample.
@Chris Halkides-
Exactly. But one has to be wearing gloves, first.
Not to mention that the evidence was collected and handled previously by doctors or nurses who WERE wearing gloves before the evidence arrived at the crime lab. No telling what kind of biological hazard was transferred from the trauma unit (for the sexual assault examination) to the evidence. Hepatitis? HIV?
Moreover, Exonerees Larry Fuller, James Waller, James Lee Woodard, and Rickey Dale Wyatt all had evidence handled, analyzed, and stored (long term, at room temperature) at Dr. Barnard's Crime Lab (SWIFS) in Dallas. The particular item of evidence that was re-analyzed, DNA tested, and key to their exonerations was...smears/slides from sexual assault kits.
So not only was Dr. Barnard's statement blatantly false (presented to cover-up incompetence), it has the more malicious intent to discourage the wrongfully convicted from seeking that type of evidence from his crime lab. If the evidence is not handled properly or stored correctly for later DNA testing, then why would a defendant waste time, money, and resources to find it and have it DNA tested? He/she remains in prison.
Rampant contamination inside that lab.
I am not sure that I can bring much clarity to this discussion of the NIST mixture study because it seems to have gone off the rails somewhat. But I will give it a shot.
The one sample in the study that people seem most concerned about – the MIX13 Case 5 sample, the one where 2/3 of labs “incorrectly” included a “person” (reference 5C) who was not an actual contributor – was an entirely synthetic sample, purposely designed to test the range and limits of labs’ interpretation protocols when applied to extremely challenging mixture samples. It was not intended to mirror a typical real-life situation; it was a challenge sample reflecting an unusual, atypical scenario that would rarely if ever be encountered in actual casework.
There were three principal elements to the design of this test sample.
Firstly, the sample was designed to be a 4-person (1:1:1:1) DNA mixture that looked like a 2-person mixture from the perspective of number of alleles per locus. In the data printouts that were provided to labs for interpretation, there were no more than 4 alleles at any locus. So, if a lab relied only on allele counting to determine the number of contributors, it would have incorrectly determined that the mixture was from 2 people. On this particular point, none of the labs was fooled. All labs correctly recognized that the profile was a mixture of DNA from more than 2 people because they correctly took peak height information into account when performing this part of the interpretation.
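The allele-counting heuristic described above can be sketched in a few lines. This is a hypothetical illustration only, not NIST's or any lab's actual protocol; the locus names are real STR loci, but the allele calls are invented for the example:

```python
import math

# Hypothetical per-locus allele calls from a mixture electropherogram.
# Allele values are made up for illustration.
mixture_alleles = {
    "D3S1358": {14, 15, 16, 17},   # 4 alleles observed
    "vWA":     {16, 17, 18},       # 3 alleles observed
    "FGA":     {21, 22, 24, 25},   # 4 alleles observed
}

def min_contributors_by_counting(profile):
    """Naive lower bound: each diploid contributor adds at most 2 alleles
    per locus, so a locus showing n alleles needs at least ceil(n / 2)
    contributors. Allele sharing makes this only a lower bound."""
    return max(math.ceil(len(alleles) / 2) for alleles in profile.values())

print(min_contributors_by_counting(mixture_alleles))  # 2
```

With no more than 4 alleles at any locus, counting alone can never indicate more than 2 contributors, which is why the labs in the study had to use peak-height information to recognize that the sample was really a 4-person mixture.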
The second element of the design that was created to make the problem difficult was the profile of reference 5C. 5C was not an actual person but a synthetic DNA profile that was purposefully constructed so that it overlapped almost completely with the DNA profile of the MIX13 Case 5 mixture. In fact, there was only 1 difference at 1 locus (Penta E). So, a lab’s ability to correctly exclude Reference 5C depended upon the lab’s correct interpretation of the Penta E locus. By design, at all other loci besides Penta E the mixture profile included all of Reference 5C’s alleles, and the correct interpretation would have been to include Reference 5C as a possible contributor because there was no explicitly exclusionary information.
The third design element that was included to make the problem challenging was the data that was provided to labs. Two sets of data printouts were provided. The first set of data was produced using one commonly-used commercial test kit, the Promega Powerplex 16 HS kit. The second set of data was produced using a second commonly-used commercial test kit, the Applied Biosystems Identifiler Plus kit. Labs could use either data, and it was expected that a lab would use the data that corresponded to the test kit that it actually used in casework. The intentional design element that was put in to make the interpretation challenging was that the Penta E locus – the only locus that could be used to explicitly exclude Reference 5C as a contributor – is only in the Powerplex 16 HS; it is not included in the Identifiler Plus kit. So, a laboratory that used the Identifiler Plus data (which does not include Penta E) would find that at each locus in the mixture, the alleles of Reference 5C would be observed. So, the ability of a lab to correctly exclude 5C as a contributor to the mixture hinged upon using the Powerplex 16 HS data, and not using the Identifiler Plus data.
So, by design, the test was expected to produce a large number of incorrect responses in which 5C would be included as a possible contributor. As it turned out, the design worked: 90 of the 108 participating laboratories used the Identifiler Plus data, where it was not possible to exclude 5C as a contributor because the data did not include the Penta E locus. Only 18 labs used the Powerplex 16 HS data, where it was possible to correctly exclude 5C.
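The kit-dependent exclusion logic described above can be sketched as follows. The allele values here are invented and the inclusion rule is deliberately simplified (any reference allele absent from the mixture triggers exclusion); only the structure mirrors the study's design, in which reference 5C differs from the mixture solely at Penta E:

```python
# Hypothetical allele sets; values are illustrative only.
mixture = {
    "D8S1179": {12, 13, 14, 15},
    "TH01":    {6, 7, 9, 9.3},
    "PentaE":  {7, 11, 12, 14},    # Penta E is typed by PowerPlex 16 HS only
}
reference_5C = {
    "D8S1179": {13, 14},
    "TH01":    {7, 9},
    "PentaE":  {10, 12},           # allele 10 is absent from the mixture
}

def can_exclude(mixture, reference, kit_loci):
    """Exclude the reference if, at any locus the kit actually types,
    it carries an allele not observed in the mixture."""
    return any(
        not reference[locus] <= mixture[locus]   # <= is subset test
        for locus in kit_loci
    )

powerplex_loci   = {"D8S1179", "TH01", "PentaE"}
identifiler_loci = {"D8S1179", "TH01"}           # no Penta E

print(can_exclude(mixture, reference_5C, powerplex_loci))    # True
print(can_exclude(mixture, reference_5C, identifiler_loci))  # False
```

The same reference and the same mixture yield opposite answers depending only on which loci the kit types: with Penta E in view the mismatch is visible and 5C is excluded; without it, every 5C allele appears in the mixture and there is nothing on which to base an exclusion.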
From the perspective of the goals of the study – to investigate the range and limits of labs’ interpretation protocols – the MIX13 Case 5 sample was a good sample. It showed very clearly that there was a range of interpretation protocols, and that some protocols were robust enough to handle this difficult sample while other protocols were not able to handle it.
But, as other commenters have noted, there is nothing new about the observation that there is lab-to-lab variability in mixture interpretation. Some of the variability comes from some labs doing it wrong. But some of the variability also comes from the fact that there are multiple correct ways to do mixture interpretation. The study adds some details to the understanding, and it is useful in that regard. But there is no fundamentally new understanding that comes out of this study. I know personally that for the last 20 years I have been telling prosecutors and defense attorneys alike that different labs will interpret difficult mixtures very differently, and that getting multiple opinions may be of value in particular cases.
While some people may be late in coming to this party, the party has been going on for a long time.
Post a Comment