Thursday, October 29, 2015

Forensic fails, and other stories

As your correspondent prepares for today's Exoneration Commission hearing, here are a number of items which likely won't make it into individual blog posts but which merit Grits readers' attention:

No clear way to track down junky bite-mark cases
The Dallas News has a high-powered team - Brandi Grissom and Jennifer Emily - covering the Steven Chaney bite mark case and the Forensic Science Commission review of bite mark cases. They reported Monday that "Tracking down dozens — maybe hundreds — of other potentially innocent victims of junk science won’t be ... easy. There is no central repository of cases in which bite-mark testimony was key. There’s no database of dentists who testified about bite marks. And the cases are mostly decades old, and experts, defense lawyers and prosecutors have moved on or died."

Bad Ballistics?
The Texas Court of Criminal Appeals ordered an examination into overstated ballistics testimony from an expert in Arthur Brown, Jr.'s 22-year-old capital murder trial, in which he was prosecuted along with an already-executed accomplice for a quadruple killing in a botched drug transaction. Reported the Houston Chronicle:
Brown was scheduled for execution in October 2013 but received a stay to allow for forensic testing of evidence. An accomplice, Marion Dudley, 33, was executed in 2006.

On Wednesday, the appeals court judges acted on Brown's November 2014 appeal in which he asserted that Houston Police Department ballistics expert C.E. Anderson "testified falsely or in a materially misleading manner" in his case. Judges held that the claim met state standards warranting review.
See the court's order.

Paging Antonin Scalia: On the right to confront the boss of your accuser
The Court of Criminal Appeals also ruled that the requirements of the Sixth Amendment's "Confrontation Clause" may be met for "batch DNA testing" by a crime lab supervisor testifying based on computer printouts instead of the lab workers who conducted the analysis. Judge David Newell is of course correct that neither the CCA nor SCOTUS has ever "squarely answered this question." SCOTUS said that sworn affidavits are insufficient, but not whether a supervisor can testify based on the work of her subordinates. But one certainly wonders what Antonin Scalia might say about it. It's hard for this non-attorney to understand why relying on data generated by non-testifying lab analysts is different from relying on an affidavit to which they did not testify.

Are black-box calculations problematic for DNA mixtures?
Next year, the Department of Public Safety and most other Texas labs will shift to "probabilistic genotyping" to analyze DNA mixture evidence, a method said to be superior even to the adjusted calculations currently available. However, the method relies on proprietary, black-box programs whose makers will not release their code - similar to the situation surrounding proprietary breathalyzer algorithms, which accuse defendants based on computer code that their attorneys and the courts can neither see nor evaluate. See good discussions of the topic from Slate and Ars Technica. Moreover, as Grits reported earlier, because of the nature of the calculations, results based on probabilistic genotyping will be different every time - they're not replicable, in addition to not being transparent. So, while the fact that the DWI equipment is still in use makes me think Texas courts would ultimately find a way to allow this sort of proprietary opacity, Grits continues to wonder whether probabilistic genotyping is the best tool for the job when it comes to providing courtroom testimony, given that other methods are available whose calculations are both transparent and replicable.
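To make the replicability point concrete, here's a toy Monte Carlo simulation - a deliberately simplified stand-in of my own devising, not the genotyping software's actual algorithm. Any estimate built from random draws varies from run to run unless the random seed is pinned down, whereas a seeded run is bit-for-bit repeatable:

```python
import random

def monte_carlo_estimate(n_samples, seed=None):
    """Toy Monte Carlo estimate of P(X > 0.9) for X uniform on [0, 1].

    A vastly simplified stand-in for the MCMC runs inside probabilistic
    genotyping software: any estimate built on random draws varies from
    run to run unless the random seed is fixed.
    """
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples) if rng.random() > 0.9)
    return hits / n_samples

# Two unseeded runs on the same "evidence": almost certainly different numbers.
a = monte_carlo_estimate(10_000)
b = monte_carlo_estimate(10_000)

# Two runs with a pinned seed: bit-for-bit identical, i.e. replicable.
c = monte_carlo_estimate(10_000, seed=42)
d = monte_carlo_estimate(10_000, seed=42)
assert c == d
```

Both unseeded estimates hover near the true value of 0.1, but they are not the same number - which is the courtroom-facing distinction at issue here.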

Finding housing with a felony record
There's a good article in the Houston Chronicle on the struggles poor people have renting an apartment with a felony record. Houston is closing a dangerous low-income apartment complex run by a slumlord whom the city has sued over "atrocious living conditions." Reporter Emma Henchliffe decided to pay attention to what happened to the ousted tenants, finding that the ones with a felony record had a terrible time locating new places that would take them. "Individual owners have the right to accept and reject applications as they choose, but the lack of alternatives for tenants who do not meet owners' standards causes many former offenders to end up at places like Crestmont," she wrote. "Some apartments where [one tenant] applied took her application fee and never got back to her." When you consider the volume of felons Texas produces - we release more than 70,000 prisoners from TDCJ every year - this is a much more important issue than one would think from the amount of coverage it receives. I was glad to see this article.

No surprise: 'Stingrays' really do intercept content
Turns out, contrary to public assertions by law enforcement, Stingrays really can intercept the content of calls as well as cell-phone metadata. This was obvious to anyone who thought about it: the devices trick your phone into routing through a fake cell phone tower, so clearly they have access to the whole call, not just metadata. And experts have been telling us this for a while. Still, it's nice to see it confirmed - one less thing to argue about.

NOTE: A brief item about a murder case in Denton was removed from this roundup after a commenter informed me that the underlying news article improperly attributed a court action to DNA mixture protocols when the real issue was unrelated. Grits apologizes for the error and will perhaps revisit the topic when more accurate information is available.


  1. Related, also from Slate:

  2. The Robert A. Otteson case in Denton is not a DNA or DNA mixture case. This is local media writing a story without knowing the facts. The evidence deals with fingerprints.

  3. Thanks, David. I'm going to remove that one if the underlying media report is fundamentally wrong. Please email me actual details if you've got them at

  4. The stingray story must be wrong. No LEO who swore an oath to protect and defend the Constitution would ever be a part of anything that violates the 4th Amendment in the SLIGHTEST.

  5. Regarding the "probabilistic genotyping" software, Grits states, "...because of the nature of the calculations, results based on probabilistic genotyping will be different every time - they're not replicable." The results of the calculations are replicable, with a degree of precision (uncertainty or margin of error) that can be characterized, understood, and communicated. In this sense, the systems are similar to any other measuring process utilizing a measuring device, because all measurement processes have an uncertainty component. Even a measurement with a device as mundane as a metal tape rule has an associated uncertainty value, which is generally ignored when working around the house but not when working in a lab. When the measurement process is a multistep process, with each of the steps having an individual uncertainty, the individual step uncertainties combine into a final overall uncertainty (for independent steps, in quadrature - the square root of the sum of the squared uncertainties - rather than a straight sum, which is the worst case).
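For what it's worth, the standard propagation rule for combining independent per-step uncertainties is root-sum-of-squares, not a straight sum; here's a minimal sketch with made-up numbers, illustrating the general rule rather than anything specific to the genotyping software:

```python
import math

def combined_uncertainty(step_sigmas):
    """Combine independent per-step standard uncertainties in quadrature
    (root-sum-of-squares), the standard propagation rule for independent
    error sources; a straight sum would be the worst case, not the norm."""
    return math.sqrt(sum(s * s for s in step_sigmas))

# Hypothetical example: three independent steps, each with 1.0 units of
# uncertainty, combine to sqrt(3), about 1.73 units - not 3.0.
total = combined_uncertainty([1.0, 1.0, 1.0])
```

The gap between 1.73 and 3.0 grows with the number of steps, which is why the distinction matters for multistep lab processes.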

  6. 5:59 again.

    Re the computer code issue and transparency issue.

    I suspect that this is more of a legal strategy matter, rather than a true interest in examining and evaluating the computer code that underlies the evidence. There is really no more of a black box issue with these systems than there is with the proprietary software that runs the cameras used to collect and store crime scene photographs. Or with the software used to make and store videos of interrogations. The acceptance of the reliability of these types of software-based information doesn't come from examination of the computer code, but from everyone's end-user evaluation of the product of the process. For instance, I don't need to evaluate the code of my camera's software to know that color rendition in a photograph doesn't precisely match what I see with my naked eye.

  7. Grits:

    To get fancy about it, probabilistic MCMC methods are processes to directly measure a posterior probability distribution space. The analogy to more familiar types of measurement process is appropriate and relevant. There are stochastic (i.e., random) components to every measurement process. MCMC methods are not special in that regard.

    And transparency and reproducibility alone do not a reliable process make. Tying a woman up and throwing her into a pond to determine if she is a witch by whether she floats or not is both a transparent and reproducible process.

    And I would think that you of all people would be highly supportive of this approach, since it will reduce the likelihood that innocent people will be convicted. The great value of the approach is that it more fully utilizes the available information, which reduces the false positive error rate.

  8. @6:51, the different numbers each time aren't due to measurement error at the margins; they arise because probabilistic genotyping relies on the Markov Chain Monte Carlo method, which uses a random number generator as part of its analysis. So the calculations are in fact different each time, and not at all because of imprecise measurement. Your comments are non-responsive to the actual mathematical questions at hand.

    Anyway, the question isn't whether the math is valid, it's what is the right tool to use in court, given that there are other options which are transparent and generate replicable results - not within a range, but run-a-calculation-and-get-the-same-number.

    For me, it's hardly a legal strategy issue - the failures of DNA mixture analysis have created a king-size mess. It's worth thinking through the reaction so the supposed fix doesn't just create more or different problems. Courts have different needs and requirements for the math they use than do Bayesian statisticians.

  9. Sorry, 7:47, I edited a typo in my comment and yours popped up while I was doing it, so it now comes after. Shoulda just left it.

    Regardless, I agree that "transparency and reproducibility alone do not a reliable process make." However perhaps you'll agree that a) the courts value those qualities and b) the alternative (CPI) is perhaps just a tad more valid than "Tying a woman up and throwing her into a pond to determine if she is a witch."

    And as for your last comment, I'm highly supportive of not using inaccurate math. I'm told the CPI method is valid when a stochastic threshold is applied. Probabilistic genotyping uses more data but has the black-box/replicability issues. Basically these are slightly different tools and, as the carpenter says, one chooses the right tool for the job. The courts will decide if and how much they care about transparency and replicability, but IMO those are issues which have been glossed over in public discussions. Mainly, I want them to pick a method that's stable and defensible so we don't have some similar SNAFU 5 years down the line.

  10. Also, the non-replicability issue as I understand it is not because of measurement error, it's because of the Random Number Generator. Saying it's analogous doesn't make it so. When I measure a piece of wood, I may not get the cut right down to the nanometer, but I don't inject random numbers into my calculations. Admittedly, I don't pretend to understand all the high-end Bayesian math behind the Markov Chain Monte Carlo method, but I know enough to tell you're vastly oversimplifying on that front.

  11. "Finding housing with a felony record" :

    BAN THE BOX !!!!!!!!

  12. 1) Good luck getting a lab analyst to explain MCMC statistics to a jury. I know lab analysts that couldn't remember to put on gloves before handling bloody evidence, let alone understanding and calculating statistical probabilities. Describing the probabilities behind a coin toss would be mind bending for these folk.
    2) Good luck getting a Prosecutor during summation to communicate accurately the statistics performed. Even if the prosecutor was describing the results based on a coin toss, "because there's a 99.999% chance that 'tails' will occur, the defendant is guilty."
    3) Let's not forget that the FBI's PopStat software was quasi-transparent. This "black box" contained incorrect probabilities due mostly to human error.
    Maybe Dr. Budowle is brilliant in this field, and maybe he believes that the erroneous "discrepancies require acknowledgment but are unlikely to materially affect any assessment of evidential value."

    The problem, however, is of an ethical nature. Why did it take 16 years to publish an erratum (or rather, to admit their mistakes)? The "black box" could have been assessed, should have been assessed, but wasn't.

    The sadly comical and ironic statement from the erratum comes after the references...

    Note Added in Proof
    After this Erratum was published online, errors in the text and in
    Tables 1 and 2 were identified. They have now been corrected.

    So let's dig into the "black boxes" now instead of waiting 16 years. The creators of this software knew they were going to be presented in court and would be subject to scrutiny (not unlike a cross-examined expert witness), so this type of examination should have been expected.

  13. "..Regarding the wearing of gloves when handling smears: While on-site, the 2008 ASCLD/LAB inspection team observed SWIFS serologists manipulating slides/smears without gloves. If as the complaint alleges, smears were routinely analyzed for DNA evidence where the possibility of contamination could become an issue, wearing gloves while handling smears would become standard laboratory practice. Such was not found to be the case in interviews with SWIFS forensic biology management [aka. Dr. Jeffrey Barnard, Director Dallas County Crime Lab and Texas Forensic Science Commissioner]..."

    Untruthful statements presented by ASCLD/LAB, 2010.

    1) Lab analysts never looked for contamination, therefore didn't find any.
    2) Smears have been used for DNA analysis, exonerating at least 3 wrongfully convicted in Dallas.

    No penalties for lying.

  14. FWIW, 2:16, when Budowle said changes are "unlikely to materially affect any assessment of evidential value," he was talking about the data entry errors they corrected, he wouldn't and doesn't say that about lab protocols that didn't use a stochastic threshold for CPI.

  15. The statement from Budowle sounds like something a Prosecutor would say in order to divert attention away from the Brady violation. Or it's something that the Corsicana fire inspector would state in 2004 after all the corrected scientific findings negated the erroneous fire evidence used to convict Willingham in 1993.

    It's not that Dr. Budowle is not mathematically correct (maybe...can one mathematically calculate evidential value?), but it's the fact that it took 16 years for the FBI to address the error. If it truly is "unlikely," why not make the correction 16 years ago, or at least 10 years ago when Dr. Dan Krane noticed the error and notified the FBI?

    As an analogy, wouldn't you be a bit irate if your bank screwed up the balance on your savings account 16 years ago, knew it was screwed up for the past 10 years, and just now got around to telling you? And maybe they'll look into correcting it. Who cares if you've been shorted a few thousand dollars from your savings account? The mistake is "unlikely" to affect your balance, maybe.

    To downplay forensic error, big or small, is disingenuous (but a mainstay for the TFSC) and a disservice to criminal justice.

  16. 10:48 appears to be conflating the NDIS forensic profile database with the FBI's population frequency database. These are two separate and distinct databases.

  17. @anony 4:02-

    "...Krane, who identified errors 10 years ago in the DNA profiles the FBI analyzed to generate the population statistics data, called the consequences of the disclosure appalling, saying the data has been used in tens of thousands or hundreds of thousands of cases worldwide in the past 15 years. He said when he flagged the problems a decade ago, the FBI downplayed his findings..."


    In a different story, about a different database,

    "...The Federal Bureau of Investigation, in a review of a national DNA database, has identified nearly 170 profiles that probably contain errors, some the result of handwriting mistakes or interpretation errors by lab technicians, while New York State authorities have turned up mistakes in DNA profiles in New York’s database..."


    They ARE two separate and distinct databases. Both are controlled by the FBI. Both are used by hundreds of crime labs across the country. Both have errors.

    Thanks for telling us something we already know. You should work for the Huffington Post.