Tuesday, October 30, 2012

A&M statistics prof helped debunk junk science on ballistics

"It takes a village to take down bad forensic science," said A&M statistician Cliff Spiegelman, who was an ardent opponent of a method of forensic testing called Comparative Bullet-Lead Analysis (CBLA), which, partly through his work, the FBI discredited in 2007. The abandoned technique, which used chemistry to link bullets from a crime scene to those owned by a suspect, was first used following the 1963 assassination of President John F. Kennedy. According to Texas A&M:

Spiegelman's interest in statistical forensics was sparked in 2002, when, because of his expertise in statistics in chemistry, he was appointed to serve on a National Research Council (NRC) panel to study bullet lead evidence. During the meetings, he would step out to inject himself in the stomach with a high dose of interferon as part of a difficult chemotherapy treatment.

Spiegelman's doctor gave him a 50 percent chance of living. Instead of quitting the panel to focus on his treatment, Spiegelman immersed himself in the work with the stark realization that it could be his last professional act.

The treatment was a success, and while he overcame the threat to his life, his passion for statistical forensics remained.

In 2008, Spiegelman was a co-recipient of a prestigious national award for leading a team that published a paper finding that forensic evidence used to rule out the presence of a second shooter in President Kennedy's slaying was fundamentally flawed. He shared the American Statistical Association's 2008 Statistics in Chemistry Award with Simon Sheather, professor and head of the Texas A&M Department of Statistics; William D. James, a researcher with the Texas A&M Center for Chemical Characterization and Analysis (CCCA); and three other co-authors.

The paper showed that the bullet fragments involved in the assassination were not nearly as rare as previously thought, and that the likelihood that not all the fragments came from the same batch of bullets was also greater than previously thought.

"There's always the chance of error," he said. "So, for instance, if a hair from the defendant is similar to one found at the crime scene, the issue is, what is the frequency of hairs that are similar in the general population? Ninety percent? Ten percent? One percent? The relevance of the evidence is based in part on how common it is. And that's a statistical issue."

Spiegelman is not a Kennedy assassination buff, but says the Kennedy case is the ultimate example: if the science could be wrong in a case with intense public interest and with the government having all the resources it needed, then it certainly could be -- and often has been -- wrong in much more low-profile cases.
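The frequency argument Spiegelman makes can be sketched with a toy calculation. The numbers below are hypothetical illustrations, not figures from the article: if a trait (a "matching" hair, or a bullet-lead composition) occurs in a fraction p of unrelated sources, one common way to express the strength of a reported match is a likelihood ratio, which under the simplifying assumption below is just 1/p.

```python
# Hypothetical illustration of why the population frequency of a
# "matching" trait matters -- numbers are invented, not from the article.

def likelihood_ratio(match_probability: float) -> float:
    """LR = P(match | same source) / P(match | different source).

    Simplifying assumption: the test always reports a match when the
    source really is the same (numerator = 1), so the LR reduces to
    the reciprocal of the trait's frequency among unrelated sources.
    """
    return 1.0 / match_probability

# A trait shared by 90% of the population is nearly worthless as
# evidence; one shared by 1% is a hundred times more informative.
for p in (0.90, 0.10, 0.01):
    print(f"trait frequency {p:.0%}: likelihood ratio = {likelihood_ratio(p):.1f}")
```

Real forensic likelihood ratios also account for measurement error and imperfect test sensitivity; the point of the sketch is only that the evidential weight of a match depends directly on how common the matching trait is.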
Labels: ballistics, Forensic Errors
3 comments:
How is the Kennedy case wrong?
Why is everybody using Kennedy as an example of a messed-up verdict -- and why is a forensic expert using it as such? It's the ultimate example of a case well done! It's only continually being "debunked"... by junk science!
~L~
He didn't say the Kennedy case overall was wrong; in fact, the article says he takes no position. He said a particular ballistics test, first developed for use in that case (and later applied more widely), overstated the significance of its results. His is a very narrow critique: he's neither disputing nor confirming the lone-gunman theory.
Isn't it amazing, and a bit discouraging, how some people cannot read a simple declarative sentence without getting it wrong?