Imagine. The New York Times reported a few days ago that FBI hair-sample analysts gave erroneous testimony, based on probability assessments of forensic hair samples, in 96% of the cases reviewed (up until 2000). Sometimes the hair wasn’t even human; it might have come from a dog, a cat, or a fur coat! I posted on the unreliability of hair forensics a few years ago. The forensics of bite marks aren’t much better.[i] John Byrd, a forensic analyst and reader of this blog, commented at the time: “At the root of it is the tradition of hiring non-scientists into the technical positions in the labs. They tended to be agents. That explains a lot about misinterpretation of the weight of evidence and the inability to explain the import of lab findings in court.” DNA is supposed to cure all that. So does it? I don’t know, but apparently the FBI “has agreed to provide free DNA testing where there is either a court order or a request for testing by the prosecution.”[ii] See the FBI report.
Here’s the op-ed from the New York Times from April 27, 2015:
The odds were 10-million-to-one, the prosecution said, against hair strands found at the scene of a 1978 murder of a Washington, D.C., taxi driver belonging to anyone but Santae Tribble. Based largely on this compelling statistic, drawn from the testimony of an analyst with the Federal Bureau of Investigation, Mr. Tribble, 17 at the time, was convicted of the crime and sentenced to 20 years to life.
But the hair did not belong to Mr. Tribble. Some of it wasn’t even human. In 2012, a judge vacated Mr. Tribble’s conviction and dismissed the charges against him when DNA testing showed there was no match between the hair samples, and that one strand had come from a dog.
Mr. Tribble’s case — along with the exoneration of two other men who served decades in prison based on faulty hair-sample analysis — spurred the F.B.I. to conduct a sweeping post-conviction review of 2,500 cases in which its hair-sample lab reported a match.
The preliminary results of that review, which Spencer Hsu of The Washington Post reported last week, are breathtaking: out of 268 criminal cases nationwide between 1985 and 1999, the bureau’s “elite” forensic hair-sample analysts testified wrongly in favor of the prosecution, in 257, or 96 percent of the time. Thirty-two defendants in those cases were sentenced to death; 14 have since been executed or died in prison.
The agency is continuing to review the rest of the cases from the pre-DNA era. The Justice Department is working with the Innocence Project and the National Association of Criminal Defense Lawyers to notify the defendants in those cases that they may have grounds for an appeal. It cannot, however, address the thousands of additional cases where potentially flawed testimony came from one of the 500 to 1,000 state or local analysts trained by the F.B.I. Peter Neufeld, co-founder of the Innocence Project, rightly called this a “complete disaster.”
Law enforcement agencies have long known of the dubious value of hair-sample analysis. A 2009 report by the National Research Council found “no scientific support” and “no uniform standards” for the method’s use in positively identifying a suspect. At best, hair-sample analysis can rule out a suspect, or identify a wide class of people with similar characteristics.
Yet until DNA testing became commonplace in the late 1990s, forensic analysts testified confidently to the near-certainty of matches between hair found at crime scenes and samples taken from defendants. The F.B.I. did not even have written standards on how analysts should testify about their findings until 2012.
If the early results of the new review are any indication, there are many more wrongful convictions waiting to be discovered. In the District of Columbia alone, three of the seven men whose convictions involved erroneous hair-sample testimony have already been exonerated. That should be no surprise, since it is hard to imagine that juries would discount the testimony of an F.B.I. forensics expert with years of experience.
The difficulty now is in identifying and addressing the remaining cases quickly and thoroughly. Most of the convictions are decades old; witnesses, memories, and even evidence may be long gone.
And courts have only made the problem worse by purporting to be scientifically literate, and allowing in all kinds of evidence that would not make it within shouting distance of a peer-reviewed journal. Of the 329 exonerations based on DNA testing since 1989, more than one-quarter involved convictions based on “pattern” evidence — like hair samples, ballistics, tire tracks, and bite marks — testified to by so-called experts.
While the F.B.I. is finally treating this fiasco with the seriousness it deserves, that offers little comfort to the men and women who have spent decades behind bars based on junk science.
Of course, flawed hair testimony doesn’t mean the person convicted was innocent, only that the forensic analysis of the hair was wrong. It would be interesting to know in how many cases the other evidence was quite strong, so that the hair analysis wasn’t pivotal. There’s an interesting graphic in the report with a little more information.
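A side note on that “10-million-to-one” figure in the op-ed: even taken at face value, a tiny random-match probability is not the probability that the defendant is innocent; conflating the two is the classic prosecutor’s fallacy. Here is a minimal sketch of the Bayesian arithmetic. The population size and match probability below are my own illustrative assumptions, not figures from the Tribble case or the FBI review:

```python
# Illustrative sketch of the prosecutor's fallacy in a
# "10-million-to-one" hair-match claim. All numbers are
# assumptions for illustration only.

def posterior_guilt(p_match_innocent, n_possible_sources):
    """Posterior probability that a matched person is the true source.

    Assumes the true source matches with certainty, each other
    candidate matches independently with probability
    p_match_innocent, and a uniform prior over candidates.
    """
    prior = 1.0 / n_possible_sources  # prior that this person is the source
    # Total probability of observing a match for this person:
    p_match = prior * 1.0 + (1.0 - prior) * p_match_innocent
    return prior / p_match  # Bayes' rule

# Claimed random-match probability: 1 in 10 million.
p = 1e-7

# If, say, a million people could plausibly have left the hair,
# the posterior is only about 0.91 -- far from "10-million-to-one"
# odds against anyone else being the source.
print(posterior_guilt(p, 1_000_000))
```

The point of the sketch is that the reported odds describe the chance of a coincidental match for one random person, not the weight of the evidence against the matched defendant once you account for how many people could have been the source.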
Addition: I found an article in the Guardian that describes some cases that turned on hair analysis (notice the big picture of a cat), but I don’t know whether the men have been exonerated (surely too soon to know): “Thirty years in jail for a single hair: the FBI’s ‘mass disaster’ of false convictions”. Please send me any relevant links.
[i]From the Washington Post (“A Brief History of Forensics”): “Bite mark matching is an excellent example. As part of its credentialing exam, the American Board of Forensic Odontology asks test takers to match bite marks to the dental mold of the person who created them. But it’s notable that the test doesn’t require actually matching the correct bite marks to the correct biter. Instead, test takers are evaluated on their analysis.”
[ii] Oh no, I now read that a “National accreditation board suspends all DNA testing at D.C. crime lab”. The lab failed an audit.
Another article I came across: Pseudoscience in the witness box
Filed under: evidence-based policy, junk science, PhilStat Law, Statistics