The largest-ever black-box study on the accuracy of bloodstain pattern analysis (BPA), a widely used forensic technique, has found concerning error rates and disagreement among analysts.

The study, published in the August 2021 issue of Forensic Science International, is the most rigorous attempt so far to measure the accuracy and reproducibility of BPA, in which analysts interpret bloodstains at crime scenes.

"Our results show that conclusions were often erroneous and often contradicted other analysts," the report found. "Both semantic differences and contradictory interpretations contributed to errors and disagreements, which could have serious implications if they occurred in casework."

For the study, researchers collected 192 examples of blood spatters from controlled samples and actual casework and presented pictures of them to 75 practicing BPA analysts for classification.

"On samples with known causes, 11.2 percent of responses were erroneous," the study found. "The results show limited reproducibility of conclusions: 7.8 percent of responses contradicted other analysts."

BPA is one of several forensic disciplines that have come under increased scrutiny over the last decade, along with other methods—such as bite mark, hair, and shoe print analysis—that do not have established error rates and rely on pattern matching or subjective interpretation. Yet these methods are widely accepted in courtrooms across the country, despite concerns over their reliability and a number of wrongful convictions.

In 2018, ProPublica published a series of investigative stories on the history and use of BPA. It found a disturbing pattern of questionable casework, exonerations, and investigators testifying in court with no more than 40 hours of BPA training. (For the story, the reporter herself completed a 40-hour class on BPA.) It also noted that there were few scientific studies on the reliability of the method.

A 2009 National Academy of Sciences study was, at the time, the most extensive done in the U.S. on the scientific validity of several commonly used forensic techniques, and it was critical of blood spatter analysis.

"In general, the opinions of bloodstain pattern analysis are more subjective than scientific," the study said. "Extra care must be given to the way in which the analyses are presented in court. The uncertainties associated with bloodstain pattern analysis are enormous."

The Justice Department came under pressure to improve forensic standards after the FBI admitted in 2015 that two dozen examiners in one of its hair analysis labs had given flawed testimony in hundreds of cases. In those cases, 32 defendants were sentenced to death; 14 were eventually executed or died in prison.

A 2016 report by the President's Council of Advisors on Science and Technology (PCAST) found that reviews of several commonly used forensic methods "have revealed a dismaying frequency of instances of use of forensic evidence that do not pass an objective test of scientific validity."

In the case of bite mark evidence, for example, the report stated that "available scientific evidence strongly suggests that examiners not only cannot identify the source of a bite mark with reasonable accuracy, they cannot even consistently agree on whether an injury is a human bite mark."

However, both the Obama and Trump administrations resisted calls to improve forensic standards. The Obama Justice Department rejected PCAST's recommendations to require expert witnesses to disclose error rates in their testimony and to stop using methods that have not been scientifically validated.

In 2017, Attorney General Jeff Sessions disbanded the National Commission on Forensic Science, an independent panel of scientists, law enforcement, judges, and defense attorneys created by the Obama administration in 2013 to review the reliability of forensic science used in trials.

The authors of the most recent study on BPA noted that it differed from actual casework, where analysts have additional context from the crime scene, and that the majority of respondents almost always reached the correct conclusion, suggesting that multiple independent verifications could help catch errors. Still, the authors recommended standardizing BPA methodology and terminology to reduce contradictory interpretations.

"Although the error and reproducibility rates measured here should not be taken to be precise measures of operational error rates," the study said, "their magnitude and the fact that they corroborate the rates measured in previous studies, should raise concerns in the BPA community."

They should raise concerns for defendants, too.

The study was conducted by researchers at the private Virginia-based firm Noblis, the Kansas City Police Department Crime Laboratory, and Indiana University.

C.J. CIARAMELLA

Adam Lee Nemann
Trial and Defense Attorney, Adjunct Professor of Law at Capital University, founder of Nemann Law Offices