The jaw-dropping criminal charges against Raffaela Spone in March quickly dominated international headlines: The suburban Pennsylvania cheer mom was alleged to have created a “deepfake” video to secretly attack the other girls on her daughter’s cheerleading squad.
“While investigators originally believed at least one video showed evidence of the use of so-called Deep Fake face replacement technology, police are at this point unable to confirm the video evidence was falsified,” Bucks County District Attorney Matthew Weintraub said in a statement released by his office Friday.
The revelation deeply undermines a case that prosecutors and the press had held up as a showcase for the fearsome power of artificial intelligence — the disturbing way it could allow even a 50-year-old parent to create a video showing someone doing something they actually hadn’t done.
The case has proved personally devastating for Spone, who found herself at the center of a media spectacle that left her feeling distraught and vilified. She has received death threats online and been subjected to “ridicule, embarrassment and harassment” at home, her attorney Robert J. Birch said.
“Her reputation right now is less than mud,” Birch told The Washington Post. “They have ruined her life, there’s no question about it. … On Twitter, they have already convicted her. She’s always going to be labeled the ‘deepfake mom’ or … a ‘criminal mastermind.’ How do you dig out of that?”
“I went in the car and started crying and was like, ‘That’s not me on video,’” Hime said in a televised ABC News segment about the case. “I thought if I said it that no one would believe me, because obviously, there’s proof, it’s a video. But the video was obviously manipulated.”
But while the ABC news segment clearly labeled the video “DEEP FAKE,” synthetic-media researchers were much more suspicious. The footage carried none of the signatures that typically give away AI-generated videos, such as artifacts around the eyes or face, and it included photorealistic details that would be extremely difficult to fake, including the hazy cloud of vapor that Hime appeared to exhale.
The Post had been asking prosecutors since March to provide it with any video or images entered as evidence in the case, but they had always declined to do so. As recently as Tuesday, Weintraub had told The Post that prosecutors “still have some evidence to gather from outside sources, and are trying to make arrangements for the defendant’s expert to view and analyze the phone.”
Three digital-forensic experts to whom The Post showed the broadcast video in recent weeks said it was “highly unlikely” to be a deepfake and appeared “blatantly authentic.” But they also noted that the poor video quality and the lack of other evidence made it impossible to draw any firm conclusions.
They did, however, raise one possible explanation: With deepfakes increasingly capturing public attention, they are also more likely to become scapegoats for real offenses caught on camera. The law professors Danielle Citron and Robert Chesney coined the phrase “liar’s dividend” in 2018 to describe this concept, saying deepfakes could make it easier for people “to avoid accountability for things that are in fact true.”
“This may very well be an interesting case of ‘That’s not me, that’s a deepfake,’” Hany Farid, a University of California at Berkeley professor who specializes in visual analysis, told The Post last month.
Parents named in the case have not responded to emails or phone messages seeking comment in recent weeks. The Hilltown Township police officer who investigated — writing in a criminal complaint that the “doctored” video had been found “to be the work of a program that is or is similar to ‘Deep Fakes’” — has not been made available for comment and did not testify at the hearing Friday.
Three girls on Spone’s daughter’s cheerleading squad, the Victory Vipers in Doylestown, Pa., told police last year that they had received harassing text messages and digitally altered photos of them from an unknown number, officers said in a March criminal complaint.
But Birch, Spone’s attorney, said Spone never altered any images and instead sent messages to the parents out of concern for the girls on her daughter’s team. Some of the messages pointed out potentially sensitive material the girls had posted to their own social media accounts.
Birch has called the case a “sloppy, sloppy police investigation,” arguing that Spone did not have the technical knowledge or ability to craft convincing deepfakes, and that her main piece of computing hardware was an old iPhone 8.
Neither Spone nor her attorney had been shown any of the alleged deepfake videos at the heart of the case, said Birch, who also filed a motion last month calling for Spone’s phone to be returned. In interviews with The Post, he forcefully questioned the legal argument that had led his client to face criminal charges.
Prosecutors said Friday they still intend to argue that Spone harassed the girls and attempted to sully their reputation by sending images of them from an anonymous number — one of which was said to have been “manipulated” to make one girl “falsely appear to be unclothed in a public place.” A judge on Friday ordered the case to trial.
Weintraub’s office said in its statement Friday that “while the original assessment that some of the evidence was created by media manipulation may not end up being accurate, a neutral finder of fact will ultimately have the opportunity to determine if the evidence in this case shows Ms. Spone sent photos, videos and texts designed to harass three innocent children.”
The criminal case and media spectacle may have fueled misconceptions around how easy, lifelike or prevalent such deepfake attacks really are, said Henry Ajder, a U.K.-based expert adviser on deepfakes. It “showed a real pile-on for a juicy story without critical reflection of what constituted that claim.”
But they also probably spotlighted a growing threat in the years ahead: how an erosion of trust in video could lead to more fakes, more false accusations and more “massive consequences” for those accused.
“From a basic analysis, it was very obvious,” Ajder added. “But this case is a portent of things to come as deepfakes become more sophisticated and more realistic. It may not be so easy to make that distinction going forward. And in that situation, your everyday police officer isn’t going to be able to tell the difference.”
By Drew Harwell