Florida mom warns of AI-powered FaceTime prank that cloned her daughter’s image and voice

When Erika Anderson answered the call from her 17-year-old daughter, it sounded and looked just like her, down to her school shirt and the way she spoke.

JACKSONVILLE, Fla. — A Jacksonville mother is warning parents about a terrifying AI-powered prank that used her daughter’s voice and image in a spoofed FaceTime call.

Erika Anderson, a lifelong Jacksonville resident, said she was working at home last month when a FaceTime call popped up from her 17‑year‑old daughter’s contact.

“It’s her contact, and I’m like, what’s she calling me for because I know she’s in school,” Anderson said.

When she picked up, the voice sounded and looked just like her daughter, down to her school shirt and the way she spoke.

“She was like, ‘Hey mommy, open the door,’ and I’m like, ‘Open the door? What do you mean, open the door?’” Anderson said. “She was like, ‘I’m sick, I don’t feel well.’”

Suspicious, Anderson asked the caller what the family had eaten for dinner the night before.

“My mommy’s brain kicked into play like something is not right,” she said. “I started asking her all the questions that she would know the answer to, and the person was like, ‘Oh my God, mommy, I don’t know the answer, I don’t know, just open the door, oh my God.’”

Anderson hung up, checked her home security cameras and saw no one at the door.

She called her daughter’s school, where staff confirmed the teen was in class taking a test and had her phone locked away.

“I’m scared,” Anderson said. “I go get them from school.”

She later learned the call was a prank using an AI‑driven app that cloned her daughter’s voice and image.

Police were contacted, and the family of the teenager behind the prank apologized. Anderson chose not to press charges but has since tightened security at home.

“We have now made code words,” she said. “I changed the alarm in my house. I changed the camera angles. As a matter of fact, we added two more cameras.”

Anderson posted the story on TikTok. It has racked up nearly 20 million views, with thousands of viewers saying they, or close relatives, have experienced the same AI‑powered FaceTime prank.

“All these people start flooding my comment section saying this happened to me, this happened to my mom, this happened to my sister, this just happened to my cousin, this is real, and I’m like, but how?” she said.

Law enforcement said these kinds of scams are becoming more common as generative AI tools become easier to access.

The Jacksonville Sheriff’s Office said it is seeing a surge in reported cybercrimes, including AI‑facilitated spoofs, and has assigned detectives in its Economic Crimes Unit to investigate such cases.

The FBI echoed that message, advising the public to verify the identity of anyone calling or messaging them, even if the voice and image appear genuine. Officials urge people to hang up, independently call the person back on a known number, and create a family code word that is never shared online.

The FBI also recommends looking for subtle imperfections in video or images, such as distorted hands or feet, unrealistic teeth or eyes, and inaccurate shadows or watermarks, and listening for voices that sound just a little off.

Florida Gov. Ron DeSantis spoke at a news conference Thursday, warning that AI‑generated content can blur the line between real and fake.

“Do we live in a real reality, or do we live in a fake reality?” he said. “It’s hard to tell the difference, and they can do it in ways that could potentially be harmful to you.”

Anderson said she wants anyone thinking about an AI prank like this to stop.

“To go around and plant fear, fake fear into people, that is not called for,” she said. “Please don’t because you don’t understand the damage that you do mentally.”
