FBI Warns About Ransom Scams Involving Fake ‘Proof of Life’ Photos

(The Epoch Times)—Criminals are altering images of people obtained from social media or other public sites to create fake “proof of life” photos as part of virtual kidnapping for ransom scams, the FBI said in a public service announcement on Dec. 5.

According to the agency, criminal actors typically contact their targets via text message, claiming to have kidnapped a person close to them and demanding a ransom. The demands are often accompanied by threats of violence.

Upon close inspection, the photo or video will reveal inaccuracies, such as "missing tattoos or scars and inaccurate body proportions," the FBI said. The messages convey urgency and are sent using timed-message features so that family members do not have sufficient time to analyze the details.

Instead of reacting hastily, people who receive such a communication should stop and consider whether the kidnapper's claims "make sense," the notice said.

The agency advised people to always attempt to contact their loved ones before considering paying the ransom. A code word known only within a close circle can be crucial for verifying a loved one's identity.

Scammers running these fake-image ransom schemes also exploit missing-person information found online. The agency advised people to immediately take a screenshot or recording of any "proof of life" photos they receive.

AI-Powered Kidnapping Scams

In a June 16 statement submitted to a House committee hearing, JB Branch, a technology accountability advocate at the nonprofit Public Citizen, highlighted the risks posed by artificial intelligence (AI) tools.

“From phishing emails generated in perfect English to deepfake videos impersonating family members, AI tools are being weaponized in ways that exploit trust, erode safety, and overwhelm law enforcement,” Branch said.

“In the era of AI, a single tool can now produce thousands of personalized phishing attacks or clone a victim’s voice in seconds. This proliferation has quickly outpaced previous technology, and the models can adapt quickly.”

Branch said local police forces have issued warnings about a rise in virtual kidnapping scams that make use of cloned voices. He highlighted a case involving a mother from Arizona who was nearly duped by such a scam.

The incident took place in 2023, when the mother received a phone call in which a voice resembling her 15-year-old daughter's sobbed and pleaded for help. A man then came on the line, claiming the girl had been kidnapped.

The mother quickly confirmed that her daughter was safe at home and avoided being duped.

The FBI had previously warned about the use of generative AI tools by scammers to commit large-scale fraud. “Generative AI reduces the time and effort criminals must expend to deceive their targets,” the agency said in a December 2024 alert.

AI-generated content fuels not only kidnapping scams but also a host of other financial crimes. Earlier this year, a group of lawmakers introduced the Preventing Deep Fake Scams Act, which seeks to establish a task force to address bank scams fueled by AI, according to a Feb. 27 statement from the office of Rep. Brittany Pettersen (D-Colo.).

“Scammers are already using artificial intelligence to impersonate loved ones, steal personal information, and hack into bank accounts—putting Americans’ hard-earned money at risk,” Pettersen said.

“As AI continues to evolve, our policies must keep up in order to get ahead of scammers who want to use this technology to take advantage of people.”

The bill has been referred to the House Committee on Financial Services.