FBI warns of increase in deepfake sextortion reports

Criminals are using deepfake technology to create fake explicit photos of victims and extort them for money, and in some cases for real explicit content.
Published: Jun. 9, 2023 at 1:08 PM MST|Updated: Jun. 9, 2023 at 8:41 PM MST

PHOENIX (3TV/CBS 5) — The FBI is warning of an increase in reports of sextortion with fake photos and videos created by artificial intelligence. The so-called “deepfakes” are becoming easier to create as technology rapidly improves.

According to the FBI’s alert, criminals are taking benign photos and videos from social media, video chats and dating sites, and then using those photos and videos to create explicit content. Then, the criminals threaten to release the fake photos and videos unless their victims pay up. In some cases, criminals will also demand real sexually explicit content in exchange for not sharing the AI images and videos.

“There’s no limits to where criminals will go to exploit individuals and make a profit. That’s the bottom line,” said Tim Roemer, a cybersecurity expert and the former director of Arizona’s Department of Homeland Security. “It’s unfortunate, but this is just the modern times of criminal enterprises.”

Roemer says it is critical to protect personal information, photos and videos, and that includes making social media accounts private. “We’re not telling you to live underneath a box — that’s not the advice. It’s just, where practical, limit the amount of information and especially photos you have of yourself and your family members out there,” he cautioned. “Let’s keep in close mind, children. A lot of times, there’s not a reason for us to be posting pictures of our kids online, so limit that to private accounts. That can really help protect them and, in turn, protect you from these criminals that want to exploit them in any number of different ways.”

The FBI says parents should regularly monitor their kids’ online activity and talk with them about the risks of sharing photos and videos. The agency also suggests running frequent searches of family members’ names, addresses and phone numbers to catch unwanted content online. The National Center for Missing & Exploited Children offers a free service called Take It Down, which can help people remove sexually explicit photos and videos of themselves that were taken before they turned 18, according to the organization.
