
FBI: Scammers Are Interviewing for Remote Jobs Using Deepfake Tech

The scammers may be trying to score jobs at IT companies in order to access customer or financial data, corporate IT databases and/or proprietary information, the FBI says.

By Michael Kan
June 28, 2022

Scammers have been exploiting deepfake technology to impersonate job candidates during interviews for remote positions, according to the FBI. 

The FBI said in a public advisory on Tuesday that it has recently seen an increase in complaints about the scam. Fraudsters have been using both deepfakes and personal identifying information stolen from victims to dupe employers into hiring them for remote jobs. 

Deepfakes involve using AI-powered programs to create realistic but phony media of a person. In the video realm, the technology can swap a celebrity’s face onto someone else's body. On the audio front, the programs can clone a person’s voice, which can then be made to say whatever the creator likes. 

The technology is already being used in YouTube videos to entertaining effect. However, the FBI’s advisory shows deepfakes are also fueling identity theft schemes. “Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants,” the FBI says. 

The scammers have been using the technology to apply for remote or work-from-home jobs at IT companies. The FBI didn’t clearly state the scammers' end goal, but the agency noted that “some reported positions include access to customer PII (personal identifying information), financial data, corporate IT databases and/or proprietary information.” 

Such info could help scammers steal valuable details from companies and commit other identity fraud schemes. In some good news, though, the FBI says there’s a way employers can detect the deepfakery. To secure the jobs, scammers have been participating in video interviews with prospective employers, and the FBI noted that the AI-based technology can still show flaws when the scammer is speaking.

“The actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking,” the agency said. “At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”


About Michael Kan

Senior Reporter

I've been with PCMag since October 2017, covering a wide range of topics, including consumer electronics, cybersecurity, social media, networking, and gaming. Prior to working at PCMag, I was a foreign correspondent in Beijing for over five years, covering the tech scene in Asia.
