Project Hyphae

Criminals are Filling Tech Job Applicant Pools with Deepfakes


In March 2021, the FBI warned in a Private Industry Notification (PIN) that deepfakes (high-quality generated or manipulated video, images, text, or audio, often produced using artificial intelligence or machine learning) are getting more sophisticated by the day and will likely be leveraged broadly by foreign adversaries in “cyber and foreign influence operations.”

In April 2022, Europol warned that deepfakes could soon become a tool that cybercrime organizations use on a regular basis for CEO fraud, to spread misinformation, to tamper with evidence, and to create non-consensual pornography.

On Tuesday, the FBI warned of a rise in complaints that cyber criminals are using Americans’ stolen Personally Identifiable Information (PII) and deepfakes to apply for remote work positions. The public service announcement, published on the FBI’s Internet Crime Complaint Center (IC3), adds that the deepfakes used to apply for positions in online interviews include convincingly altered videos and/or images. The targeted remote jobs include positions in the tech field that would give the malicious actors access to company systems and confidential information once hired.

While some of the deepfake recordings are convincing, others can be detected easily thanks to sync mismatches, mainly in the spoofing of applicants’ voices. “Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants,” the FBI added. “In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.” Some victims who reported to the FBI that their stolen PII was used to apply for a remote job also said their pre-employment background check information had been used with other applicants’ profiles.
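The mismatch the FBI describes, lip movement that does not track the audio, can be checked programmatically. The sketch below is hypothetical and assumes you have already extracted two per-frame series from a recorded interview: a mouth-openness measure (from any face-landmark tool) and the audio loudness envelope resampled to the video frame rate; the function names and the 0.5 threshold are illustrative, not from the FBI notice.

```python
# Hypothetical lip-sync check: correlate per-frame mouth openness with the
# audio loudness envelope. Genuine speech tracks the audio closely; a low
# best-correlation score suggests dubbed or spoofed audio.

def correlation(a, b):
    # Pearson correlation of two equal-length numeric series.
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    var_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    return cov / (var_a * var_b) if var_a and var_b else 0.0

def sync_score(mouth_openness, audio_envelope, max_lag=5):
    # Best correlation over small frame offsets, to tolerate normal
    # capture latency between the video and audio streams.
    best = -1.0
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a = mouth_openness[lag:]
            b = audio_envelope[:len(audio_envelope) - lag]
        else:
            a = mouth_openness[:lag]
            b = audio_envelope[-lag:]
        best = max(best, correlation(a, b))
    return best
```

In use, a score near 1.0 means the mouth motion and audio rise and fall together, while a low score (say, below 0.5) would warrant the kind of closer scrutiny the FBI recommends. This is a sketch of the idea, not a production detector; real tooling would work on landmark geometry and spectral audio features.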

The FBI asked victims of PII theft, as well as companies that have received deepfakes during the interview process, to report this activity via the IC3 platform and include any information (IP or email addresses, phone numbers, names used, etc.) that would help identify the criminals behind these actions.

FBI Warning June 2022: https://www.ic3.gov/Media/Y2022/PSA220628
FBI Warning March 2021: https://www.ic3.gov/Media/News/2021/210310-2.pdf
Europol warning April 2022: https://www.europol.europa.eu/media-press/newsroom/news/europol-report-finds-deepfake-technology-could-become-staple-tool-for-organised-crime
