Remote Work Recruiting May Invite Fakes

In a column he writes for MarketWatch, Jurica Dujmovic suggests remote work has opened the door to people pretending to be someone they are not in order to get a job.

In his piece, Dujmovic asks readers to imagine being an inexperienced techie who wants to apply for a job at a company. A roommate or a friend already has a job and a much better grasp of job interviews, so the applicant gets the friend to do the interview instead.

A “deepfake” is a type of synthetic media in which the face of a person in an existing image or video is replaced with someone else’s likeness, he writes. The technology was developed in the 1990s, but it became popular thanks to amateur online communities.

Deepfakes can range from funny and quirky to sinister and dangerous: videos depicting political statements that were never made or events that never took place. In today’s remote/hybrid world, however, they could be used by someone pretending to be someone else to get a job.

“Now, you may be wondering why anyone would do that — it seems like too much hassle and a risky and stressful approach to getting an actual job. Unless … the reason to get a particular job isn’t to acquire gainful employment, but instead to gain access to the company’s infrastructure so you could divulge sensitive information,” Dujmovic writes. “In that case, you wouldn’t be a camera-shy introvert, but rather a tech-savvy social engineer/hacker, posing as someone else — someone qualified.”

He cited an FBI alert published June 28 indicating that the number of cases in which malicious actors successfully applied for work-at-home positions using stolen personally identifiable information (PII) is on the rise. This data was likely acquired by hacking victims’ accounts or even companies’ HR databases.

The FBI warns that hackers use voice spoofing (imitating another person’s voice by using digital-manipulation tools) and publicly available deepfake tools (DeepFaceLab, DeepFaceLive, FaceSwap, et al.) to fool unwary interviewers.

According to the FBI report, one way to recognize something is amiss is to pay attention to the “actions and lip movement of the person seen interviewed on-camera,” which “do not completely coordinate with the audio of the person speaking.”

However, this isn’t a foolproof way of detecting deepfakes. A growing number of apps enable seamless, real-time integration into video calls, which results in higher-quality lip syncing.

Aside from imperfections in lip syncing, other giveaway clues include facial discoloration, unnatural blinking patterns, blurring, weird digital background noise and a difference in sharpness and video quality between the face and the rest of the video (i.e., the face looks sharper and cleaner than the background image).
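
To make that last cue concrete, here is a minimal sketch of how an interviewer’s tooling might quantify it, in Python with OpenCV. Neither Dujmovic’s column nor the FBI alert prescribes this method: the variance-of-Laplacian sharpness measure, the Haar-cascade face detector, the interview.mp4 filename and the 2.0 threshold are all illustrative assumptions.

```python
import cv2
import numpy as np

def face_background_sharpness_ratio(frame):
    """Return the face-vs-background sharpness ratio for one video frame.

    Uses variance of the Laplacian as a crude sharpness measure; a ratio
    persistently well above 1.0 matches the "face looks sharper than the
    background" cue. Returns None if no face is detected.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = faces[0]  # take the first detected face
    lap = cv2.Laplacian(gray, cv2.CV_64F)  # edge response over the whole frame
    face_var = lap[y:y + h, x:x + w].var()
    mask = np.ones(gray.shape, dtype=bool)
    mask[y:y + h, x:x + w] = False  # exclude the face region from the background
    background_var = lap[mask].var()
    return face_var / max(background_var, 1e-6)

# Example: sample frames from a recorded interview and flag a suspicious ratio.
if __name__ == "__main__":
    capture = cv2.VideoCapture("interview.mp4")  # hypothetical recording
    ratios = []
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        ratio = face_background_sharpness_ratio(frame)
        if ratio is not None:
            ratios.append(ratio)
    capture.release()
    if ratios:
        median = sorted(ratios)[len(ratios) // 2]
        print(f"median face/background sharpness ratio: {median:.2f}")
        if median > 2.0:  # illustrative threshold, not an established value
            print("face is markedly sharper than its surroundings; inspect manually")
```

Webcam autofocus, video compression and portrait-style background blur can produce the same pattern, so a high ratio is a reason to look closer, not proof of a deepfake.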

Dujmovic cautions that identity theft is nothing new, and tech simply provides hackers with new tools to facilitate the process. He writes, “As always, the crucial step is training and educating staff to withstand the social engineering part of the attack, namely not to give access to vital company infrastructure to new, unvetted hires.”

Dujmovic also writes that while meeting the potential hire in person is always the best way to ascertain identity, it’s not always possible. In those cases, a few requests can mess with a possible deepfake algorithm: asking the interviewee to rotate their body (easily done on a regular office chair), place their face at an awkward angle in front of the camera, or place an open palm in front of their face and move the hand at varying speeds, alternately obscuring and revealing parts of the face between the fingers. These methods could trick some models into glitching, producing artifacts or blurring that would uncover a deepfake video.

As for audio, he suggests the interviewer watch out for odd phrasing, choppy sentences and weird tone inflections. Sometimes entire sentences are pre-synthesized, which can result in canned answers. In that case, watch for responses that seem out of context: if an interviewee does not answer a question, or answers it the same way multiple times, that could also be a red flag.

“In a job-seeking scenario, keep tabs on your communications and reach out via different channels to your potential employer, if possible,” Dujmovic writes. “A hacker won’t be able to cover all of them, and receiving conflicting information from both you and the hacker will raise HR’s suspicion about a possible imposter.”

He also writes that anyone who thinks they might be a victim of identity theft should reach out to local law enforcement and file a report.

 

Photo from Jurica Dujmovic’s piece in MarketWatch.