Job-seeking impostors, including deepfakes, are exploiting the remote work trend, defrauding U.S. companies and potentially threatening U.S. national security, according to experts.
Approximately 17% of hiring managers surveyed said they had encountered candidates using deepfake technology to alter their interviews, according to career platform Resume Genius.
It surveyed 1,000 hiring managers across the United States. By 2028, 1 in 4 job candidates worldwide will be fake, according to research and advisory firm Gartner.
"Deepfake candidates are infiltrating the job market at a crazy, unprecedented rate," said Vijay Balasubramaniyan, CEO of voice authentication startup Pindrop Security, who said he recently caught a deepfake job candidate.
"It's very, very simple right now" to create deepfakes for interviews, Balasubramaniyan said.
"All you need is either a static image" of another person and a few seconds of audio of their voice, he said.
"Remote jobs unlocked the possibility of tricking companies into hiring fake candidates," said Dawid Moczadlo, co-founder of data security software company Vidoc Security Lab, who recently posted a viral interaction with a deepfake job seeker on LinkedIn.
"If this trend continues and if we experience more and more fake candidates, then we definitely will need to develop some kind of tools to verify if the person is a real person, if they are who they claim to be," Moczadlo said.
While fraudulent job seekers can originate from anywhere, fake candidates with ties to North Korea have drawn significant headlines in recent months.
In May 2024, the Justice Department alleged that more than 300 U.S. companies had unknowingly hired impostors tied to North Korea for remote IT roles, generating at least $6.8 million in overseas revenue. The workers allegedly used stolen American identities to apply for remote jobs and employed virtual networks and other techniques to conceal their true locations.
"When we hire candidates or fake candidates who are from sanctioned nations, it becomes a national security concern," said Aarti Samani, an expert in AI deepfake fraud prevention.
"The reason it becomes a national security concern is because, once these candidates or these individuals are in an organization, they are taking that salary and funding activities back in those nations. And those activities can be illicit as well. So inadvertently, we are funding illicit activities in sanctioned nations."
As AI technology rapidly evolves, fake AI-generated job candidate profiles are undermining the credibility of the hiring process.
"The whole reason you need to worry about deepfake job seekers is, at the very least, they're making the real employees, potential employees and candidates not able to get the job or [get the] job as easily," said Roger Grimes, a veteran computer security consultant.
"It can create all kinds of disruption, just making the hiring process longer and more expensive."

"Potentially, you could even be applying for a job and someone's not sure whether you're real or not, and you don't even get that call, and you don't know why you didn't get the call," Grimes said. "It was all because perhaps they saw something that made them think that maybe you're a deepfake candidate, even when you weren't."
Watch the video above to learn how fake candidates can harm businesses and what steps can be taken to combat this issue.