When applying for jobs, Angel talks up her language skills. “I can speak fluent English, I can speak good Chinese, I also speak Russian and Turkish,” the glamorous, 24-year-old Uzbekistani woman explains in a selfie-style video made for recruiters. Angel had arrived in the Cambodian city of Sihanoukville that day, she said, and was ready to start work immediately.
Those impressive language skills, however, have likely been put to use as part of elaborate “pig-butchering” scams targeting Americans. That’s because, instead of applying for a conventional corporate job, Angel was putting herself forward to work as an “AI face model”—sitting in front of a computer all day and making deepfake video calls to manipulate potential scam victims. Her application, which also lists her height and weight, says she has already clocked up “1 year as an AI model.”
Angel is far from alone in this pursuit. A WIRED review of dozens of recruitment videos and job ads posted to Telegram shows people from around the world—including Turkey, Russia, Ukraine, Belarus, and multiple Asian countries—applying to be AI models or “real face” models in Cambodia and elsewhere in Southeast Asia. The region has become home to vast, industrialized scamming operations that hold thousands of human trafficking victims captive and force them to run online cryptocurrency investment and romance scams.
As well as tricking people into working in scam compounds, these high-tech, multibillion-dollar criminal enterprises can also draw willing applicants seeking “work” as part of the operations. “In the past year until today, they are also hiring people doing AI modeling,” says Hieu Minh Ngo, a cybercrime investigator at the Vietnamese scam-fighting nonprofit ChongLuaDao. “They will give you the software so they can swap their face by using AI and they can do romance scams,” he says.
Ngo, a reformed criminal hacker who now tracks scam compound activity and supports victims, identified around two dozen channels on Telegram that have some job postings for AI models in the region. Humanity Research Consultancy, an anti-human-trafficking organization, has also tracked people applying on Telegram for jobs in “known scam hub cities” as “models” and “AI models,” including Angel’s application.
The rise of AI models comes as cybercriminals are broadly adopting AI and using face-swapping as part of their online scamming. Typically, fraudsters will use fake personas to contact potential victims on social media or messaging platforms. They will often use stolen images of celebrities or attractive men or women to entice a person into talking to them.
Once they make contact, they will bombard the target with attention to build up a relationship before trying to get them to part with their cash. In some instances, multiple people may control a scammer’s account and message the victim under a single fake persona. But if a potential victim asks for a video call during these interactions—to check whether the person they are speaking to is real, for instance—that’s when deepfake video calls and models who have their faces swapped can be used. Some Southeast Asian scam centers have dedicated “AI rooms” from which the calls are made.
Job advertisements for AI models or “real models” reviewed by WIRED demand excessive working hours, offer little free time, and require a relentless schedule. The ads are usually posted by a channel administrator and don’t include contact details or list who someone would specifically be working for. One recruitment post for an alleged six-month contract says the person will need to send photos daily, make video and voice calls, and create audio and video messages. “Approximately 100 video calls per day,” the post says.