
AI “Companion Bots” Actually Run by Exploited Kenyans, Worker Claims


If you’re one of the 28 percent of Americans who’ve had an intimate relationship with an AI chatbot, we might have some bad news.

Freshly reported testimony from the Data Workers’ Inquiry — an international research initiative empowering gig workers to document their industries — reveals stunning details about one of the fastest-growing consumer niches in the AI sector.

In an account written for the initiative, a Kenyan man named Michael Geoffrey Asia describes how, during a period of desperation when he struggled to find a job in global aviation, the field he’d trained for, he was introduced to the world of data labeling and chat moderation. In Asia’s case, the “chats” turned out to be “romantic and intimate conversations on platforms I’d never heard of.”

Though it was far from what he had planned to do after graduating from aviation school, he took a job as a “text chat operator” with the Australian firm New Media Services in order to feed his family. He made his home, he writes, in the Mathare slums of Nairobi, where it was all he could do to keep a roof over their heads.

“What I didn’t know was that the role would require me to assume multiple fabricated identities, and use pseudo profiles created by the company to engage in intimate and explicit conversations with lonely men and women,” Asia writes.

To do the job, Asia had to assume various identities, taking on lengthy backstories in order to play the role of “chatbot” for someone on the other side of the world. “Sometimes I would be assigned a conversation that had been ongoing for several days and had to continue it smoothly so the user wouldn’t realize the person responding had changed,” he writes.

On any given workday, Asia would assume “three to five different personas” simultaneously, of varying genders. He was paid a flat rate of $0.05 per message, with each message required to meet a minimum character count. He also had to type at least 40 words per minute and keep up with a dashboard displaying the total number of messages sent.

“Falling behind on metrics could lead to warnings, reduced assignments, or termination,” Asia explains.

The work was emotionally exhausting: chat users confided intimate details about their real-life relationships, as well as their own emotional trauma, falsely believing they were talking to an unfeeling AI chatbot.

“My faith taught me that love should be real, intimacy sacred, and that deception was destructive to both the liar and the deceived,” Asia writes. “Yet here I was, professionally deceiving vulnerable people who were genuinely looking for connection — taking their money, their trust, their hope, and giving them nothing real in return.”
