“Little did they know that I had just told another man, ‘I love you.’”
That line, from Michael Geoffrey Asia, lands like a small, sharp revelation: an intimate confession typed from Mathare, a Nairobi slum, not a data center. Asia’s testimony — collected by the Data Workers’ Inquiry and reported by outlets including Futurism — pulls back the curtain on a business model many users assume is pure algorithm: AI companion chat services that are, in some cases, humans in disguise.
A confession from Nairobi
Asia was trained in global aviation. He told his family he worked remotely in IT. In truth, he spent long days switching among three to five fabricated personas, sustaining ongoing flirtations and confessions with lonely users across the globe. Paid roughly $0.05 per message and required to hit character-count minimums and a typing speed of about 40 words per minute, he says his output was tracked on dashboards and enforced through quotas. Fall behind, and you risk warnings or termination.
His experience isn’t unique. Reports from Rest of World show that Kenya — long part of the global data-labeling pipeline — now feeds many kinds of AI projects, from video annotation for overseas companies to the intimate labor of chatbot conversation. Those stories paint a portrait of young, tech-savvy workers who find themselves inside WhatsApp-run “digital factory floors,” recruited via forms and paid through mobile money, with scant contracts or protections.
How the illusion is built
There are a few pieces to the puzzle: platforms that sell companionship or flirtatious conversation; middlemen and subcontractors who staff those platforms; and low-paid workers who follow scripts, juggle multiple chats, and hide their humanity behind a thin veil of bot-speak.
For users, the product promise is simple: an always-available confidant, an algorithm that can be honest, predictable, and emotionally available without messy consequences. For operators, the promise is cheaper scale — and for many companies that means substituting low-cost human labor for expensive engineering or for nascent, unreliable models.
Rest of World and other investigations document similar dynamics across tasks: highly repetitive, accuracy-driven work done for a few dollars a day; project teams that track output and accuracy in real time; and opaque supply chains that route work through layers of subcontractors so the end client is often invisible.
Why companies look the other way
Cheap labor is a blunt tool for solving a technical problem: current generative models still misfire on nuance, context, or safe behavior. Human operators patch these gaps, producing the “emotion” users crave while keeping costs low. The business calculus is straightforward: a steady stream of paying users funds the operation; paying humans a few cents per message dramatically lowers overhead compared with building truly robust, safe AI.
This trade-off also explains regulatory blind spots. The industry talks about autonomy and synthetic agents even as human-in-the-loop labor supplies the perceived intelligence. That disconnect fuels uneasy questions: if a user shares trauma with what they think is a bot, is that data being handled by people? Who is responsible for privacy breaches or emotional harm?
The tension between the idea of machine intelligence and the reality of human work sits at the same crossroads as wider debates about AI capability. The public conversation over whether models are approaching human-level reasoning is noisy and contested. At the same time, some products are already adding more autonomous features — think agentic booking or automated assistants — which could, in theory, reduce reliance on hidden human labor as they mature. But right now, many services patch those gaps with people.
The human costs
Workers describe emotional strain from impersonating intimacy: absorbing confessions, moderating explicit requests, and performing warmth on demand. There’s moral friction, too. Asia writes about a conflict between personal faith and professional deception. Other annotators — whether labeling trauma-laden videos for model training or keeping up with impossible daily quotas — report exhaustion, precarious pay, and sudden contract terminations.
Beyond mental health, there’s a privacy angle. When the “bot” is a person, intimate data passes through a human filter. That raises questions about consent, data handling, and the real chain of custody for sensitive user material.
Patching policy and practice
Kenyan officials and unions have started pushing back. Rest of World reports that local labor bodies are calling this “digital colonialism,” and Kenya is working on regulatory frameworks to clarify employer liability and worker protections. Globally, attorneys general and consumer advocates are increasingly interested in transparency about what’s automated and what’s human-operated.
There are practical steps platforms can take today: disclose when a user is chatting with a human versus a machine; enforce fair pay and clear contracts for contractors and subcontractors; provide mental-health services for workers exposed to trauma or emotional labor; and build auditing trails so users know where their data goes.
A world of friction and future choices
The story of AI companionship isn’t just a technical one — it’s a social and economic one. As firms rush to monetize intimacy, they’re betting that consumers will accept a simulation if it’s consistent and cheap. Meanwhile, the people producing that consistency are often paid pennies, monitored by dashboards, and rendered invisible by nondisclosure agreements.
Michael Asia’s confession ends without easy closure. He kept the job to feed his family, hid it from those closest to him, and carried the psychological weight of scripted love. That tension — between the market for connection and the labor that produces it — is where policy, design, and consumer expectations will meet. How that meeting goes will determine whether the next wave of AI companionship feels like an upgrade for everyone or merely a better mask for old inequalities.