While AI technology advances rapidly, hundreds of young people toil in the shadows of a digital factory in Nairobi, Kenya, filtering and labeling content day and night to train Silicon Valley's AI models for a mere HK$12 an hour, a wage that exacts a heavy toll on their mental well-being.
Beyond the repetitive nature of the work, these labelers must identify and filter the most abhorrent and inappropriate content online, including images and videos of massacres, abuse, and suicide, prompting many to report mental health issues and strain in their family lives.
To make AI systems more accurate, large teams of workers are hired to sort, label, and sift reams of data that "teach" the algorithms to recognize different items, a process known as keeping a "human in the loop".
"My job is to teach AI to think and act like humans," said Naftali Wambalo, a data labeler hired by the outsourcing company SAMA in Nairobi.
With a bachelor's degree in mathematics, Wambalo initially hoped to pursue a career in teaching or academic research, but the saturated local job market drove him into the AI training industry.
Working eight hours a day, Wambalo and his co-workers are required to classify and label massive amounts of data to train the AI models for companies like Meta, OpenAI, Microsoft, and Google.
Wambalo revealed with resignation that his tasks include filtering pornographic, hateful, and excessively violent content on social media platforms, exposing him to the worst of the internet every day.
Despite claims from the outsourcing company that they provide psychological counseling, Wambalo pointed out the insufficient support for the employees, calling for qualified mental health professionals.
After repeatedly encountering sexual and pornographic content, Wambalo has developed a sense of disgust, which has significantly affected his marriage.
Another data labeler shared that he was misled into expecting translation work but ended up reviewing graphic content involving dismemberment and drone strikes, leading to depression and communication difficulties.
Many tech giants outsource labeling work to countries with lower wages and higher unemployment rates, such as Kenya, India, the Philippines, and Venezuela, to cut operational costs. The practice creates jobs but also raises ethical concerns.
For instance, it is reported that SAMA charges OpenAI around HK$100 per hour for each annotator, but the actual payment to employees is only about HK$12.
According to 2023 data, Kenya's unemployment rate is approximately 6 percent, with a staggering 67 percent unemployment rate among youth aged 15 to 34.
To promote economic growth through technology development, the Kenyan government has encouraged outsourcing companies like CloudFactory and SAMA to establish businesses in the country.
However, critics argue that hosting such outsourcing firms will merely turn Kenya, once dubbed the "Silicon Savannah", into a data-labeling distribution center.
Although the country has labor laws, some labor unions point to weak regulations that leave workers exposed to prolonged mental stress in a harmful working environment without effective support.
Other critics suggest that this phenomenon not only harms the labor market but also represents a form of "digital colonialism," where multinational tech companies profit from technology while exploiting Kenyan data workers as "data slaves" who are unable to share in the benefits of technological advancement.
(Phoebe Poon)