Hum of the Machine: An enquiry into the human cost of AI
Part 3 – Mechanisms of Invisibility: How Data Workers Are Erased
This is Part 3 of our ongoing enquiry. To trace our journey,
read Part 1: Tuning into the AI Hum and Part 2: The Data Sweatshop
Every day, our world becomes increasingly integrated with AI. It influences nearly every part of our lives, from work and education to music, art, and entertainment. The excitement and chatter about what AI is and all that it can do are so loud that they often drown out the sound of the enormous human effort keeping it running—like the steady hum of a machine you only notice once it stops.
But if you do listen, that hum can reveal a lot about the machine.
We at Fold Labs have been listening with a keen ear. Join us on our journey to uncover the human stories behind the seemingly superhuman AI.
The invisibility of data work and data workers has historical roots, echoing earlier critiques of industrial exploitation in global supply chains. The emergence of platforms like Amazon Mechanical Turk has commodified data work, perpetuating the notion that such labour is low-skill and easily replaceable. This perception obscures the reality of demanding labour expectations and contributes to the ongoing exploitation of vulnerable workers, who often receive wages as low as $1.50 per hour. [1]
A conversation with James Oyange, a former content moderator turned advocate for Ethical AI and Digital Rights, revealed to us that the invisibility of data workers is not accidental—it’s the result of deliberate mechanisms and structures that obscure their labour. Below are key ways this invisibility is maintained:
Outsourcing Through Third-Party Contractors:
Big tech companies often hire data workers through third-party contractors in countries like Kenya, Uganda, Vietnam, the Philippines, and India. By outsourcing this labour, companies distance themselves from the exploitative conditions workers face and avoid direct accountability.

“I was doing content moderation for TikTok through an outsourced partner. A team from TikTok came for an audit. We told them of our plight and their reaction was ‘We cannot do anything. It is not our concern.’ Their only concern was to pay a certain sum of money per data worker. However, in reality only a fraction of that amount really reached us.”
Paraphrase from conversation with James Oyange
Targeting Marginalised Communities:
Most data workers are hired from economically disadvantaged regions in the Global South, where labour laws are weak and wages are low. These workers are more likely to accept poor working conditions out of financial desperation, and less likely to be aware of their rights or to have the means to fight for them.

Ambiguous Job Descriptions:
Many companies advertise roles like “customer service representative” or “data analyst” to recruit data workers, who only realise later that they will be moderating harmful content. This lack of transparency allows companies to hire without disclosing the true nature of the work.

“What we saw at the training was just the tip of the iceberg; it didn’t seem scary or gory at first. Once we started working on production was where we started seeing the actual videos, from child porn to human slaughter…”
Bonolo Melokuhle, 26, South Africa [2]
Nondisclosure Agreements (NDAs):
Typically, data workers are bound by NDAs that prohibit them from speaking out about the nature of their work, even with friends and family. These legal agreements ensure that their voices remain unheard and their labour remains in the shadows.

“They make us sign an NDA to not even discuss our work with our family. We cannot write about it in our resumes or talk about it in future interviews. How do they expect us to find employment after we stop working for them?”
James Oyange
“The fact that I couldn’t share what I watch every day with my family or anyone, and not even talk to the counsellors, knowing that they won’t keep my confidentiality, made things worse for my mental health.”
Lwam Kidanemariam, 29, content moderator for videos from war in Tigray [2]
Surveillance and Exploitative Conditions:
Data workers are often placed under strict surveillance. Their breaks are timed, and they are monitored closely to ensure maximum productivity. Workers who raise concerns about their working conditions are often blacklisted, making it difficult to find employment elsewhere.

“We have to action every ticket within 50 seconds. If we delay, our statistics go down, which affects our average handling time (AHT) and thus our production quality stats.
When quality drops below 90% accuracy, workers don’t get their incentive - a taxable sum of 100 USD once every three months. To earn the incentive, workers have to work hard for it; they cannot flinch, and they can’t even move from their chairs to fetch water or go to the bathroom. For that, they have to use the one-hour break which they also use for breakfast and lunch.”
Ava Theodore, 26, Kenyan content moderator [2]
Tech-Speak and Dehumanising Language:
The vocabulary used to explain tech is, well, tech-centric. When we say “AI learns,” “AI recognises,” or “AI moderates content,” we erase the human element involved in these processes. Yet AI does not function autonomously; it relies on human labour. This dehumanising language removes data workers from the equation, perpetuating their invisibility.
The more I probe, the more I feel that any issue, if you spend long enough with it, even a new-age, tech-age problem set in a world that has never existed before, somehow turns out to be the same. As old as time. As old as thinking, “evolved” humans at least. They all seem to be stories of gathering and hoarding, of power and money.
Data Colonialism: Exploiting Labour in the Global South
The term Data Colonialism, coined by Nick Couldry and Ulises Mejias, describes the practice of tech companies extracting data and labour from marginalised populations in the Global South. These workers are employed at a fraction of the cost of hiring someone in the Global North, and they are subjected to poor working conditions.
Data Colonialism reflects the dynamics of historical colonialism. Just as raw materials were extracted from colonised nations, today’s tech giants extract data and labour from developing countries, reaping massive profits while giving little in return.
This colonisation, like traditional colonisation, is perpetuated by the complicity of local leaders. In many cases, corrupt ministers and government officials in these regions are in secret or overt partnerships with outsourcing companies, enabling the exploitation of their own people. Some officials have even established companies specifically designed to supply cheap labour to Big Tech, benefiting financially from the exploitation of their citizens. These corrupt arrangements betray the very populations these governments are supposed to protect, leaving data workers trapped in cycles of poverty and abuse with little hope of legal redress or labour protections.
The Fight for Visibility and Rights
The odds are stacked against data workers. The accepted system of labour and production, their own need for employment, the power of the exploiting companies, and the lack of support from their own governments: they have a lot working against them. In James’s words: “It’s like fighting a three-headed dragon with a 15-inch sword.”
Despite these challenges, data workers are starting to organise. In Kenya, for instance, James and other content moderators are forming a union and seeking official registration. These workers are pushing for better wages, improved working conditions, and recognition of their crucial role in the tech industry.
Organisations like Turkopticon and the Distributed AI Research Institute (DAIR) are also advocating for the rights of data workers, working to make their labour visible and hold tech companies accountable for the conditions they create.
“This work is crucial. There can be no online safety, no child safety without content moderators. We see ourselves as the first line of defence. And now that I think of it, I am willing to be a soldier and fight in this war. Even at the cost of my mental health. Because it is important work. But we deserve better support, better wages. We deserve to be seen.”
Paraphrased from conversation with James Oyange
As humans, we have a natural tendency to relate everything to ourselves. This knowledge about data work and data workers has me thinking - What do I do about this? How can I change it, help it? Can I? Is it even my place? To add my voice, my opinion, my analysis to this? Is it anyone's?
Sometimes, maybe it is enough to just listen — to be an active listener. To look beyond ourselves, to share stories, and to amplify voices. The simple act of bearing witness to someone else's reality can bring light to dark corners of their struggle and invite others to join the fight.
P.S. We are grateful to James (Mojez) Oyange for providing an authentic peek into the world of data work.
Thank you, James, for taking the time to speak with us, for encouraging us to write about data work, and for reviewing this essay. This article wouldn’t have been possible without you.
Citations
[1] “Data workers - The working conditions and importance of the people behind AI”, Weizenbaum Institute. https://www.weizenbaum-institut.de/en/news/detail/datenarbeiterinnen-die-arbeitsbedingungen-und-bedeutung-der-menschen-hinter-ki/
[2] Gebrekidan, F. B. (2024). “Content moderation: The harrowing, traumatizing job that left many African data workers with mental health issues and drug dependency”. Edited by Milagros Miceli, Adio Dinika, Krystal Kauffman, Camilla Salim Wagner, and Laurenz Sachenbacher. Creative Commons BY 4.0. Retrieved from https://data-workers.org/fasica/