African Data Labour Rights: Gig Workers, Military AI, and the Informed Consent Gap

Tens of thousands of African gig workers are annotating AI training data — including military AI systems — without knowing it. The informed consent gap is Africa’s most overlooked AI governance problem.

Hassan worked at his computer in Nairobi, transcribing audio files in Somali, one clip at a time. The pay was modest — less than two dollars an hour — but the work was steady, and the task seemed simple enough: listen, transcribe, submit. Nobody told him who was on the other end of the pipeline. Nobody told him the audio he was processing would help train AI systems used by the United States military. “They were secretive about the ultimate goal,” he told The Bureau of Investigative Journalism in February 2026. “They never share like that.”

Hassan is one of hundreds, possibly thousands, of African gig workers who performed data-labelling tasks for Appen — an Australian AI data company — without knowing their labour was feeding into US defence applications. A joint investigation by TBIJ and Rest of World, published last month, traced $17 million in Appen military contracts back to 2005, including $145,000 for work linked to the Rivet Joint signals intelligence aircraft and $287,500 for a “tactical language interpreter” programme between 2015 and 2017. Workers in Kenya, including Somali speakers recruited from Kakuma refugee camp in the country’s northwest, were at the centre of those supply chains.

BETAR.africa can now report that the regulatory gap enabling this practice is wider than the TBIJ investigation documented — and that neither Kenya’s Ministry of Labour nor Nigeria’s statutory labour bodies have moved to address it.

The Architecture of Opacity

Appen’s business model depends on scale and deniability. The platform recruits workers through app-based systems that anonymise project manager identities and disclose only the technical parameters of each task. Workers see what they must do, not why it is being done, who commissioned it, or how it will ultimately be used. That opacity is not incidental. It is structural.

Ismail, recruited through one of Appen’s subcontractor networks to transcribe audio from the Kakuma camp, told investigators he “could not come to a direct conclusion” about the purpose of the work. Even workers who managed small teams — like Will, a team lead on a Somali transcription project — were given limited visibility. Will described the work as feeling “useful and helpful” but said there was one project that made him feel “part of a war somewhere.” He did not have the information to confirm or deny it.

Joan Kinyua, president of Kenya’s Data Labellers Association, said the lack of transparency is the defining feature of how foreign AI companies operate in the country. “I feel like it would be very important if companies just disclose what the purpose is,” she said. The association, founded in 2025, is one of the few organised bodies advocating for the roughly half a million Kenyans who have performed some form of gig-based digital labour, a sector that has expanded rapidly as formal employment has stagnated.

What the Law Says — and Doesn’t Protect

Kenya’s Data Protection Act, enacted in 2019, comes closer than most African legislation to addressing the consent gap. Section 30 defines valid consent as freely given, specific, informed, and unambiguous — and Section 30(b) requires that consent be tied to a clearly specified purpose. Section 33 goes further, requiring separate and explicit consent for the processing of sensitive personal data, a category that includes biometric and voice recordings. On those grounds alone, Appen’s practice of describing audio transcription tasks as “AI training” without disclosing military downstream use is a plausible violation of the law governing every Kenyan worker it contracted.

The problem is that those protections are designed for data subjects — people whose data is being collected — not for data workers producing that data as a form of labour. The gig worker who transcribes your voice is not the subject of protection; she is the instrument of it. That distinction is not a technicality. It is the legal wall behind which platforms like Appen operate.

The gap is compounded by how platforms classify their workforce. By treating workers as independent contractors rather than employees, they place them outside Kenya’s Employment Act, which requires written contracts, explanations of terms in a language the worker understands, and disclosure of the nature of work performed. An independent contractor working through an app interface holds none of those guaranteed rights.

Researchers at the Centre for Intellectual Property and Information Technology Law (CIPIT) at Strathmore University in Nairobi have documented this structural exclusion in detail. In a 2025 report on AI and the African gig economy, Dr. Isaac Rutenberg’s team found that existing employment law in both Kenya and Nigeria is simply not designed for data annotators and platform workers, leaving them in a regulatory grey zone where labour protections are theoretical rather than enforceable.

Nigeria’s situation follows a similar pattern, with one important recent development. The Nigeria Data Protection Act 2023 introduced a purpose limitation principle: Sections 2.2 and 2.3 require that processing purposes be clearly specified at the point of data collection, and that fresh consent must be obtained for any subsequent change of purpose. In theory, if Appen collected audio data from Nigerian workers for “AI training” and then licensed it to a US defence contractor, that re-purposing requires fresh consent. In practice, the Nigeria Data Protection Commission has no enforcement precedent on cross-border data sub-licensing to foreign militaries — this is entirely uncharted regulatory territory.

Neither Kenya’s Office of the Data Protection Commissioner (ODPC) nor Nigeria’s NDPC has any visibility into how training data leaves the continent after collection. The data moves through a chain of intermediaries, sub-contractors, and licensing agreements that is opaque by design. By the time Somali audio files reach a US Air Force AI system, they have passed through so many contractual layers that no single African regulator can trace them.

BETAR.africa contacted the Nigeria Labour Congress (NLC) and Kenya’s Ministry of Labour and Social Protection for comment on informed consent obligations for AI data labelling contracts. Neither had responded before publication. This report will be updated on receipt of any statement.

The Human Cost, Beyond the Data

The informed consent failure sits inside a broader pattern of labour exploitation that researchers have been documenting since 2015. African data workers earn an average of less than $2 an hour, a fraction of the $20 or more their counterparts in the United States receive for equivalent tasks. A 2025 Equidem survey of 76 gig workers across Ghana, Kenya, and Colombia found 60 documented incidents of psychological harm, including anxiety, depression, and post-traumatic stress disorder, harms the platforms that generate them have no incentive to acknowledge.

In March 2025, a Nigerian content moderator, Ladi Anzaki Olubunmi, was found dead in her apartment in Nairobi. The circumstances of her death are not fully established, but her case prompted the Kenyan Union of Gig Workers to call for a formal investigation into working conditions in the sector. Her name has largely disappeared from mainstream coverage. It deserves to be remembered in any accounting of what this industry costs.

For the workers in Kakuma who processed Somali audio under Appen contracts, the harm is not only material. It is the harm of having contributed, unknowingly, to systems used in conflicts that have killed civilians in Somalia — a country that remains home for many of them in exile. US military operations in Somalia since 2007 have resulted in documented civilian casualties numbering in the dozens to hundreds, depending on the source and period. Whether Appen’s language data directly trained targeting systems or supported broader surveillance infrastructure, the workers who produced it were not given the chance to make an informed choice about that contribution.

What Accountability Should Look Like

The informed consent gap in Africa’s AI data supply chain will not close through platform goodwill. It requires regulatory action at three levels that currently operate in ignorance of one another.

At the national level, Kenya’s ODPC and Nigeria’s NDPC need to issue explicit guidance on AI training data sub-licensing — clarifying that purpose limitation requirements apply to data collected from workers as well as from data subjects, and that cross-border re-licensing to military or defence clients triggers fresh consent obligations regardless of contractual structure. Both commissioners have spoken on AI and platform worker rights in principle; what is missing is enforcement guidance specific to the supply chain Appen represents.

At the continental level, the African Union’s Data Policy Framework, adopted in 2022, contains Article 9 — a provision explicitly addressing data worker rights, calling for disclosure and compensation. That article is currently policy, not binding law. Elevating it to a binding continental standard, with enforcement mechanisms through national data protection authorities, would create the first legal architecture in the world specifically designed for the informed consent rights of AI training data workers. Nigeria’s digital rights organisation Paradigm Initiative has advocated precisely this: binding continental standards as the only lever powerful enough to discipline platforms that operate across multiple jurisdictions at scale.

At the platform level, CIPIT has outlined the minimum viable transparency standard: mandatory disclosure of the ultimate data buyer and end-use application before any work commences; an explicit prohibition on defence or military-linked contracts without worker consent; and a public register of all sub-licensing agreements on data collected from African workers — a transparency mechanism analogous to beneficial ownership registers in corporate governance.

Africa’s governments have spent the last decade competing to attract digital labour platforms as a source of employment and foreign exchange. That competition has produced a race to the bottom in worker protections. The Appen story is not an anomaly. It is the predictable outcome of a system in which disclosure is optional, regulators lack cross-border visibility, and the workers at the end of the chain are structurally invisible.

Hassan did not know he was building military AI. He deserved to know. So does every worker who comes after him.


Appen was contacted for comment and given the right of reply ahead of publication. This report will be updated upon receipt of any response. — Research Desk, BETAR.africa

Sources cited in this report include investigations by The Bureau of Investigative Journalism and Rest of World (February 2026), CIPIT/Strathmore University’s 2025 report on AI and the African Gig Economy, the 2025 Equidem survey of gig workers in Ghana, Kenya, and Colombia, and public statements by the Data Labellers Association of Kenya. Primary reporting by BETAR.africa Research Desk.
