In 2025, the world’s largest social media firm, Meta, has taken a defiant new tone on the question of whether, and to what extent, it accepts responsibility for the real-world harm its platforms enable.
This has been widely understood as a gambit to curry favour with President Donald Trump’s administration, and Meta CEO and founder Mark Zuckerberg all but said so in a January 7 video announcing the end of third-party fact-checking.
“We’re going to work with President Trump to push back on governments around the world, going after American companies and pushing to censor more,” Zuckerberg said, giving his product decisions a distinct geopolitical flavour.
To justify the company’s decisions to do away with fact-checking and scale back content moderation on its platforms, Zuckerberg and Meta have appealed to the United States’ constitutional protection of the right to freedom of expression. Fortunately, for those of us living in the countries Meta has vowed to “push back on”, we have constitutions, too.
In Kenya, for example, where I represent a group of former Meta content moderators in a class-action lawsuit against the company, the post-independence constitution differs from those in the US and Western Europe in its explicit prioritisation of fundamental human rights and freedoms. The constitutions of a great many nations with colonial histories share this in common, a response to how those rights were violated when their peoples were first pressed into the global economy.
We are now beginning to see how these constitutions can be brought to bear on the global technology industry. In a landmark decision last September, the Kenyan Court of Appeal ruled that content moderators could bring their human rights violations case against Meta in the country’s labour courts.
Few in the West will have understood the significance of this ruling. Meta, for its part, surely does, which is why it fought against it tooth and nail in court and continues to use every diplomatic tool at its disposal to resist the content moderators’ demands for redress. Meta has shown interest in appealing this decision to the Supreme Court.
Meta and other major US companies maintain convoluted corporate structures to avoid exposure to taxes and regulation in the dozens of countries where they do business. They sometimes claim not to operate in countries where they count millions of users and employ hundreds to refine their products. Until now, these claims have rarely been challenged in court.
The case the content moderators have presented in court is that they were employed by a business process outsourcing (BPO) company called Sama and put to work exclusively as content moderators on Facebook, Instagram, WhatsApp and Messenger during the period from 2019 to 2023, when much of the moderation of African content on these platforms was carried out in Nairobi. Meta disavows these workers and insists they were employed solely by Sama, an issue currently being litigated before the courts in Kenya.
These workers know that Meta’s apparent reversal on content moderation is anything but. As presented in their grievances to the court, the company has never taken the problem seriously. Not seriously enough to stop the civil and ethnic conflicts, political violence, and mob attacks against marginalised communities that thrive on its platforms. Not seriously enough to pay fair wages to the people tasked with making sure it doesn’t. The harm travels both ways: Toxic content inflames real-world horrors, and those horrors engender more toxic content that saturates the platforms.
Content moderators are digital cannon fodder for Meta in a war against harmful content that the company was never really committed to fighting. The case presented by the Nairobi content moderators explains how they accepted jobs they thought would involve call centre and translation work. Instead, they ended up in Meta’s content moderation hub in Nairobi, where they spent their days subjected to an endless torrent of streamed violence and abuse.
Many of them were forced to view atrocities committed in their home countries in order to protect Meta’s users from the harms of seeing these images. They absorbed that trauma so others in their communities did not have to, and many found this to be a noble calling.
But this work took its toll on their mental health. More than 140 former content moderators have been diagnosed with PTSD, depression, or anxiety arising from their time on the job. A separate case addresses how their efforts to unionise and advocate for better mental healthcare were thwarted. What followed were mass layoffs and the relocation of Facebook content moderation elsewhere.
This left behind hundreds of trauma-affected people and a trail of human rights violations. Meta argues that it never employed the Facebook content moderators and bore no responsibility towards them. The litigation is ongoing, and the moderators now rely on the courts to untangle the complexities of their employment arrangements.
While fighting the case in court, in March 2024, the company sent a delegation led by its then-president of global affairs, Nick Clegg – a former British deputy prime minister – to meet with Kenyan President William Ruto and legislators to discuss, among other topics, the company’s vision of partnership with the government in bringing the “generative AI revolution” to the continent. At a townhall event in December, Ruto assured Sama, Meta’s former content moderation partner: “Now we have changed the law, so no one can ever take you to court again on any matter,” referring to a bill passed in Kenya’s parliament that shields Big Tech companies from future cases such as ours.
All this pushback occurred well before Trump was re-elected, and these efforts appear to be attempts to evade accountability for the company’s labour practices and the consequences of its products. But something remarkable happened, which opens a door for others around the world who labour on behalf of the tech industry but whom the industry itself disavows: The court ruled that our case can proceed to trial.
The fact that the case has advanced despite vigorous legal and political challenges is a testament to the revolutionary nature of post-colonial constitutions, which prioritise human rights above all else.
As our case in Kenya continues, I hope it can offer inspiration to tech workers in other post-colonial nations that they, too, can pursue accountability in the countries where they have been harmed. The right to freedom of expression is an important human right, but we will continue to remind Big Tech that equally important are the right to dignity and freedom from exploitation.
The views expressed in this article are the author’s own and do not necessarily reflect Al Jazeera’s editorial stance.