Meta ends relationship with controversial African moderation partner

On Tuesday, Facebook confirmed it would end its contract with Sama, an outsourcing company responsible for moderating graphic content in East Africa.

The news, first reported by Time magazine, comes amid a lawsuit filed by a former content moderator employed by Sama in Nairobi, Kenya, alleging severe psychological trauma as a result of the work, along with other labor law violations.

According to Foxglove Legal, a legal nonprofit that investigates Big Tech firms, the $2.2 billion European outsourcing firm Majorel will take over the contract. But Foxglove said Majorel treats its moderators no better than Sama does, a point underscored by Insider's reporting last summer on the appalling treatment of Majorel content moderators working for TikTok's parent company ByteDance in Morocco.

It’s a sign of how troubled the content moderation industry remains. Despite a string of such lawsuits against social media companies including Facebook, YouTube, TikTok, and Reddit around the world, employees are still routinely required to work hours-long shifts moderating some of the most violent content on the internet, from child abuse material to videos of gruesome accidents and beheadings, often with little protection for their mental health.

In August, Insider investigated the working conditions of TikTok content moderators working for Majorel in Morocco, the center of ByteDance’s content moderation operations in the Middle East and North Africa. Employees told us they often work 12-hour shifts reviewing videos of animal abuse, sexual violence, and other gruesome content. They had fewer breaks than their US counterparts and said the company’s “health advisors” were of little help.

Social media companies claim to use sophisticated algorithms to keep people’s feeds clean, but that belies the grim reality of how nearly every one of them operates. Behind the scenes, a global workforce of tens of thousands of people filters out disturbing content so that you never have to see it.

In recent years, Facebook has settled lawsuits with moderators who developed post-traumatic stress disorder as a result of their work for the company and has promised to improve working conditions. But as Vittoria Elliott and I reported in 2020, those relatively modest concessions rarely reach workers in India, the Philippines, or Kenya.

Advocates like Foxglove Legal are calling on social media companies like Meta to bring their global content moderation workforce in-house. That may be the only way to ensure that the worst elements of social media are handled by people who are truly accountable to the company and its users. Until then, contractors like Sama, Majorel, and dozens of other outsourcing companies will keep paying the price.
