Sama, Meta’s content moderation subcontractor in Africa, has shut down its Kenyan content moderation arm, ending its role in reviewing harmful graphic content on the platform.
The move affects 200 employees, about 3% of Sama’s workforce, who are being let go as the company exits content review services to focus on labelling work – computer vision data annotation, including positioning animations such as bunny ears in augmented reality filters.
Sama was first contracted by Meta in 2017, and its decision comes as Meta faces a lawsuit in Kenya over claims that failed safety measures on Facebook fuelled conflict.
Encouraging affected employees to apply for vacancies at its Kenyan and Ugandan offices, Sama said the current economic climate requires more efficient and streamlined business operations.
Meta says it has a new partner with similar moderation capabilities and that it respects Sama’s decision to exit the content review services it provides for Meta, adding that it will ensure there is no negative impact on its ability to review content.
Meta and Sama are also facing another lawsuit in Kenya, filed last year by Daniel Motaung, a South African and former Sama content moderator. The suit accuses both firms of forced labour and human trafficking, unfair labour practices, union busting, and failure to provide “adequate” mental health and psychosocial support.