Meta Oversight Board Urges Social Media Giant To Reconsider Blanket Ban On Arabic Word 'Shaheed': Here's Why
This move comes amid longstanding criticism of Meta's content moderation policies, particularly concerning the Middle East.
Meta's oversight board has urged the tech giant to reconsider its blanket ban on the Arabic term "shaheed," meaning "martyr" in English, Reuters reported. After a year-long review, the board concluded that Meta's approach was overly broad and unnecessarily stifled the speech of countless users.
The independent board, though funded by Meta, emphasised that the company should remove posts containing "shaheed" only when they are directly linked to indications of violence or when they violate other community guidelines.
Concerns That Meta's Rules Ignore The Word's Interpretations
Critics have accused Meta, particularly during the Israel-Hamas conflict, of suppressing content sympathetic to Palestinians. The latest ruling highlights concerns that Meta's rules on "shaheed" fail to account for the word's various interpretations, resulting in the removal of content unrelated to violence.
Helle Thorning-Schmidt, co-chair of the Oversight Board, stressed that while Meta aims to enhance safety through censorship, evidence suggests that such measures can inadvertently marginalise entire communities without improving safety outcomes. "Meta has been operating under the assumption that censorship can and will improve safety, but the evidence suggests that censorship can marginalize whole populations while not improving safety at all," Thorning-Schmidt said.
How Meta Treats 'Shaheed' Now
Currently, Meta automatically removes any post featuring "shaheed" in connection with individuals or groups it deems dangerous, including members of extremist organisations such as Hamas. However, Meta sought the oversight board's guidance after failing to reach an internal consensus on the matter.
As per Reuters, a Meta spokesperson stated that the company would review the board's recommendations and provide a response within 60 days, signalling a potential shift in its content moderation strategy regarding sensitive terms like "shaheed."