Facebook will now remove posts that lead to violence and physical harm

Currently, Facebook bans content that directly calls for violence, but the new policy will also cover fake news that has the potential to stir up physical harm, CNET reported late on Wednesday.

San Francisco: Accused of helping to spur violence in countries such as Myanmar, Sri Lanka and India, Facebook has said it will begin removing misinformation that leads to violence and physical harm. "There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down," Facebook said in a statement. "We will begin implementing the policy during the coming months," it added.

Facebook-owned WhatsApp is facing flak in India for allowing the circulation of a large number of irresponsible messages filled with rumours and provocations, which have led to growing instances of lynching of innocent people. In June, Facebook removed content alleging that Muslims in Sri Lanka were poisoning food given and sold to Buddhists.

In May, a coalition of activists from eight countries, including India and Myanmar, called on Facebook to put in place a transparent and consistent approach to content moderation. In a statement, the coalition demanded civil rights and political bias audits into Facebook's role in abetting human rights abuses, spreading misinformation and manipulating democratic processes in their respective countries. Besides India and Myanmar, the countries the activists represented were Bangladesh, Sri Lanka, Vietnam, the Philippines, Syria and Ethiopia.

The group's demands carried weight as Facebook came under fire for its failure to stop the deluge of hate-filled posts against the disenfranchised Rohingya Muslim minority in Myanmar. Sri Lanka temporarily shut down Facebook earlier in 2018 after hate speech spread on the company's apps resulted in mob violence.
According to The Verge, Facebook will review posts that are inaccurate or misleading and are created or shared with the intent of causing violence or physical harm. The posts will be reviewed in partnership with organisations in each country, including threat intelligence agencies. "Partners are asked to verify that the posts in question are false and could contribute to imminent violence or harm," Facebook said.



