How Facebook is Trying to Deal with Dangerous Content


Social media networks are working hard to remove content that violates their policies. However, coping with such posts is difficult due to the sheer scale of the problem. Facebook is trying to tackle this issue.

Recently, the social media giant released a new moderation transparency report. Let’s have a look at it. According to the information released by Facebook, it took down a large amount of content that violated its policies. The report shows a marked uptick in bullying and harassment enforcement, which reached a peak of 6.3 million takedowns in the last quarter of 2020.

That is an increase from 3.5 million pieces in the previous quarter and 2.8 million pieces in the fourth quarter of 2019. According to the company, much of the change is due to improvements in the automated systems that analyze Facebook and Instagram comments.

People should take into account that Facebook’s latest transparency report covers October to December 2020, a period that includes the U.S. presidential election. Importantly, during that time, the main Facebook network removed more harassment, organized hate, hate speech, and suicide and self-harm content. This is not the end of the story, as Instagram also saw significant jumps in bullying and self-harm removals.

According to Facebook, its numbers depend on two factors: more human capacity and improvements in artificial intelligence, especially for non-English posts.

Facebook and its plans

It is worth noting that the company also indicated it will lean on automation to address a growing amount of video and audio content on its platforms. Interestingly, Facebook pushed some moderation teams back into offices in early October. However, the company said in November that most moderators were working remotely. Importantly, the social media giant also said that some sensitive content cannot be reviewed from home.

Based on the information provided by the company, increased moderation helped Facebook, as well as Instagram, remove more suicide and self-injury content. Facebook removed 2.5 million pieces of violating content, compared to 1.3 million pieces in the preceding quarter. Moreover, Instagram removed 3.4 million pieces, up from 1.3 million.

Interestingly, that is comparable to pre-pandemic levels for Facebook. However, it is a significant increase for Instagram. Overall, the social media giant is trying to remove as much harmful content as possible.



