Sat, April 20, 2024

How Facebook is Trying to Deal with Dangerous Content

Social media networks are working hard to remove content that violates their policies. However, coping with such posts is difficult due to the sheer scale of the problem. Facebook, for its part, is trying to tackle this issue.

Recently, the social media giant released a new moderation transparency report. Let's have a look at it. According to the information released by Facebook, it took down a large amount of content that violated its policies. The report shows a marked uptick in bullying and harassment enforcement, which reached a peak of 6.3 million takedowns in the last quarter of 2020.

That is an increase from 3.5 million pieces in the previous quarter and 2.8 million pieces in the fourth quarter of 2019. According to the company, much of the change is due to improvements in the automated systems that analyze Facebook and Instagram comments.

People should take into account that Facebook's latest transparency report covers October to December 2020. As a reminder, this period includes the U.S. presidential election. During that time, the main Facebook network removed more harassment, organized hate, hate speech, and suicide and self-harm content. That is not the end of the story, as Instagram also saw significant jumps in bullying and self-harm removals.

According to Facebook, its numbers depend on two factors: more human capacity as well as improvements in artificial intelligence, especially for non-English posts.

Facebook and its plans

It is worth noting that the company also indicates it will lean on automation to address a growing amount of video and audio on its platforms. Interestingly, Facebook pushed some moderation teams back into offices in early October. However, the company said in November that most moderators worked remotely. Importantly, the social media giant also said that some sensitive content cannot be reviewed from home.

Based on the information provided by the company, increased moderation helped Facebook, as well as Instagram, remove more suicide and self-injury content. It is worth mentioning that Facebook removed 2.5 million pieces of violating content, compared to 1.3 million pieces in the preceding quarter. Moreover, Instagram removed 3.4 million pieces, up from 1.3 million.

Interestingly, that is comparable to pre-pandemic levels for Facebook. However, it is a significant increase for Instagram. Either way, the social media giant is trying to remove as much harmful content as possible.

BROKER NEWS

Plus500 Witnesses Growth in Q1 Revenue

In the first quarter, Plus500 reported a $215.6 million revenue, marking a 4% rise Y-o-Y and a 14% increase Q-o-Q. Customer income was $169.6 million, with $30.6 million coming from customer trading performance.
