Content Reviewers Sent Home for Covid-19 Affect Enforcement on Facebook, Instagram

Facebook shared the sixth edition of its Community Standards Enforcement Report Tuesday, crediting technology improvements, including the addition of more languages to its automation technology, for progress in the second quarter of 2020, the period covered by the report, compared with the first quarter.

Vice president of integrity Guy Rosen said in a Newsroom post introducing the report that its content reviewers have been home since March due to the coronavirus pandemic, which affected reporting and actions taken in some of the categories analyzed across Facebook and Instagram. Other categories saw improved results due to the addition of Arabic and Spanish to the social network’s automation technology, as well as improvements in how it processes English.

Rosen wrote, “We rely heavily on people to review suicide and self-injury and child exploitative content and help improve the technology that proactively finds and removes identical or near-identical content that violates these policies. With fewer content reviewers, we took action on fewer pieces of content on both Facebook and Instagram for suicide and self-injury, and child nudity and sexual exploitation on Instagram. Despite these decreases, we prioritized and took action on the most harmful content within these categories.”

He added that the technology updates lifted the proactive detection rate for hate speech on Facebook to 95% in the second quarter from 89% in the first quarter, and drove a surge in pieces of content acted on for hate speech, to 22.5 million from 9.6 million.

The quarter-to-quarter changes were more dramatic on Instagram: a proactive detection rate of 84%, up from 45%, and 3.3 million pieces of content acted on, compared with 808,900.
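As context for those figures, Facebook has generally described the proactive rate in these reports as the percentage of content it took action on that its systems found and flagged before any users reported it, which can be expressed roughly as:

$$\text{proactive rate} = \frac{\text{pieces of content found and flagged before a user report}}{\text{total pieces of content acted on}} \times 100\%$$

A higher rate, in other words, means a larger share of removals originated from automated detection rather than user reports.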

Rosen also pointed to a gain in pieces of terrorism content that were acted upon, to 8.7 million in the second quarter from 6.3 million in the first quarter.

Here are some specifics from the 12 policies analyzed for Facebook and the 10 for Instagram:

  • Adult nudity and sexual activity: On Facebook, action was taken on 35.7 million pieces of content in the second quarter, down from 39.5 million in the first quarter, which the company attributed to temporary workforce changes brought on by Covid-19. However, improvements to its proactive detection technology led to the opposite result on Instagram: action on 12.4 million pieces of content, up from 8.1 million, and an increased proactive detection rate of 96.1% versus 93.8%.
  • Bullying and harassment: Facebook saw a slight uptick, with 2.4 million pieces of content acted on in the second quarter compared with 2.3 million in the previous period, while the difference was again more dramatic on Instagram, at 2.3 million and 1.5 million, respectively. Facebook said it regained some review capacity in this category during the second quarter.
  • Child nudity and sexual exploitation of children: Facebook took action on 9.5 million pieces of content in the second quarter, up from 8.6 million in the first quarter, which it attributed to “a few pieces of violating content that were shared widely in March and April.” The decrease in content reviewers caused Instagram’s numbers to drop to 479,400 in the second quarter from 1 million in the previous period.
  • Dangerous organizations (terrorism and organized hate): Facebook and Instagram trended in different directions on both topics. For terrorism, Facebook took action on 8.7 million pieces of content in the second quarter, up from 6.3 million in the first quarter, while Instagram’s total slipped to 388,800 from 440,600. For organized hate, Facebook took action on 4 million pieces of content in the second quarter, down from 4.7 million, while Instagram did so on 266,000 pieces of content, up from 175,100.
  • Hate speech: Facebook credited the improvements in its proactive detection technology and the addition of Arabic, Indonesian and Spanish for the rise to 22.5 million pieces of content acted on in the second quarter from 9.6 million in the first quarter on Facebook; on Instagram, those figures were 3.3 million and 808,900, respectively. The social network said its proactive rates climbed to 94.5% from 88.8% on Facebook and to 84.2% from 44.5% on Instagram.
  • Regulated goods (drugs and firearms): Facebook took action on 5.8 million pieces of content in the second quarter, down from 7.9 million in the first quarter, saying changes to its technology temporarily impacted its enforcement on posts in groups. Its proactive rates remained similar across both Facebook and Instagram.
  • Spam: Facebook wrote in its report, “In the first quarter, our proactive detection technology removed a large amount of non-violating content, which we later restored. In April, we updated our technology to continue improving accuracy. As a result, content actioned decreased from 1.9 billion pieces of content in the first quarter of 2020 to 1.4 billion in the second quarter.”
  • Suicide and self-injury: Fewer content reviewers led to action on fewer posts on both platforms, with Facebook’s total dropping to 911,000 in the second quarter from 1.7 million in the first quarter, while Instagram slid to 275,000 from 1.3 million. The social network wrote, “Despite this decrease, we prioritized and took action on the most harmful content within this category.”
  • Violent and graphic content: Fewer content reviewers caused Facebook’s pieces of content acted on to drop to 15.1 million in the second quarter from 25.4 million in the first quarter, while the resolution of a first-quarter measurement issue for Instagram caused that platform’s total to rise to 3.1 million from 2.8 million.




