Facebook’s efforts to minimize the prevalence of violating content have shown promising results

KEY POINTS

● Prevalence, defined as the estimated number of views showing violating content divided by the total content views, serves as a crucial metric for evaluating FB’s performance in upholding its policies and minimizing the impact of violations on its user base.

● Notably, the prevalence rate for hate speech improved substantially over the years, falling from approximately 10-11 violations per 10,000 content views in 2020 Q3 to about 2 by 2023 Q4.

● From 2019 Q4 to 2023 Q4, the total number of actions taken on FB and Instagram for adult nudity & sexual activity violations increased gradually.

Facebook (FB) has been actively monitoring and assessing the prevalence of violating content on its platform to ensure a safe and enjoyable experience for its users. Prevalence, defined as the estimated number of views showing violating content divided by the total content views, serves as a crucial metric for evaluating FB’s performance in upholding its policies and minimizing the impact of violations on its user base. This report aims to analyze the trends in prevalence rates for various types of violating content on Facebook.
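
To make the metric concrete, the sketch below computes prevalence from a hypothetical pair of view counts and expresses it both as a percentage and as violations per 10,000 content views, the convention used throughout this report. The function names and sample numbers are illustrative only; Meta's actual methodology relies on sampling content views and having reviewers label the sampled content.

```python
def prevalence(violating_views: float, total_views: float) -> float:
    """Estimated share of content views that showed violating content."""
    if total_views <= 0:
        raise ValueError("total_views must be positive")
    return violating_views / total_views


def per_ten_thousand(rate: float) -> float:
    """Express a prevalence rate as views of violating content per 10,000 views."""
    return rate * 10_000


# Illustrative numbers (not Meta's data): 13 violating views in a sample
# representing 10,000 content views gives a prevalence of 0.13%,
# i.e. roughly 13 views of violating content per 10,000 views.
rate = prevalence(violating_views=13, total_views=10_000)
print(f"{rate:.2%}")                     # 0.13%
print(round(per_ten_thousand(rate), 1))  # 13.0
```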

In 2019 Q1, the prevalence of content violating FB standards related to adult nudity & sexual activity reached its highest recorded level, at an estimated 12-14 violations per 10,000 content views. Over the subsequent years, however, this figure declined significantly, falling to approximately 7 violations per 10,000 views by 2023 Q4. The prevalence rate for bullying and harassment decreased gradually from about 10 violations per 10,000 views in 2021 Q3 to 8 by 2023 Q4. This downward trend reflects FB’s ongoing efforts to combat such harmful behaviours on its platform.

Fig: 1

Notably, the prevalence rate for hate speech improved substantially over the years, falling from approximately 10-11 violations per 10,000 content views in 2020 Q3 to about 2 by 2023 Q4, indicating successful mitigation of hate speech content on Facebook. Similarly, the prevalence rate for violent and graphic content improved from 2017 Q4, when it stood relatively high at around 16-19 violations per 10,000 views. Although there was a rapid decline after 2019, recent years have seen a slight increase in prevalence, highlighting the ongoing challenge of addressing violent and graphic content effectively (fig: 1).

Facebook’s efforts to minimize the prevalence of violating content have shown promising results across various categories. The decline in prevalence rates for adult nudity & sexual activity, bullying and harassment, hate speech, and violent and graphic content signifies FB’s commitment to maintaining a safer and more inclusive online environment. However, challenges remain in sustaining these improvements and addressing emerging content moderation issues effectively. Continued vigilance and proactive measures will be essential in reducing the prevalence of harmful content on the platform.

Fig: 2

Facebook (FB) measures the number of pieces of content (such as posts, photos, videos, or comments) or accounts that it takes action on for violating FB standards. This metric reflects the scale of enforcement activity, which may include removing content, applying warnings to disturbing photos or videos, or disabling accounts. Actions escalated to law enforcement are not additionally counted. Although it is tempting to read the content-actioned metric as an indicator of FB’s effectiveness in finding violations or of their impact on the community, it should be interpreted cautiously: the volume of content actioned is only part of the picture, as it reflects neither how quickly violations were detected nor how often users were exposed to them.
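
Meta publishes the figures behind the Community Standards Enforcement Report as downloadable data. The sketch below shows one way such an export could be aggregated to compare content actioned across policy areas and quarters; the file name and column names (app, policy_area, quarter, content_actioned) are assumptions made for illustration and do not reflect the actual schema of Meta's export.

```python
import pandas as pd

# Hypothetical CSER export; file name and column names are assumed.
df = pd.read_csv("cser_export.csv")  # app, policy_area, quarter, content_actioned

# Total pieces of content actioned per policy area and quarter,
# summed across Facebook and Instagram.
actions = (
    df.groupby(["policy_area", "quarter"], as_index=False)["content_actioned"]
      .sum()
)

# Policy areas ranked by total enforcement activity over the whole period.
ranking = (
    actions.groupby("policy_area")["content_actioned"]
           .sum()
           .sort_values(ascending=False)
)
print(ranking.head())
```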

From 2019 Q4 to 2023 Q4, the total number of actions taken on FB and Instagram for adult nudity & sexual activity violations increased gradually. A similar upward trend in enforcement actions can be observed for bullying & harassment violations over the same period, with a slight dip in 2022 Q4. Within the child endangerment policy area, the highest number of actions on FB and Instagram was taken for violations related to sexual exploitation, followed by child nudity and physical abuse, indicating a sustained focus on combating child endangerment. In the domain of dangerous organizations, more actions were taken for terrorism than for organized hate; notably, the number of actions for terrorism increased from 2021 to 2023, while those for organized hate decreased. Across all policy areas, the largest number of actions on FB and Instagram was taken against spam and fake accounts.

In 2019 Q4, the highest numbers of actions were taken for spam, fake accounts, and hate speech on FB and Instagram; over the years, however, enforcement actions for these violations have declined. Regarding regulated goods, more actions were taken for drugs than for firearms, and from 2019 Q4 to 2022 Q4 there was a significant increase in actions taken for drugs. Enforcement actions for suicide and self-injury violations have increased over the years, indicating a proactive approach to addressing such sensitive content. Actions taken for violence and incitement violations on FB and Instagram have also increased over the years. Notably, actions against violent and graphic content peaked in 2022 Q2, declined gradually thereafter, and rose again towards 2023 Q4 (fig: 2).

The trends in enforcement activity on Facebook and Instagram reveal a dynamic landscape of efforts to uphold platform standards and ensure user safety. While some violation categories exhibit declining enforcement actions, others show fluctuations or increasing trends, highlighting the evolving nature of content moderation challenges. Continued vigilance and adaptability will be crucial in addressing emerging threats and maintaining a safe online environment for all users.

References

  1. Prevalence. (n.d.). Meta. Retrieved November 3, 2024, from https://transparency.meta.com/en-gb/policies/improving/prevalence-metric/
  2. Community Standards Enforcement Report. (n.d.). Meta. Retrieved November 3, 2024, from https://transparency.meta.com/reports/community-standards-enforcement/
  3. Rosen, G. (2021, November 1). Hate speech prevalence has dropped by almost 50% on Facebook. Meta. https://about.fb.com/news/2021/10/hate-speech-prevalence-dropped-facebook/

 



About the Author



 

Pankaj Chowdhury is a former Research Assistant at the International Economic Association. He holds a Master’s degree in Demography & Biostatistics from the International Institute for Population Sciences and a Bachelor’s degree in Statistics from Visva-Bharati University. His primary research interests focus on exploring new dimensions of computational social science and digital demography.

Disclaimer: The views expressed in this article are those of the author and do not necessarily reflect the views of 360 Analytika.

Acknowledgement: The author extends his gratitude to Facebook for providing data support.

This article is posted by Sahil Shekh, Editor at 360 Analytika.
