Facebook has published its “Community Standards Enforcement Report for Q2 of 2021.” The report covers all enforcement and removal actions taken by the platform across different content types from April to June 2021.
As Facebook has pushed for stronger online content governance, it has imposed stricter rules on group interactions and updated its content appeals process by creating an Oversight Board. But the platform still independently carries out outright removal of harmful, malicious, and untruthful content.
In its “Community Standards Enforcement Report for Q2 of 2021,” Facebook reported removing more than 20 million pieces of content for violating its COVID-19 misinformation policies. This included the removal of 3,000 accounts, groups, and pages on Facebook and Instagram. The platform also placed warnings on 190 million pieces of COVID-19- and vaccine-related content.
Facebook also reported a significant decline in hate speech for the third quarter in a row. It now stands at about 5 views per 10,000 views, or 0.05% to 0.06%. Instagram, however, experienced a surge in hate speech in Q2 of 2021, which can be attributed to Instagram’s focus on detecting dangerous organizations in the app.
Fake accounts also stand at about 5% of total profiles on Facebook. The platform noted that its automated detection process is still improving; it removed 1.7 billion fake profiles during Q2 of 2021. Still, reducing the average number of fake accounts remains a challenge for Facebook. Moving forward, Facebook is adding 13 more policy areas on Facebook and 11 on Instagram as new metrics for community standards enforcement.
Facebook published its “Community Standards Enforcement Report for Q2 of 2021” on 18 August 2021.
Implications for Marketers:
Facebook’s “Community Standards Enforcement Report for Q2 of 2021” is part of an ongoing effort to promote transparency on the platform. The insights it offers can help marketers make better decisions about their ads and campaigns within the Facebook family of apps.