Facebook credits automated systems with improvements in finding violent content
Facebook’s parent company said in a report on Tuesday that it took action against 13.6 million pieces of content that represented or incited violence on the platform during the third quarter of the year.
Meta, the newly renamed company that includes Facebook, Instagram, WhatsApp and Oculus, took similar action against more than 3 million pieces of content on Instagram, the company said in its Community Standards Enforcement report.
Facebook takes various actions against content that violates its policies, including removing content, adding warnings before content can be viewed, and deactivating accounts. The report did not specify how many of those cases were referrals, and it did not separate those cases by language or geographic region.
Tuesday’s announcement marks the first time Meta has released figures relating to violence. It comes about two weeks after numerous news outlets, including NBC News, reported extensively on leaked documents that detailed Facebook’s challenges in dealing with extremism on its platform and the growing dissatisfaction of some of its employees.
The company said Facebook’s automated systems catch more than 96% of violent content before it is reported by users. The Meta report also found that violent content made up 0.04% of content viewed on Facebook. The latest report covers July through September.
“This is our 11th report, which shares more data and information than any of our industry peers,” Guy Rosen, Facebook vice president of integrity, said on a call with journalists.
Meta also posted figures for the first time on the prevalence of “bullying and harassment” content on Facebook, saying it represented about 0.14% of content viewed in the third quarter.
Meta officials said the company’s internal algorithm-based tools for finding bullying content before users report it have improved over the past two years.
Meta has also continued its efforts against militia and QAnon-related material on Facebook since August 2020, and has provided new numbers related to this content for the first time since January.
Monika Bickert, head of global policy management at Facebook, said that to date the company has removed nearly 55,000 profiles of “militarized social movements,” more than double the number the company published 10 months ago.
Likewise, it removed more than 50,000 QAnon-related profiles, nearly three times the 18,300 reported in January.