Facebook dodges responsibility with a new “experts” tool for groups
- Facebook will let people in groups be designated “experts,” whose posts will be amplified.
- The idea is to appoint “knowledgeable experts” who can serve as authoritative voices.
- But the move is another way for Facebook to sidestep responsibility for what is posted on its site.
Facebook is once again getting someone else to do its job.
The company said this week that it would let administrators in its groups name users as “experts.”
This means that the people who run those online communities, which cover everything from housing to dogs to extremism, will choose members to serve as authoritative voices others can rely on, an initiative designed in part to curb disinformation.
So, for example, an administrator of a group devoted to conspiracy theories or white supremacy could designate someone within that space as an expert, whose posts will be amplified and whom others will be encouraged to trust.
The move recalls Facebook’s creation of the Oversight Board, often called its “supreme court.” Facing mounting pressure to moderate potentially dangerous posts rigorously, the company declined to fully shoulder that responsibility itself.
Instead, it invested $130 million in appointing a group of people outside the company to review Facebook’s content decisions.
That decision backfired in May, when the board kicked a case back to Facebook, telling the company to do its own job, stop being “lazy,” and make its own rules.
If the Oversight Board’s reprimands are any indication, Facebook’s new “expert” tool could also prove less than productive.
Worse, it could backfire if the platform ends up empowering users in online pockets devoted to topics that breed disinformation.
Groups were originally pitched as a way to fight disinformation
Pro-Trump protesters gather in front of the U.S. Capitol on January 6, 2021, in Washington, DC. Brent Stirton/Getty Images
Facebook leaned into groups after the 2016 presidential election, when the company began grappling with how disinformation spreads on its platform.
CEO Mark Zuckerberg sought to shift attention away from Facebook’s News Feed and toward groups, with the stated goal of “helping connect a billion people with meaningful communities.”
But problems cropped up with the feature, and Facebook has removed some groups that it said risked inciting violence.
One “Stop the Steal” group, for example, was created in November 2020 and drew 365,000 members convinced that the presidential election had been stolen from former President Donald Trump. Just two days later, Facebook deleted it, saying the group was organized around “delegitimizing the electoral process” and that some members had made “worrying calls for violence.”
And before that, in mid-2019, ProPublica reported on a private group of 9,500 current and former Border Patrol agents who joked about migrant deaths and made crude comments about Rep. Alexandria Ocasio-Cortez.
Facebook said in its announcement that there are more than 70 million admins and moderators who manage active groups around the world.
Moderating that much content is no easy task, but Facebook’s expert tool shows that the company still hasn’t found a responsible answer to the question of how falsehoods spread like weeds across its site.
Facebook did not respond to a request for comment.