Meta CEO Mark Zuckerberg should focus on US midterms, not the Metaverse
You have to hand it to Mark Zuckerberg. Faced with criticism of the radical strategic shift he has chosen for Facebook, he persists in making it a metaverse company. Other tech billionaires may lash out at dissent, but Zuckerberg remains stoic, turning away from the noise to give serious interviews and presentations about his vision for virtual reality.
But while he can get away with ignoring criticism, the CEO of Facebook parent company Meta Platforms Inc. would do well to re-evaluate his priorities in the coming months as the United States approaches a potentially tumultuous midterm election. He must turn his attention back to Facebook, or risk letting misleading videos about voter fraud proliferate, potentially disrupting the democratic process again.
Zuckerberg could start by doing what thousands of managers have done before him, and reprioritize his duties.
The Metaverse project is still in its infancy: while Facebook has around 3 billion active users, Horizon Worlds, the VR platform that serves as the basis for the Metaverse experience, only has 200,000, according to internal documents revealed by the Wall Street Journal.
Zuckerberg has been candid in saying that Meta’s metaverse won’t be fully realized for another five years or more. All the more reason, then, that his passion project can afford to lose his attention for a few months, or at least during critical moments for democracy.
So far, he has shown no signs of changing direction. Facebook’s core election team no longer reports directly to Zuckerberg as it did in 2020, according to the New York Times, when Zuckerberg made that year’s U.S. election his top priority.
He has also loosened the reins on key leaders tasked with handling election disinformation. Head of global affairs Nick Clegg now splits his time between the UK and Silicon Valley, and Guy Rosen, the company’s head of information security, has moved to Israel, a company spokesperson confirmed by e-mail.
Researchers who track misinformation on social media say there’s little evidence that Facebook is better at stopping conspiracy theories now than it was in 2020. Melanie Smith, who leads misinformation research at the Institute for Strategic Dialogue, a nonprofit organization, says Facebook has not improved access to data for outside researchers trying to quantify the spread of misleading posts. Anecdotally, they are still proliferating, she says. Smith said she has found Facebook groups recruiting election observers, ostensibly to intimidate voters on Election Day.
She also pointed to a video posted by Florida Rep. Matt Gaetz on his Facebook page claiming the 2020 election was stolen. The video had been viewed over 40,000 times at the time of writing. Although it was posted a month ago, it does not carry a fact-checking warning label.
Smith also cited recent Facebook posts, shared hundreds of times, inviting people to events to discuss how “Chinese Communists” are running local elections in the United States, or posts saying some politicians should “go to jail” for their role in the “stolen election.” Posts by candidates tend to spread particularly far, Smith said.
Facebook’s policy is not to subject politicians to fact-checking, a Meta spokesperson said, although the company makes an exception for specific content that has already been debunked by fact-checkers. Zuckerberg should consider changing this policy and moderating politicians and election candidates under the same rules that govern the rest of us. Researchers say politicians are sometimes the biggest spreaders of disinformation, especially in Asian countries, and current company policy gives them carte blanche to continue pushing misleading content.
Meta has said its main approach to managing election content in the 2022 midterms will be warning labels. But warning labels aren’t very effective either, according to a study by the Integrity Institute, a nonprofit research organization run by former employees of big tech companies. Studies have shown that misinformation gets 90% of its total social media engagement in less than a day.
The problem, ultimately, is that Facebook shows people the content most likely to keep them on the site, what whistleblower Frances Haugen called engagement-based ranking. According to Jeff Allen, a former data scientist at Meta and co-founder of the Integrity Institute, a better approach would be “quality-based ranking,” similar to Google’s page-ranking system, which favors consistently reliable sources of information.
Facebook’s growing emphasis on video is likely to compound the problem. In September 2022, misinformation was shared far more often via video than through regular Facebook posts, Allen said, citing a recent study by the Integrity Institute. Fake content typically gets more engagement than truthful content, he added, and therefore tends to be favored by an engagement-based system.
In 2020, Facebook rolled out “break glass” measures to counter a flurry of posts claiming the election had been stolen by then-President-elect Joe Biden, claims that ultimately fueled the storming of the U.S. Capitol on January 6.
Meta shouldn’t have to resort to such drastic measures again. If Zuckerberg is serious about connecting people and doing so responsibly, he should step out of his virtual-reality bubble and re-examine the ranking system that keeps eyeballs glued to Facebook content. At a minimum, he could let his employees and the public know that he is once again making election integrity a priority. The metaverse can wait.