Facebook is broken beyond repair

It’s hard to think of a more damning conclusion: Facebook’s product inevitably led to the spread of hate speech and disinformation. But that conclusion is inescapable when you look at the other findings. Election disinformation continued to proliferate rapidly in the aftermath of the 2020 election; one data scientist warned that 10 percent of the political content users viewed after the election alleged widespread fraud. Researchers found that Facebook would recommend QAnon content within days to users who had merely shown interest in conservative topics. “The body of research consistently found that Facebook pushed some users into ‘rabbit holes,’ increasingly narrow echo chambers where violent conspiracy theories thrived,” NBC News reported. “People radicalized through these rabbit holes make up a small slice of total users, but at Facebook’s scale, that can mean millions of individuals.”
The documents also revealed that Facebook’s efforts to curb anti-vaccine misinformation were often critically flawed, and that the company was slow to grasp how dismal its response was – a particularly shocking revelation given that the past five years have repeatedly demonstrated that the platform is overwhelmed by disinformation with an ease that suggests it was designed to do just that.
The situation in the rest of the world, meanwhile, is even worse than in America. A handful of countries, the United States among them, receive special attention when it comes to content moderation. While those moderation efforts are often insufficient, they are still significantly better than what most users elsewhere get. Researchers have found that Facebook has been used to spread everything from hate speech to incitements to ethnic cleansing. Mark Zuckerberg, meanwhile, has intervened on behalf of authoritarian governments: given the choice between helping Vietnam’s autocratic government censor posts and shutting down the company’s business in the country, he personally chose the first route. Over and over, you see Facebook making this kind of choice in different ways: the company always picks profit and growth, even when that choice clearly sows discord, spreads misinformation, incites violence, or otherwise makes the world worse.