Zuckerberg created a monster that preys on our biases
An Associated Press article last week reported the use by German neo-Nazi groups of Facebook, Instagram and YouTube to “spread their ideology, attract recruits and make money through ticket sales and branded merchandise.”
The prospect of a rebirth of Nazism in Germany, its birthplace, is nightmarish. Worse still, it is happening with the help of American social media. But having just finished reading “An Ugly Truth” – a recently published account of Facebook’s rise from a Harvard College dorm project in 2004 to a nearly trillion-dollar social media conglomerate with almost 3 billion users – I can’t say I’m surprised.
The history of Facebook mirrors that of Frankenstein. Mary Shelley’s 1818 novel about a young scientist’s obsessive quest to create a humanoid from non-living matter strangely anticipated the reckless hubris of Facebook founder Mark Zuckerberg’s mission to connect every human being on the planet to a sort of virtual global fraternity house. In each story, the monster escapes its creator’s control.
To understand the dangers of Facebook, it helps to know a little about human psychology and how the platform exploits it.
People have many irrational impulses, often existing on an unconscious or subconscious level.
Studies have shown, for example, that the brain is wired with a number of biases that interfere with our perception of reality. These include the tendency to estimate the probability of an event by the ease with which we can recall examples, the pessimistic belief that the future is likely to be worse than the present, and the tendency to interpret situations in ways that support the perspectives of the social groups to which we belong.
As a hypothetical example of the three biases working in tandem, suppose there is extensive television coverage of a fatal plane crash of unknown cause that results in hundreds of deaths. The coverage could lead some viewers to believe that commercial air travel is extremely dangerous, that lax FAA inspection practices make it increasingly dangerous, and that permissive immigration policies give foreign terrorists easy access to board and sabotage flights. Each belief would be wrong.
Zuckerberg’s brilliant idea behind Facebook was that people could be persuaded to share huge amounts of personal information in order to retain or expand their circle of friends through free online access to a virtual social network. The network allowed users to message each other and share topics of interest on pages covering everything from politics to sports, cooking, travel, dating, child-rearing and gardening. The large user base, coupled with the technical innovation of algorithmic amplification, made it a lucrative targeted-advertising business.
Artificial intelligence algorithms monitor the frequency, duration and content of users’ “clicks,” not only on user pages but on the various pages they and their virtual friends “like.” This provides a measure of a user’s interests, allowing Facebook to serve everyone a daily menu of “news feeds” tailored to those interests and to invite the user to join virtual Facebook “groups” that match them. Such offers amplify engagement and increase the overall time users spend on the platform.
Given the biases of the human psyche, algorithmic amplification prompts many users to migrate to pages and groups that reinforce their natural tendencies toward irrational thinking, pessimism and social self-segregation – in other words, to enter the dark, paranoid echo chambers of an alternate reality.
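The feedback loop described above can be illustrated with a deliberately simplified sketch. To be clear: this is a toy model written for this column, not Facebook’s actual code; every name, weight and number in it is invented.

```python
# Toy model of engagement-based feed ranking -- illustrative only.
# All data, field names and weights are hypothetical.

def rank_feed(posts, user_interests):
    """Order posts by a crude 'predicted engagement' score.

    Posts matching topics the user already clicks on score higher,
    so whatever a user engages with gets shown more -- the
    amplification loop the column describes.
    """
    def score(post):
        # How many of the post's topics overlap the user's history.
        interest_match = sum(
            1 for topic in post["topics"] if topic in user_interests
        )
        # High-engagement content (more clicks/shares per view) is
        # boosted regardless of its accuracy or tone.
        return interest_match * post["engagement_rate"]

    return sorted(posts, key=score, reverse=True)


posts = [
    {"id": "calm-news", "topics": ["politics"], "engagement_rate": 0.02},
    {"id": "outrage-bait", "topics": ["politics"], "engagement_rate": 0.30},
    {"id": "gardening-tips", "topics": ["gardening"], "engagement_rate": 0.05},
]

feed = rank_feed(posts, user_interests={"politics"})
print([p["id"] for p in feed])
# -> ['outrage-bait', 'calm-news', 'gardening-tips']
```

Even in this crude sketch, the inflammatory post outranks the sober one for a politics-interested user, because nothing in the scoring rewards accuracy – only engagement.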
Zuckerberg started Facebook with the best of intentions. But his steadfast pursuit of user growth and the ad revenue that came with it caused him to turn a blind eye to the ways Facebook promoted violent extremism, distorted the psyche of its users, endangered national security, and destroyed the essential institutions of our democracy.
Holding a controlling stake in a monopoly company virtually untouched by government regulation, Zuckerberg has repeatedly ignored internal and external warnings about the collateral damage Facebook was causing. He reacted only when media exposés, congressional hearings and public outrage forced him to, and even then only with half-hearted measures to remove the most offensive posts through a form of internal censorship known as “content moderation.”
The spread of neo-Nazism, along with other extremist white supremacist, conspiracy, and anti-government groups spewing out disinformation and inflammatory calls for hatred and violence, is just one of the plagues that Facebook facilitated. Others include:
• The damage that social media harassment has inflicted on adolescent mental health;
• The viral spread of ethnic hate speech in Myanmar, which escalated into a genocidal campaign of murder and forced migration against the Rohingya Muslim minority;
• Russian interference in the 2016 presidential campaign;
• The dissemination by political operatives of manifestly false information defaming elected officials and candidates for political office;
• The spread of disinformation on Facebook regarding public health recommendations to stem the spread of COVID-19; and
• Donald Trump’s use of Facebook to promote anti-immigrant hatred, racism, political extremism, mistrust of electoral integrity and, finally, the insurrection of January 6, 2021.
The time has come – indeed, it is high time – for thoughtful government regulation of Facebook and other digital social media platforms, which have shown they cannot be trusted to regulate themselves.
Public health and safety, national security, public order and the very survival of democracy may depend on it.
Elliott Epstein is a litigator with Andrucki & King in Lewiston. His Rearview Mirror column, published in the Sun Journal for 15 years, analyzes current events in a historical context. He is also the author of “Lucifer’s Child,” a book on the notorious murder of Angela Palmer in 1984. He can be contacted at [email protected]