Can Social Media Anger Gangs Even More? | CLARENCE PAGE
It has been a terrible week for Facebook.
First, a series of Wall Street Journal reports revealed that for years, Facebook has been studying how Instagram, which it owns, has been harmful to young users. Among other bombshells, the Journal cited a leaked internal document that said, “We make body image issues worse for one in three teen girls.”
Imagine this as part of Facebook’s advertising plan. Not likely.
In her Senate testimony last week, whistleblower and former Facebook employee Frances Haugen cited other internal documents and accused the social media giant of putting “profits before people,” comparing it to Big Tobacco by hooking young people on a toxic product “just like cigarettes.”
“They explicitly say, ‘I feel bad when I use Instagram,'” said Haugen, “‘and yet I can’t stop.'”
By the end of the week, Facebook founder and CEO Mark Zuckerberg was back on the cover of Time magazine, but this time with his face partly covered by an image of a smartphone app asking, “Delete Facebook?”
Facebook has responded to the allegations, as it has done before, with denials or various versions of “We’re working on it.”
If so, I hope they and Congress will also take a closer and broader look at another long-standing but too rarely reported menace encouraged by social media: street gang violence in cities like Chicago.
The interplay between social media and gang violence has been widely known since at least 2016. That was when then-acting Chicago Police Superintendent John Escalante blamed gang conflicts for violence that escalated that year and has continued with little relief since.
He described how street strife often breaks out on social media platforms like Twitter, Facebook and Snapchat, where gang members threaten and taunt one another, often escalating their beefs to the point where somebody gets shot.
As gangbangers and wannabes have found new ways to flaunt their toughness, their guns and even some of their crimes by posting photos online, police have also learned to track their activities.
So have some university researchers. For example, former University of Chicago sociologist Forrest Stuart, a 2020 MacArthur Foundation fellowship winner, became an emerging expert on the subject after spending two years embedded with Chicago Gangster Disciples while running a school violence prevention program.
Now an associate professor at Stanford University, Stuart offers in his 2020 book “Ballad of the Bullet: Gangs, Drill Music, and the Power of Online Infamy” an evenhanded look at how social media both encourages and discourages violence, depending on who is posting.
Haugen’s testimony was particularly disturbing in her description of how Facebook’s algorithm is programmed to promote the incendiary, not the insightful, pushing out the most polarizing and emotionally charged content regardless of its veracity.
For example, she said, the algorithm selects content based on what you have engaged with in the past. “They optimize for content that gets a reaction, for hateful, divisive and polarizing content,” she said. “It’s easier to inspire people to anger than to other emotions.”
That is not healthy for people who are already predisposed to commit acts of violence.
Haugen said Facebook put safeguards against such inflammatory content in place ahead of the 2020 election, then removed those safeguards afterward, contributing, she said, to the Jan. 6 insurrection at the Capitol.
I hesitate to rush to judgment, since there was more than enough blame to go around in the foolish Jan. 6 attack. But the possibility strengthens the case for Congress to investigate Facebook’s algorithm, which Facebook is bound to guard the way Col. Sanders guarded the secret to his fried chicken recipe.
But as a publicly traded company, Facebook may find that its veiled power to promote bogus or potentially harmful content is not necessarily protected by the First Amendment.
The debate inevitably centers on Section 230 of the Communications Decency Act. Enacted in 1996, it shields websites from liability when a user posts something illegal, with exceptions such as copyright infringement, sex-trafficking material and violations of federal criminal law.
It is no surprise that Zuckerberg, hardly enthusiastic about such exposure, has himself called for an update of Section 230, although his suggestions sounded too self-serving to get a warm welcome on Capitol Hill.
I’m a First Amendment fanatic, but after more than three decades in this business, I see no sense in giving internet media more protection than traditional media have traditionally received.
Put Facebook on a shorter leash.
Contact Clarence Page at [email protected]