Facebook’s position on vaccine misinformation is part of a familiar pattern, says New York Times reporter and author
Today, in a new report on “coordinated inauthentic behavior” on its platform, Facebook says it deleted hundreds of accounts across Facebook and Instagram last month that were linked to anti-vaccination disinformation campaigns run from Russia. In one campaign, the company says, a newly banned network “posted memes and comments claiming the AstraZeneca COVID-19 vaccine would turn people into chimpanzees.” More recently, in May, the same network “questioned the safety of the Pfizer vaccine by posting an allegedly hacked and leaked AstraZeneca document,” Facebook explains.
The company issues such reports to remind the public that it is focused on “finding and removing deceptive campaigns around the world.” Yet a new New York Times investigation into Facebook’s relationship with the Biden administration suggests the company continues to fall short when it comes to tackling disinformation, including, currently, misinformation about vaccines.
We discussed this reported disconnect earlier today with Sheera Frenkel, cybersecurity correspondent for the New York Times and recent co-author, with New York Times national correspondent Cecilia Kang, of “An Ugly Truth: Inside Facebook’s Battle for Domination,” which was released in June. Our conversation has been lightly edited.
TC: The big story around Facebook right now centers on its shutdown of the accounts of NYU researchers whose tools for studying advertising on the network violated its rules, according to the company. Many people think those objections don’t hold water. In the meantime, several Democratic senators have sent the company a letter questioning its decision to ban these academics. How does this particular situation fit into your understanding of how Facebook works?
SF: I was struck by how it matched a pattern that we really showed in [our] book: Facebook taking what appears to be a very ad hoc and piecemeal approach to many of its problems. The action it took against NYU was surprising because there are so many others who use data the same way NYU does, including private and commercial companies that use it in ways we don’t fully understand.
With NYU, the academics were actually pretty transparent about how they collected data. They didn’t hide what they were doing. They told reporters about it, and they told Facebook about it. So for Facebook to take action against them, just as they were about to publish research that might have been critical of Facebook and damaging to it, seems like a singular move, and it really goes to the root of Facebook’s issues with the data the company holds on its own users.
TC: Do you get the sense that Senate or House investigators might demand more accountability for more recent industry indiscretions, such as the events of January 6? Usually there comes a point when Facebook apologizes for a public failure . . . then nothing changes.
SF: After the book came out, I spoke to a lawmaker who had read it and said, “It would be one thing if they apologized once and we saw substantial change happen in the business. But what these apologies show us is that they think they can get away with just making an apology and then changing some really superficial things, without getting to the root of the problem.”
So you brought up January 6, which we know Congress is looking at, and I think what lawmakers are doing goes beyond what they usually do . . . they’re taking a step back and asking, “How did Facebook allow groups to foment on the platform for months before January 6? How did its algorithms lead people to these groups? And how did its piecemeal approach of removing some groups but not others allow this movement known as Stop the Steal to take off?” It’s fascinating because, until now, they hadn’t taken that step back to understand all the machinery behind Facebook.
TC: Still, if Facebook isn’t willing to share its data in a more granular way, I wonder how successful these investigations will really be.
SF: We reported in the New York Times that when the White House asked Facebook for prevalence data on COVID misinformation – the idea being to gauge how widespread COVID misinformation is – the company couldn’t provide it, because it didn’t have it. And the reason it didn’t have it is that when its own data scientists wanted to start tracking this over a year ago, at the start of the pandemic, Facebook didn’t give them the resources or the mandate to start tracking the prevalence of COVID misinformation. One thing lawmakers can do is pressure Facebook to do so in the future and give the company firm deadlines for when they want to see that data.
TC: Based on your reporting, do you think there is a reporting problem within Facebook, or are these broken information loops intentional? In the book, for example, you discuss Russian activity on the platform in the run-up to the 2016 election. You write that the company’s security chief at the time, Alex Stamos, had set up a task force to examine Russian electoral interference relatively early in 2016, but that after Donald Trump won the election, Mark Zuckerberg and Sheryl Sandberg said they had no idea, were frustrated, and didn’t understand why Stamos’s findings hadn’t been presented to them earlier.
SF: While reporting this book, we really wanted to get to the bottom of that. Did Mark Zuckerberg and Sheryl Sandberg avoid knowing what there was to know about Russia, or were they just kept on the sidelines? Ultimately, I think only Mark Zuckerberg or Sheryl Sandberg can answer that question.
What I will say is that early on, about a week or two after the 2016 election, Alex Stamos goes to them and says, “There has been Russian election interference. We don’t know how much; we don’t know the extent. But there was definitely something here, and we want to investigate it.” And even after hearing this surprising news, Mark Zuckerberg [and other top brass] did not request daily or even weekly meetings to keep abreast of the security team’s progress. I know he’s the CEO of a company, and as a CEO [he has] a lot on [his] plate. But you would think that if your security team told you, “Hey, something unprecedented has happened on our platform. Democracy has potentially been affected in a way we hadn’t foreseen or expected,” you would ask for regular updates and meetings on it. We don’t see that happening. And that allows them to say every month, “Well, we didn’t know. We weren’t fully aware of it.”
TC: In the meantime, industry watchers remain very interested in the direction of regulation. What are you watching most closely?
SF: Over the next six months to a year, there are two things that fascinate me. One is COVID misinformation. This is the worst kind of problem for Facebook, because it has been building on the platform for almost a decade. It has deep roots in every part of Facebook. And it’s homegrown: it is Americans who are spreading this misinformation to other Americans. So it calls into question all of Facebook’s free-speech principles and what it means to be a platform that embraces free speech but hasn’t drawn a clear line between what is free speech and what is harmful speech, especially during a pandemic. So I’m really curious to see how they deal with the fact that their own algorithms are still pushing people into anti-vaccine groups and continuing to promote people on the platform who spread incorrect information about COVID.
The second thing for me is that we are entering a year in which there will be a lot of really important elections in other countries with populist leaders, some of whom are modeling their use of Facebook on Donald Trump’s. After banning Donald Trump, I’m very curious to see how Facebook treats some of these leaders in other countries who test the waters the same way he did.