No, social media isn’t biased against conservatives
The CEOs of Google, Facebook and Twitter testified Wednesday at a Senate hearing that was supposed to be about Section 230, the fundamentally important internet law that shields tech companies from liability for what their users post. Instead, the hearing ended up being about how much power big tech companies hold and how they wield it. Republicans accused social media platforms of anti-conservative bias, and Democrats said the platforms aren’t doing enough to curb the spread of disinformation.
I spoke with Marketplace Tech’s Molly Wood about this, and I asked her: Is it true that tech platforms have an anti-conservative bias? The following is an edited transcript of our conversation.
Molly Wood: There is no evidence that the platforms have an anti-conservative bias. In fact, there is research out this very week showing that Facebook actually promotes right-wing content to the point that it is its own sort of right-wing echo chamber. Kevin Roose at The New York Times, for example, set up an automated system that posts, every single day, the top-performing links on U.S. Facebook pages over the previous 24 hours, and the vast majority are from right-wing news sources.
Sabri Ben-Achour: That claim does, though, stem from an argument I’ve heard not just from the right but also from the left: these companies wield a lot of power, they can take down whatever they want, and people want to know why when that happens. The companies, the argument goes, should be more transparent about their moderation policies and processes. How much common ground is there on that?
Wood: I mean, that is absolutely true. These companies should be more transparent about how their processes work. And because they have not been, over and over and over, it’s basically opened the door to the, frankly, posturing and performance that you see every time these CEOs appear before Congress. And it’s only gotten worse. I think that would be, in some ways, a fairly easy fix: “These are our rules. They are enforced consistently and with regularity.” That would end a lot of these arguments. But unfortunately, the platforms so far have not done that. Now, in the actual last days before a presidential election, they’re scrambling in front of all of us in real time to figure out solutions. And it’s obviously not working. No one’s happy with it.
Ben-Achour: One of the other issues we heard about, brought up mostly by Democrats, was about disinformation, specifically about how much responsibility tech platforms have to take it down. What did we learn?
Wood: Once again, we didn’t learn much from all of the shouting. But I think what we are learning is that this conversation — which a lot of people believe is about Section 230 and the liability shield tech platforms have for what their users post — is not really about Section 230 at all. It’s about disinformation. So far, we’ve seen proposals from the GOP to alter the way Section 230 works that would arguably punish platforms for not allowing disinformation to spread. And on the other side of the equation, you have Democrats, certainly, but also researchers, academics and journalists saying, “These companies have allowed disinformation to spread to a dangerous degree, and they need to do more to stop it on both the advertising side and the content side.”
Here’s the thing that is a little bit tricky to discuss. A lot of the time when conservatives in Congress say that conservative content is being censored, the content they’re referring to is disinformation. One example they frequently give is the commentators Diamond and Silk, who have had content taken down because it promoted misinformation about the coronavirus.

So we find ourselves in this tricky position where, increasingly, there are people in the GOP who are embracing QAnon. Just this week, Stephen Miller, the White House political adviser, reportedly told journalists that a Biden administration would enable widespread sex trafficking of children. That is a QAnon talking point. It is a conspiracy theory that has spread on the internet.

Because we’re right before an election, and because disinformation played a huge role in the 2016 election and in elections since, it is really impossible to separate the politics from the conversation around speech, particularly when so many of these talking points are themselves misinformation or disinformation. Researchers just released evidence that President Trump himself is the single largest source of misinformation around the coronavirus. That makes this a tricky conversation to have, because it feels partisan. But if you agree that truth is truth, that certain things are true and certain things aren’t, then it is a conversation that has to be had.
Ben-Achour: Where do you see regulation of these platforms going, given the wildly different concerns expressed in these hearings from the different sides?
Wood: I see regulation likely going nowhere, because it is such a partisan issue at this point, and frankly because a lot of things are in line ahead of Section 230 reform, like coronavirus relief and health care and whatever else is on the political agenda in the next few weeks and months. I also think it’s possible that reforming Section 230 is exactly the wrong approach to the problems we have with disinformation and misinformation, because those problems ultimately come down not to what is being said online, but to how that information is amplified by algorithms within these social media platforms. I would actually suggest that regulation needs to target the promotion, the technology involved in amplifying these messages, rather than the content of the messages themselves. Then you stop talking about speech and start talking about technology, powered by humans, that is in fact pushing a bad product, a dangerous product, onto consumers.
Related links: More insight from Sabri Ben-Achour
As much as we are talking about disinformation on Facebook and Twitter, The Washington Post is reporting that disinformation is increasingly coming through other channels, like your inbox or texts. Examples include a misleading video making the totally false claim that Joe Biden had endorsed giving 8- and 10-year-olds sex-change treatments and texts saying that recipients must vote for Trump “or else.” Nowhere is safe from misinformation.