Virality, algorithms and echo chambers: Can adjusting the feed diminish division online?
Almost three years later, the 2020 presidential election is hardly in the rearview mirror. Big questions remain about how algorithms spread polarizing content on the social media platforms so many Americans turn to for news and information.
For answers, academics across the country have been collaborating with Meta, which owns Facebook and Instagram. The result? Four studies that look at online polarization and ideological segregation among users on both platforms over three months during the 2020 election campaign.
Marketplace’s Lily Jamali spoke with Joshua Tucker of New York University, one of the academics who worked on these reports. Tucker walked her through what he considers the top three findings of the research. The following is an edited transcript of their conversation.
Joshua Tucker: The first big-picture finding is that algorithms are extremely influential in terms of what people see. The second is that we find significant ideological segregation. So we find political news that’s primarily read by liberals, and we find lots of political news that’s primarily or exclusively read by conservatives. The third big-picture finding, though, is that despite these two facts, changing big aspects of people’s platform experiences, aspects that have been posited to have a big impact on political attitudes, didn’t seem to do much. We tried things like reducing exposure to virality by not showing people reshared content, reducing echo chambers by exposing people to less content from politically like-minded sources, and replacing the engagement-based algorithm that is normally in place on these platforms with a simple reverse chronological feed. None of these changes seemed to have much of an effect on political attitudes, including things like issue polarization or affective polarization, which is this idea of how much you dislike the other political party.
Lily Jamali: So on the experiment that moved the Facebook news feed to a reverse chronological feed, as you say, the approach didn’t have that much of an impact on political attitudes. Were you surprised by that?
Tucker: We did find, actually, that it had a big impact on what was going on on the platform. In particular, people who were in the reverse chronological feed didn’t use the platforms as much. So people who got put in the reverse chronological feed on Facebook ended up using Instagram more than the people who kept the regular engagement-based feed on Facebook. And we did a separate study with people on Instagram, and the people on Instagram who got the reverse chronological feed started using TikTok more. If you think about it, the reason the platforms are using these engagement-driven algorithms is that they’re trying to figure out what’s going to keep people on the platform.
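To make the distinction Tucker is drawing concrete, here is a minimal sketch of the two feed-ordering approaches. Everything in it is hypothetical: the `Post` fields, the `engagement_score` weights and the function names are illustrative assumptions, not Meta’s actual ranking system. An engagement-based feed orders posts by a predicted-engagement score, while a reverse chronological feed simply shows the newest posts first.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class Post:
    author: str
    created_at: datetime
    likes: int
    comments: int
    reshares: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: reshared ("viral") content counts most,
    # echoing the study's focus on virality via reshares.
    return post.likes + 4 * post.comments + 10 * post.reshares

def engagement_feed(posts: List[Post]) -> List[Post]:
    # Engagement-based ranking: surface whatever scores highest,
    # i.e., whatever is most likely to keep people on the platform.
    return sorted(posts, key=engagement_score, reverse=True)

def chronological_feed(posts: List[Post]) -> List[Post]:
    # Reverse chronological feed: newest first, no ranking model at all.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

if __name__ == "__main__":
    now = datetime.now()
    posts = [
        Post("alice", now - timedelta(hours=5), likes=900, comments=120, reshares=300),
        Post("bob",   now - timedelta(hours=1), likes=12,  comments=2,   reshares=0),
        Post("carol", now - timedelta(hours=3), likes=150, comments=40,  reshares=25),
    ]
    print([p.author for p in engagement_feed(posts)])     # ['alice', 'carol', 'bob']
    print([p.author for p in chronological_feed(posts)])  # ['bob', 'carol', 'alice']
```

The same three posts come out in different orders: the engagement feed promotes the heavily reshared post regardless of age, while the chronological feed ignores engagement entirely.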
Jamali: Why do you think it is that, if the studies are correct, changing the algorithm didn’t make more of a difference in terms of serving up this kind of polarizing content?
Tucker: In this reverse chronological feed experiment, we randomly assigned some people to stay in the normal feed and some people to get the reverse chronological feed, and we did this for a three-month period. But we have to caveat that. It’s possible that if we had done this for two years instead of three months, we might have found a different answer. It’s also possible we would have found something different if we had run this at a different period of time, not during the heart of the 2020 U.S. election, when people were getting lots of information about the election from TV, from radio, from their friends and family, not to mention from lots of other platforms. People don’t tend to live on only one social media platform, and in these experiments we only altered what was going on on one particular platform. The other thing we need to be super careful about here is to realize what this doesn’t tell us: It cannot tell us what the world would look like without social media. Facebook had been around for many years before the study. So while three months was a long period of time to change people’s experiences on the platform, it’s not long enough for us to definitively say what the impact of these platforms writ large is. But what I think it does tell us is that there are no simple solutions out there to complex problems. Like, if we wanted to say, “Could we just change the algorithmic feed for a couple of months before an election to dial down tension in the country?” It doesn’t seem like that would work. Or in this case, it doesn’t seem like it did work.
Jamali: So we’ve seen how online polarization can translate to real-world actions. January 6 is the most prominent example, of course, but it’s hardly the only one. What sorts of guardrails do you, as somebody who has been following this closely, think all platforms should have in place to avoid incubating this type of organizing?
Tucker: I think what I’ve learned from this is the absolute importance of trying to do this kind of research: to really dig into the implications and effects of these different features, and into what’s going on on these particular platforms. I mean, we’ve learned a ton from these studies. But this was one company with two platforms, Facebook and Instagram, at one moment in time, the 2020 election, in one country, the United States. And we’ve already learned a bunch of things that surprised us, and we’ve seen some of the complexity and nuance involved. So I would love for one of the takeaways from this project to be that other platforms look at this and say, “Hey, this is research we could do to try to inform the public about what’s going on with our platforms.” Or, alternatively, that the public looks at this and says, “Hey, it’s great that we now know some of these things about what happened in 2020 on Facebook, but we want to know what’s going to happen in 2024,” or people in other countries want to know what’s going on, or we should know what’s going on on TikTok or on YouTube as well. So that’s what I’ve taken away from this: the importance of continuing this type of research and being able to answer these types of questions in a multitude of different contexts. This, hopefully, will not turn out to have been a one-shot deal. It’s great that we know this about what happened in the U.S. in 2020, but there are still so many unanswered questions.
Jamali: A cynical person might hear you say that and think, “Is the conclusion here, well, not much we can do?” How do you respond to somebody who might think that?
Tucker: Yeah, I mean, these particular papers that have come out right now (and there are going to be more coming out from the project; there are still more in progress) are designed to increase our scientific understanding and to get at concepts like virality and the algorithms, and at questions like echo chambers. What we have done, I think, is increase the knowledge base. We have more information that we can compare with what we’ve learned from other studies. We’ve shown a path forward for how we can get more information in the future. And I do think what we’ve learned is that, so far at least, there do not seem to be any simple answers to these very complex questions.
Jamali: When you look at your work, and also the totality of these many studies (I think it’s 17), do you feel like this research owes the public a silver bullet answer, given the events of the last few years?
Tucker: I mean, I’m with you. It would have been great if we had found some silver bullet, if we came out of this and said, “Do X, and that’ll fix everything.” I think the best thing we can do for the public is get policymakers the highest-quality knowledge about what is happening on these platforms and what its implications are, so that they can make policy decisions that are grounded in fact and understanding. The danger is that you go for policies that sound like they’re going to be really beneficial, but they end up having all sorts of unintended consequences. And that’s what we’re trying to understand here.
During their conversation, Marketplace’s Lily Jamali also asked Joshua Tucker about the integrity of the conclusions, given that academics like him collaborated with Meta, the biggest social media platform owner. He said the company gave them full autonomy over what was published. The company’s legal department did review the papers to make sure they didn’t violate legal obligations linked to protecting user privacy. So far, Tucker said, Meta hasn’t flagged any violations.
Meta, for its part, said the findings will “be hugely valuable to us, and we hope they will also help policymakers as they shape the rules of the road for the internet — for the benefit of our democracy, and society as a whole.”