How disinformation on YouTube gets into your “watch next” queue
Feb 7, 2020


The video platform has resisted policing content, but now advertisers are clamoring for change, too.

An in-depth report last month looked at climate disinformation online and found that YouTube was spreading climate denial videos through its recommendation, or "watch next," algorithms.

“The science is not settled. The debate is not over. The climate is changing — it always has, it always will,” one such video says. Many of these videos are slick and professionally produced, and they can seem quite credible. Not only are these videos popping up at the top of the recommended queues of YouTube’s billions of users, but they’re often wrapped in regular ads from big-name companies that are unwittingly funding this disinformation.

I spoke with David Roberts of Vox about the problems YouTube faces in deciding how to police these videos. The following is an edited transcript of our conversation.

David Roberts: It’s a really difficult dilemma YouTube is in. It’s entirely possible that YouTube will submit to this pressure and go remove these particular videos. But once it opens the door to saying, “If you identify something incorrect on our site, we’ll take it down” — there are billions of videos floating out there, and YouTube doesn’t want to get pulled into a cycle where it’s devoting more and more resources to policing what is and isn’t true and what is and isn’t OK and decent. It’s really hard to say how it’s all going to shake out.

Jack Stewart: How much control does YouTube have over these videos and over the ads that appear next to them?

Roberts: In one sense, if you take a step back, obviously, it has all the control in the world. It’s a private company, it’s a private business model, and it can do whatever it wants. So the real questions are: How much work does it want to put in, what kind of stands does it want to take, and what controversies does it want to get involved in? It absolutely could go through videos that are about climate change and individually assess them. But there’s no way for an algorithm to tell whether someone is talking about CO2 sensibly. It’s a manual thing. That’s part of, I think, why YouTube has resisted getting pulled into these controversies, because there’s no way around just some poor human being going and watching them all and sorting out which ones are true and which are false.

Stewart: Is this problem exclusive to climate change misinformation, or have we seen it, say, with vaccines and anti-vax videos?

Roberts: You’ll find the most misinformation around topics that are the most heated and controversial and closest to the center of America’s ongoing culture war. I would even say, in terms of volume, climate misinformation is probably a relatively low percentage. There’s probably a lot more about vaccines, which people care about, or about the wall on the border or how many votes Trump got — all these things, which are much more current and heated. This is an absolutely ubiquitous, systemic problem for YouTube. It’s not some unique problem just facing climate videos.

Stewart: What about the companies that are having their ads placed next to some of these videos that may not align with their corporate beliefs? Why aren’t they clamoring for something to be done?

Roberts: They are clamoring a little bit. It was absolutely news to them that their ads were appearing in these videos, and they are now clamoring for YouTube to take their ads out of these particular videos. But this just runs into the same problem, which is these two contrasting incentives: The companies want their ads to play as widely as possible, but they don’t want to get in the business of individually approving each video their ad goes into. That’s just an endless swamp of work, and the same goes for YouTube. There’s got to be some balance here between the algorithmic approach — just allowing the algorithm to do it — and occasionally using human judgment in particularly egregious cases. No one knows where that line is. If I really wanted to put my mind to it, I’m sure I could take a Johnson & Johnson ad and track down everywhere it has appeared on YouTube, and I could find a lot more than climate misinformation. I’m sure I could find plenty of other offensive or unpleasant videos it’s appeared in, because the algorithm is just throwing it everywhere. Johnson & Johnson, I’m sure, would object to a lot of those, but what’s the final answer here? There’s no real final answer. There’s just this push and pull between the automatic algorithms and someone using human judgment.

Climate activists protest the hosting of climate denial videos outside of YouTube’s London office in October. (Paul Ellis/AFP via Getty Images)

Related links: More insight from Jack Stewart

David Roberts went into some detail about why we all seem so susceptible to mis- and disinformation, no matter how smart we think we are. He went deep into the psychology of fast and slow thinking, or gut reactions versus reasoning. He also wrote about the difficult position that social media platforms find themselves in — they don’t want to be perceived as biased, and they don’t want to be the gatekeepers deciding what is information and what is falsehood. 

The original report on climate disinformation came from Avaaz, a group that describes itself as a global web movement to empower people to take action on pressing issues.

A letter from U.S. Rep. Kathy Castor, a Democrat from Florida, calls on YouTube to stop including climate misinformation and disinformation in its recommendation algorithms. She is chair of the U.S. House Select Committee on the Climate Crisis, and she’s asking the platform to label these videos as “borderline content,” something YouTube has done before with anti-vaccine videos, which stops them from being monetized with ads from companies that don’t want to be associated with them. The letter asks for a reply from Sundar Pichai, CEO of Alphabet and Google (YouTube’s owner), by Friday, so we’ll see.

The team

Molly Wood, Host
Stephanie Hughes, Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer