As war rages in Ukraine, a battle rages against Russian disinformation
Mar 2, 2022

Social media platforms are unprepared for it, said Courtney Radsch of the University of California, Los Angeles. "I think the content moderation policies are being created on the fly yet again."

Even though the physical fighting is in Ukraine, the battlefields of the global information war are social media platforms, news sites and even app stores.

Since Russia invaded Ukraine, social media users have been sharing images and stories of the conflict. Mixed in with the feed are coordinated disinformation and propaganda campaigns.

This weekend, Facebook and Twitter announced that they had removed Russian disinformation networks, which had set up accounts across multiple social media platforms.

Courtney Radsch is a fellow at the Institute for Technology, Law and Policy at the University of California, Los Angeles, and a senior fellow at the Center for International Governance Innovation.

Courtney Radsch: We have seen Russia perform amazing feats of disinformation in terms of creating new accounts, networks made up of both bots and actual supporters, the creation of narratives that really promote Russian superiority and diminish Ukraine and its president’s ability to defend the country, and lots of videos, including the use of humor to perpetuate pro-Russian narratives.

Kimberly Adams: Can you give an example?

Radsch: Sure. There are, you know, funny little cat videos where you have a cat and a dog fighting. One is Russia, one is the U.S. And you have videos that purport to be from Ukraine but are actually from a whole different time period or a different country. We’ve seen this same type of disinformation used in the Afghanistan and Syrian wars as well.

Courtney Radsch (Rabih Chamas)

Adams: On which platforms is this misinformation and disinformation the most prevalent?

Radsch: I think the better question would be on which platforms is it not prevalent? I mean, really, Russia has conducted an all-out information offensive, and it’s just blanketing Facebook, Twitter, TikTok, Google with its disinformation and with its propaganda. But let’s remember, not everything is disinformation. Some of it is, you know, Russia’s very genuine belief that it is in the right from its perspective. And so, you know, what you see is Russian propaganda spreading across social media networks, and also demands that companies like Apple’s App Store or Netflix carry Russian propaganda stations and Russian media outlets, which these companies are refusing to do at this point.

Adams: What role are sort of casual internet users playing in this misinformation campaign?

Radsch: Part of social media use and internet use is showing what side you support and getting involved in, you know, kind of these global events, where you can become part of it by participating on social media. And so we see that, in many cases, people who want to, say, show solidarity with Ukraine are retweeting or sharing information that is, say, pro-Ukrainian or anti-Russian, but maybe it’s not actually accurate. Because the fact is, you can’t expect people to go independently verify every video. There’s an entire industry that has evolved — Bellingcat, Amnesty International’s tech program, all of these entities that are designed to forensically investigate whether videos and images accurately represent what they purport to be. Casual users are not going to do that. The other thing is, they’re not going to fact-check everything. And you can’t expect that a person is going to fact-check something before they tweet it. And I think in a lot of cases, people just want to be on the right side, they want to show their support and solidarity, and so they’re going to, you know, engage with that messaging. And in fact, research has shown that it doesn’t necessarily matter if something is true or not. People want to engage and show their support for one side or the other, and so they’re going to engage more with that content.

Adams: How is this campaign affecting the lives of people on the ground in Ukraine?

Radsch: While they are very resilient, they’re also part of this information warfare because they are posting their own content on social media. They’re showing Russian military vehicles needing to be towed away or running out of gas. They are showing their, you know, incredible bravery and standing up against one of the world’s leading militaries. And I think that’s why it’s really important because you can’t just cut off Facebook and Twitter and TikTok from Russia completely. What you want to do is to try to rebalance the power between a very wealthy and sophisticated Russian state operation and the ability of individuals in Ukraine on the ground to counteract that information with actual reporting and perspectives from their experience.

Adams: When you have an actual government that believes it’s in the right coordinating a campaign like this, how does that line up with the tech companies’ content moderation policies?

Radsch: Well, first off, I think the content moderation policies are being created on the fly yet again as they face the fact that tech firms and especially social media platforms are integral to how war is conducted. You know, how should countries like Russia, like Iran, like China be allowed to use open Western social media platforms to perpetuate their propaganda? That is a question they have not adequately grappled with, despite many examples of when they should have. We saw it, Russia’s invasion, coming. Putin had been talking about it for several weeks leading up to it. And I think that what we’re going to see after this conflict is that they’re going to have to develop more proactive policies around content moderation for situations like this.

Adams: Ukraine’s official Twitter account has been communicating with the rest of the world about how to help its effort, even asking for cryptocurrency. What happens to those types of official accounts if Russia does succeed in its efforts to potentially change the government in Ukraine?

Radsch: That is a great question, Kimberly. And that is one I have been asking the companies since Afghanistan, when we saw the Taliban takeover there: What happens to those accounts when a nondemocratic transition of power happens? And it seems like such a minor thing when we’re talking about war and death and destruction, but it’s not, because we know how central propaganda and communication are to war efforts. And the fact is, we don’t know. And that’s what I’m talking about when I say that they need to have policies in place before situations like this. And I’m quite shocked that despite having been in contact with Twitter, Facebook, TikTok and Google over the weekend and asking this very specific question, “What will happen to Ukrainian official accounts if Russia is successful?”, none of them responded. None of them have a policy. None of them have a published policy. They’re really only prepared for peaceful transitions of power in the United States, where they actually have a policy in place. Why don’t they have an approach after Myanmar, after Afghanistan? This is not a new situation, unfortunately, and yet we see once again that tech companies are winging it.

Related links: More insight from Kimberly Adams

To Radsch’s last point there, we reached out to the companies she mentioned: Twitter, Facebook, Google and TikTok.

At the time of this taping, only Facebook responded, linking to its Community Standards Enforcement Report, released Tuesday. In it, the company announced the rollout of additional safety features to keep users in Ukraine and Russia from being targets online.

Facebook also reiterated that while it’s not a government entity, it is working with governments “and responding to their requests to combat disinformation, and harmful propaganda.”

We’ll also link to a tweet from the encrypted messaging app Signal reassuring Eastern European users that the platform is still up and running and that, no, Signal has not been hacked. The company said rumors of the app being hacked could be “part of a coordinated misinformation campaign meant to encourage people to use less secure alternatives.”

To help avoid getting caught up in a misinformation campaign, the Media Manipulation Casebook website has a helpful factsheet about Russia and Ukraine, pointing out key details about the conflict and what led up to it.

