Internet freedom takes a hit during global elections, report finds
Nov 4, 2024

Allie Funk, who leads the technology and democracy initiative at the nonprofit Freedom House, says many governments suppressed online speech and information while holding or preparing for elections.

In case you forgot, we’ve got Election Day tomorrow. But it was also a big year for elections in the rest of the world. About half of the global population is voting in national elections in 2024, and in many countries people have encountered internet shutdowns, blocked websites or manipulated content online, according to a recent report from the nonprofit Freedom House.

Allie Funk leads Freedom House’s technology and democracy initiative, and she told Marketplace’s Meghan McCarty Carino this is the 14th consecutive year the report has documented a decline in human rights online. The following is an edited transcript of their conversation:

Allie Funk: We looked at three different ways that governments are trying to control the information space around an election. The first is censoring online information: blocking independent news websites, shutting down social media platforms. A concerning example is from Pakistan, where, during rallies by a political party that was trying to come back to power, the government, which the military has a lot of control over and which didn’t like this party, shut off the internet altogether to try to limit the party’s ability to reach supporters. The second category we looked at was the spread of false and misleading information, so coordinated disinformation campaigns. In so many of the countries we looked at, these campaigns really tried to go after the integrity of the election itself, to sow doubt that it was a free and fair vote. And the last category of issues we raised is governments actually attacking and harassing independent fact-checkers and researchers who are trying to study the information space and raise awareness about false and misleading information.

Meghan McCarty Carino: How did the U.S. fare in your analysis?

Funk: So the United States is free. It ranks ninth out of the 72 countries we assess. So it’s a free online environment: there isn’t a lot of technical censorship, and people’s free expression is largely protected. But we highlight some really concerning trends, particularly around the ways in which false and misleading information has undermined people’s trust in the democratic system here and has driven so much online harassment against election officials, the people working to keep our elections the strong, free and fair votes that they are. We also highlighted some concerns around a new law that forces ByteDance, TikTok’s parent company, to divest the platform’s U.S. operations. We do raise concerns about whether this law will lead to the restriction of the platform for the many Americans who use it.

McCarty Carino: Your report also highlights some positive developments in places like South Africa, the European Union and Taiwan, in how they either prepared for or held elections. You called them promising models. Can you explain what they did right?

Funk: So we want to give people insight into how to actually protect the information space. There’s a lot of just bad content out there, and in so many countries governments respond to that by using censorship or other disproportionate means. But in Taiwan, in South Africa and throughout the European Union, we found some really innovative models that protected internet freedom while still grappling with some of this terrible content and the ways in which the internet can harm people’s lives. Taiwan is a great example, because the country has a really creative fact-checking process. There’s an independent platform, Cofacts, that is run by volunteer community members: some journalists who do this professionally, but also just your average mom and pop who might want to get involved. And because the fact-checking system is so inclusive in who is involved, it’s allowed the public to really trust the group’s findings. Another interesting thing the government did is pass a new law that allows political candidates to request that social media companies remove deepfakes. So it’s a really proportionate measure to tackle some of the concerns around generative AI.

The last thing I’ll mention: in South Africa, the election commission worked really closely with a local civil society group to review pieces of content that voters flagged as false, misleading or harassing. They would review that content, and if it met a very high standard, the election commission could then send it to platforms to see if it should be removed under the platform’s terms of service. So the election commission didn’t order the content to be removed, but worked really closely with this independent group that had deep expertise in the issue. That allowed for more oversight in dealing with content, because you often just don’t want a government making these decisions on its own.

McCarty Carino: When it comes to the EU, it has this sweeping Digital Services Act (DSA) that is now in force, which regulates content moderation on major platforms. How has that played into what you cover?

Funk: So the EU is such a fascinating case; it’s so unique because it has deep regulatory expertise and, because it comprises so many different countries, a unique power to regulate and compel platforms to act, which you don’t often see in other countries where platforms may be less focused. And the thing we like about the Digital Services Act is that it really is focused on process, transparency and risk mitigation: trying to get platforms to be more transparent about when content is removed under terms of service, giving information to civil society to do research on what’s happening online, resourcing internal teams. So it’s focused a lot more on process and structure rather than just trying to get content removed, and we think that’s a better approach. When you look at it, the law does have content removal powers within it. It allows member states to compel companies to remove speech that is illegal under national law, which we have raised some concerns about, because you can imagine how, in certain countries like Hungary, which is a backsliding democracy, that could be used to restrict political speech. But overall, we do think the EU model is promising, because it leans more into giving people agency to design their own online experiences and to understand what’s happening online. We think, or at least hope, we’re going to learn a lot about how the DSA is implemented now and over the next few years, and that might lead to some lessons learned here in the U.S.

McCarty Carino: What are some of the high-level remedies that you suggest to curb internet censorship?

Funk: So first and foremost, we are calling on governments not to block platforms, even when there’s really insidious and egregious content on them. We just don’t think that’s a proportionate response to that content, because of its disproportionate impact on people and their lives. Instead, we’re really calling for laws like the Digital Services Act, where the focus is on giving people more transparency and more choice over the platforms they have access to, so they can choose which ones they want. We also want companies to be more transparent. I think a huge part of this puzzle is improving companies’ own standards. And over the past year or two, we have companies that have actually rolled back transparency mechanisms. Meta shut down CrowdTangle, such an important tool for researchers to study what’s happening on Instagram and Facebook. X put its API behind a huge paywall and has sued civil society groups that are studying what’s happening on the platform. So we’re really calling on companies to stop that, to reintroduce those transparency mechanisms and to really support local civil society groups that are doing this work.


The team

Daisy Palacios, senior producer
Daniel Shin, producer
Jesús Alvarado, associate producer
Rosie Hughes, assistant producer