Court upholds block of California law aimed at protecting kids online
Aug 27, 2024

Part of the law likely violates the First Amendment, the court said. But the ruling shows a way forward, says Aaron Mackey of the Electronic Frontier Foundation.

The California Age-Appropriate Design Code Act, passed in 2022, would be among the most sweeping pieces of legislation to protect kids from online harms — if it hadn’t become tangled up in court.

The law has two basic requirements: first, that tech companies analyze and report on whether their products are harmful to children; second, that they minimize how much data they collect from those under 18.

Earlier this month, a federal appeals court found that first part likely violates the First Amendment and upheld a lower-court decision blocking that part of the law. But it vacated an injunction on the second component, the part dealing with data privacy.

The decision could point a way forward for similar laws, many of which have also run into legal challenges, said Aaron Mackey, free speech and transparency litigation director at the Electronic Frontier Foundation. The following is an edited transcript of his conversation with Marketplace’s Meghan McCarty Carino.

Aaron Mackey: This was a good win for people who are concerned about online speech and everyone's ability to speak online as well as to access news, information, community groups, those sorts of things online. But it's also really good news for people who care about consumer privacy. With one notable exception, the Ninth Circuit did a great job of focusing in on the problem with the law, this requirement that services inventory their content and then make choices about it, and recognized that that was inconsistent with the First Amendment, and basically said that that aspect of the law can't be enforced. But while doing so, they really set out a path for how lawmakers across the country, not just in California, can draft laws that are comprehensive and give consumer privacy to everyone — in this case, it was consumer privacy for kids — and showed that there's a path to do that that is fully consistent with the First Amendment and is not going to run into those traps.

Meghan McCarty Carino: So this judgment sort of knocked down some parts of the law while preserving some parts of the law. Can you break down the bifurcation there?

Mackey: So what the court said is that the data protection impact assessment (that was the inventory and reporting requirement on material that might be harmful to minors), that part of the law was likely to violate the First Amendment. But then what the court said is, "We're not going to opine on the rest of the law. There are other parts of this law, again, these data privacy provisions, a provision that requires age verification or age estimation, and we're going to send those back to the District Court to sort out." So they basically said they didn't have enough information from the parties or the District Court to actually rule on those provisions, but said that at this preliminary stage, we can say this one provision is problematic. And so it sent the case back to the District Court to do more work, while recognizing that some aspects of it, particularly the privacy provisions, are unlikely to raise the same First Amendment concerns that the impact assessment provisions did.

McCarty Carino: So there’s going to be some more legal wrangling over some of those issues. But what does this mean for the law moving forward?

Mackey: I would say that what it means is that the core of the law is going to remain unenforced. So the preliminary injunction remains in effect with regard to the requirement that online services inventory their content and then make choices about certain content and whether or not they can show that content to minors. So news websites won't have to decide whether reporting on everything from, say, COVID to war to other topics like addiction might be harmful to minors, and they won't have to make choices about whether to show that content to certain users. And the same thing is true for online services like Facebook or other social media services that host user-generated content; they won't have to make that choice. But the other provisions of the law, in theory, can be enforced while the District Court goes back and tries to figure out whether or not they raise any legal concerns that would require them to be blocked.

McCarty Carino: So the California law is among the most sweeping, but there are plenty of other examples — more than a dozen other states that have passed or are looking into passing laws attempting to regulate the internet in some way in order to keep kids safe. And many of those laws have also run into legal trouble. Do you see some recurring themes here?

Mackey: Yeah, the recurring theme is that for probably the past two to three years, lawmakers have been trying to combine content restrictions and content-blocking laws with privacy legislation that is directed at children. And when a law restricts content, by and large the courts have ruled that it's going to violate the First Amendment, and they've blocked it. And so one of the things that EFF has said, both in courts and to lawmakers, is: Why don't we try privacy? Why don't we try to make sure that we have comprehensive consumer data privacy for the residents of your state? And we can do that in a way that provides privacy protections to everyone, not just children, and we can do that in a way that's consistent with the First Amendment.

But lawmakers have so far resisted that, and instead say, "Well, what we're going to do is try to block children from accessing certain content that amorphously or vaguely is harmful to them, and then we're going to provide some privacy protections in the meantime." And the courts have largely said, "Stop trying to pass these laws that block content." And the Ninth Circuit has most recently said, "Go ahead and try to run with consumer data privacy laws."

McCarty Carino: Is there anything unique or just additive in this Ninth Circuit judgment that has implications for the future of other similar laws?

Mackey: Yeah, some of the consumer privacy provisions, again, they're directed towards children, but they embody larger principles, like data minimization. There are also some good things in the law that say, for kids, you actually can't collect their geolocation information without giving affirmative notice and getting opt-in consent. And you're supposed to tell them the moment you're collecting it: "Hey, we're collecting this precise information. If you would not like us to, let us know and we'll stop." And again, those provisions didn't raise any issues. So it comes down to these fundamental ideas: before you collect information, you need to ask for it, you need to explain why you're collecting it, and then you need to limit what you do with that information downstream to delivering the service, the reason the person gave you the information in the first place.

McCarty Carino: So big picture, how would you kind of describe the landscape for these legislative efforts to regulate the internet for kids?

Mackey: Lawmakers are consistently running into a wall, which is that they're trying, either directly or indirectly, to regulate the content that is available online for both children and adults, and they keep hitting the First Amendment wall. And so what we have been saying is, "Why don't you try a different route, one that doesn't have the same brick wall that you're going to run into, and pass consumer data privacy laws?" And indeed, we filed a brief in the California case that laid this out for the Ninth Circuit, explaining that parts of this law are problematic but that these data privacy provisions can be implemented consistent with the First Amendment.

So we've been really trying to push lawmakers to enact data privacy protections, because we think not only are they consistent with the First Amendment, they'll actually be implemented and they'll actually help people, both kids and adults. They would change the fundamental problem with so much of what we see online, which is unconsented data collection that online services then use to turn around and either target content at us that many lawmakers feel is harmful or target ads at us. And so a privacy law that comprehensively changes that and gives people meaningful rights and protections would be a game changer, and also wouldn't run into the brick wall that lawmakers continue to hit.

More on this

As we noted, these same issues are coming up in many different states, and potentially at the federal level. The Senate has already passed the Kids Online Safety Act, which would require tech companies to take steps to prevent and mitigate harms to children and default to higher privacy settings. Its fate is far from assured in the House, as it’s faced opposition from multiple constituencies, including those who say it conflicts with the First Amendment.

And we could get some further clarification on related issues when the U.S. Supreme Court hears a challenge to a Texas law that requires age verification for porn sites. The high court has agreed to take up the case in its next term, which starts in October.

Earlier this year, we spoke with Nicol Turner Lee at the Brookings Institution about the prospect of federal data privacy legislation for kids. She said it would probably make a lot more sense to start with some standards for all consumers, seeing as there currently aren’t any at the national level.


The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer
Rosie Hughes, Assistant Producer