California AG: Without federal law, kids’ online safety starts with the states
Jun 11, 2024

Rob Bonta is helping lead the effort to force social media companies to increase parental controls and protect underage users.

Back in January, a U.S. Senate committee probed executives from Meta, TikTok, X, Snap and Discord about social media’s effect on kids. During a heated exchange with Missouri Sen. Josh Hawley, Meta CEO Mark Zuckerberg stood, turned and apologized to families of victims who were sexually exploited on social media platforms.

No federal legislation on the issue has become law, but some states are taking the lead. New York just passed two laws aimed at regulating social media, and California Attorney General Rob Bonta is pushing similar legislation in his state.

Marketplace’s Lily Jamali sat down with Bonta in his office in Oakland, California, and asked him about the Protecting Our Kids from Social Media Addiction Act.

The following is an edited transcript of their conversation.

Rob Bonta: It starts with certain defaults that are changeable, and so parents can choose what they want. One of the defaults set by the bill is having a chronological or organic feed instead of an algorithmic feed. It can be changed to an algorithmic feed if that’s what you want, and a parent could go on and say, “I don’t want any of this stuff, I want it exactly the way Instagram or Meta or TikTok wants it.” People should be able to choose.

Lily Jamali: The heads of those companies that you mentioned, some of them have said that they already offer these choices to parents; parents just have to figure out how to use them. And then parents say that it’s actually really hard to figure out how to set these settings. Does this law address that?

Bonta: They need to set them as a default. They’re not the default now; that’s absolutely true. Whether some tech genius can figure it out, or whether they make it easy, or even possible, is another story entirely. But we don’t think any of these defaults are offered, nor are the choices available to parents.

Jamali: So, when you hear Mark Zuckerberg say that Meta already includes some parental controls on these apps, you say that’s just not true?

Bonta: No, I think there are some parental controls on the apps, but they’re not the ones that we’re providing for here. Just because you do something and describe it as parental controls doesn’t mean you’re doing the things that this bill requires. And so, we’re very specific about it: getting rid of the algorithmic feed as a default and having a chronological feed, that specifically. Not just any parental controls, not something where you can check the box and say you have parental controls. So, we have very specific requirements in the bill.

Jamali: So what’s the update on this bill right now?

Bonta: It’s got bipartisan support, and it’s moving with a ton of momentum through the legislature. We expect it to go through, just like our Age-Appropriate Design Code Act did with bipartisan support. This is bipartisan, I want to be very clear about that. It’s not just a talking point. It is rare to see things that are bipartisan these days, unfortunately, and there’s a great deal of concern across parties about our children’s health.

Jamali: The Age-Appropriate Design Code Act, which you mentioned, had bipartisan support, the governor signed it, and it is now stuck in the courts. It was supposed to take effect in just a couple of weeks.

Bonta: Yes, the trial court struck it down for now, wrongly in our view. We respect the decision, but we disagree with it. And that’s why we’re appealing it. And we think that we will prevail on appeal.

Jamali: So, tell me about this other law that’s part of this pair of social media bills that you’re pushing.

Bonta: AB 1949 is authored by Assemblymember Buffy Wicks, and we’re the sponsor. It makes sure that for any child, meaning anyone under the age of 18, tech companies can’t collect, use or sell their data. It’s simple and clean. We just don’t think you should use, collect or sell the data of children without explicit consent.

Jamali: Where do you think both laws go from here? Let’s say they sail through the legislature, Gov. Gavin Newsom signs them. Do they end up in the courts just like the California Age-Appropriate Design Code Act?

Bonta: I think so. I think it would be naive to think otherwise. I think they will go through the legislature if current and past indications are any guide to the future. And I’m hopeful that the governor will sign them. I know that he cares deeply about the health of our children and their experiences online. And the modus operandi up to this point has been to challenge them in court, specifically by the industry association NetChoice, which brought the lawsuit on the Age-Appropriate Design Code. So, I think we’ll be in court. I think you’ll see the arguments that you expect to see, First Amendment challenges, Section 230 challenges. We’re very aware of those laws, and we believe that these bills are specific and by design do not violate Section 230 or the First Amendment’s commercial speech protections. We are focusing on features and design, which the tech companies have created, not on any material or content or speech that they’re publishing.

Jamali: So, New York just passed two pretty similar bills. One would also restrict access to these algorithmically driven feeds, and the other one would shield personal data. Does California still need to proceed with the efforts that you’ve spearheaded in light of this New York law?

Bonta: Oh, yeah, for sure.

Jamali: How does it interface with what you’re doing here in California?

Bonta: It’s the same approach by design. We’ve worked together. We are in touch with the New York leaders, the attorney general, the legislators. I’ve been on panels with the authors of these bills. We are partners, and both New York and the state of California have a wonderful, in my view, race to the top.

Jamali: Was there ever a time when you thought that these social media companies would put the appropriate guardrails in place on their own?

Bonta: I didn’t know one way or the other. You always hope that corporate entities, without being sued or required by law, will do the right thing. They have the research to know that the mental health of children is harmed by the frequency and duration of use of their platforms. They’ve lied about it to the public, they’ve said otherwise, and they continue to use the features and design that harm children. That’s very disappointing. I would hope that once they at least had the knowledge internally, which they do, that they would change their practices, which they haven’t. So, because they haven’t, we are forcing them to with our lawsuit and with our laws. And so, we will always champion the health and safety of our children and protect them with everything we can.

Some of the behavior that we’ve seen is very unfortunate. We’ve seen things like plastic surgery filters, which we know create body image challenges. Meta’s staff recommended against using these filters because of the impact they had on young people and their mental health, and Mark Zuckerberg personally vetoed that recommendation. We’ve also had Mark Zuckerberg testify in front of Congress and say, “when someone under 13 is on our platform, we de-platform them,” when the data shows that millions of kids under 13 were on their platform. People shouldn’t be lied to. Children shouldn’t knowingly be hurt by platforms when there are ways the platform can both be incredibly, wildly financially successful and keep kids safe. This is common-sense stuff; every parent knows it and feels it. And we’re going to make sure it gets done if corporations aren’t going to do it on their own.

Jamali: And if you’re comfortable sharing a few thoughts on this, what has your experience been like with your kids? You have three kids, right?  

Bonta: Yeah, three. You know, during the COVID-19 pandemic, one of them had a lot of mental health issues, and a lot of it was because of social media. So, you know, this is not something that’s intellectual or academic for people. This is lived experience, real life; we don’t need the data to persuade us of what we already know from personal experience. But the data is affirming, knowing that it’s not just an outlier or a rare instance. So, in the absence of corporate responsibility, we’ve identified solutions, and we will press those, and we believe they will be successful, and we believe they’re lawful.

More on this

You may remember a recent episode on our show about the California Age-Appropriate Design Code Act. The code would require websites that children are likely to visit to provide privacy protections by default. We had NetChoice’s general counsel, Carl Szabo, on the show back in March for a friendly debate with Megan Iorio of the Electronic Privacy Information Center about their differences on the legislation.

Separately, we reached out to Meta for its response to what Bonta said in our interview. A Meta spokesperson sent us this response:  

“We want young people to have safe, positive experiences, which is why we’ve built safety and privacy directly into teen experiences. We’ve developed more than 50 tools and features to do this, including ways for parents to set time limits for their teens on our apps, automatically restricting teens under 16 from receiving DMs from people they don’t follow, sending notifications encouraging teens to take regular breaks, and offering chronological feeds on Instagram. We also know that teens move interchangeably between many websites and apps, and different laws in different states mean teens and their parents have inconsistent experiences online. That’s why we support federal legislation that requires app stores to get parents’ approval when teens under 16 download any app.”


The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer
Rosie Hughes, Assistant Producer