Illicit, sexually explicit deepfakes are becoming a problem in schools
Oct 29, 2024


Elizabeth Laird of the Center for Democracy and Technology explains how parents, teachers and students are dealing with the spread of AI-generated abusive images.

We know from various studies that young people are, unsurprisingly, using generative AI tools like chatbots and image generators, sometimes for homework, sometimes for fun and sometimes for malicious purposes.

A recent survey from the Center for Democracy and Technology found AI is being used among high school students to create nonconsensual, illicit imagery — in other words, sexually explicit deepfakes.

To learn more, Marketplace’s Meghan McCarty Carino spoke with Elizabeth Laird, director of equity in civic technology at CDT. The following is an edited transcript of their conversation.

Elizabeth Laird: Thirty-nine percent of the students who we surveyed — and our survey is nationally representative — said that they have heard of nonconsensual intimate imagery, or NCII, that depicts someone associated with their school. That includes both authentic and deepfake NCII, so it could fall into either of those categories. When it comes specifically to deepfake NCII, 15% of students who we surveyed said that they have heard of this happening. To put that in terms of real numbers and real students who are affected, that's 2.3 million high school students. This is a significant issue confronting K-12 schools.

Meghan McCarty Carino: And when we’re talking about people associated with their school that are being depicted, was it largely other students? Was it teachers?

Laird: That’s a great question, and one we definitely wanted to answer. We did find that students are the most likely people to be sharing NCII, and students are also the most likely to be depicted. So when you’re thinking about this issue and how it plays out in K-12 schools, it really is students doing it to each other.

McCarty Carino: And what did the students that you spoke to say about the fact that these kinds of deepfakes may be out there in their school?

Laird: So we did find that 51% of students who we surveyed say that female students are significantly more likely to be depicted than male students, while only 14% say males are more likely to be depicted. So this is an issue that disproportionately affects girls. They are more likely to be depicted in this imagery and therefore more likely to suffer all of the negative consequences associated with any type of sexual violence or harm.

McCarty Carino: To what extent were the parents or teachers that you spoke to aware of where these images were coming from?

Laird: So we found that students are the most plugged in to when this is happening. Most of the students who we surveyed said that they do not believe their school does a good job of catching students who commit this. However, we did find that teachers were the second-most-knowledgeable group, in that they may not know as much as students, but they still know a lot, and they also reported that this was a significant issue in schools. Now, parents were very interesting, because when we asked them who they think is most responsible for addressing the sharing of nonconsensual, intimate images in schools, they said that they are. They said, we are the people who should be talking to our children about it. However, they are easily the most out of the loop, not being sure if their school has a policy about it and not having heard of this happening in their school. So you have this dynamic where parents really are primed and want to help educate their children, but they are out of the loop about whether this is happening in the first place and about what their school is doing to prevent it.

McCarty Carino: Did it seem like the schools here had any kind of existing policies in place to deal with this problem?

Laird: Schools have, and they’ve had for a very long time, an existing legal obligation to create a learning environment that is free from sexual harassment. And so we wanted to know, what are schools doing when it comes to three things? One, preventing students from doing this in the first place. Two, when it happens, how are they penalizing students? And three, when it happens, how are they supporting the victims? And so of those three dimensions, we found that the majority of school actions are happening on punishing the person who did it. And the most common way that they are punishing the person who did it is that they are referring that student to law enforcement. Where we found the biggest deficiencies were in school efforts to prevent it from happening in the first place and to support victims. And just to give you another number, only 5% of teachers who we surveyed said that their school has provided resources to victims to get this content removed from social media.

And so maybe you’re thinking, well, this is a new problem; this technology has only been around for a year and a half, two years. What I would say to that is, if there is a silver lining in this story, it is that because authentic [nonconsensual, intimate] images have been a significant issue for a long time, particularly when we’re talking about minors, there are existing resources that can be used to connect victims with help and get this content removed from social media. So when thinking about what schools should be doing, the two biggest gaps they should focus on are, one, preventing this from happening in the first place. That could look like updating their sexual harassment policies, telling students how they’ll be disciplined if they were to take this action, and including this in their digital citizenship curriculum: what does it mean to be a responsible user of the internet? And two, if this does happen, connecting victims with existing resources and support to help them navigate it and try to minimize some of the negative effects of this action by students.

McCarty Carino: And it sounds like parents were kind of the most out of the loop in your research. I mean, what would you say to parents of school-aged children with this kind of thing looming out there?

Laird: So I would say a couple of things to parents. I would say, one, the vast majority of students who we surveyed said that they did not know that this could lead to criminal and civil penalties. So I think a really important step would be for parents to talk to their children and make sure that they understand — because the school isn’t doing it right now — the severity of being involved in something like this. And then the second thing I would say to parents is to ask their school, you know, what are you doing to prevent this from happening in the first place? What are the consequences if my child does this? What resources are available if my child is the victim?

More on this

In August, a new Title IX rule took effect nationally that makes clear that sexually explicit deepfakes are considered sexual harassment. The Department of Education calls for schools to update their policies and offer training for staff and students.

Currently, 30 states have laws regulating sexual deepfakes, but when such images are discovered in schools, there’s a ton of variability in the legal ramifications, according to reporting in Politico.

“Who gets disciplined, how minors are treated and who is responsible for taking images to the police — varies widely depending on the state,” writes Politico’s Dana Nickel, who explains that variability can result in arrests or nothing happening at all when someone steps forward to report illicit images in schools.

As far as federal legislation to protect against deepfakes goes, well, Congress is still working on it.


The team

Daisy Palacios Senior Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer
Rosie Hughes Assistant Producer