Meta has a problem with hosting predators on its platforms
Dec 14, 2023


Katherine Blunt of The Wall Street Journal has been investigating how pedophiles use Facebook and Instagram to access child sex content. It's been a challenge for Meta to assess the full scope of the issue and curtail the activity, she says.

Warning: This episode includes sensitive content about the sexualization of children.

For several months now, reporters at The Wall Street Journal have been looking at the algorithms that recommend content on Meta’s platforms, specifically Facebook and Instagram. They’ve found that those algorithms promote content sexualizing children, on a mass scale, to users who show a sexual interest in kids.

Meta argues that it uses sophisticated technology, hires child-safety experts and reports offending content in its efforts to root out predators.

But the problem persists, according to Wall Street Journal reporter Katherine Blunt. She told Marketplace’s Lily Jamali what she learned by setting up test accounts, including some that followed young influencers on Instagram. The following is an edited transcript of their conversation.

Katherine Blunt: It was really quite surprising how quickly it began recommending, specifically in Reels, the platform’s short-video feature, sexual content related to children as well as adults. And just to add another layer to this, Meta monetized this content by running ads for major brands. The upshot is that the platform detected that these test accounts behaved like other accounts that have a sexual interest in children, and so it began recommending content based on that very, very limited signal.

Lily Jamali: Yeah, Facebook’s algorithms, you write, recommended groups with names like “Little girls” and “Beautiful boys.” What are users in those groups doing? What are they talking about, based on your reporting?

Blunt: Right, so the issues within Meta extend beyond Instagram. In these groups, there’s overt discussion of content sexualizing children, of actions, other terrible things. It’s been a real challenge for Meta to, first of all, assess the full scope of this issue and then make changes that are lasting and meaningful in curtailing that activity within these communities.

Jamali: At one point, one of these research accounts, or burner accounts as you call them, that the Journal had set up for this investigation flagged several groups, including one named “Incest,” through the channels an ordinary user would use to report these problems to the company. When you went through those normal channels, Facebook said that the group didn’t go against its community standards.

Blunt: Right. Yeah, this has been a consistent challenge. A lot of aspects of the system are automated, and it’s been producing what I think the company would agree are erroneous determinations when users flag groups. I mean, that’s a terrible example. There are others similarly terrible; there are also others that maybe exist in a bit more of a gray area but would still be determined to go against the community standards. And yet the automated system doesn’t, for whatever reason, always detect the nature of what’s being discussed and exchanged.

Jamali: Yeah, but the company did respond quite quickly, it sounds like, when you revealed that you were investigating this.

Blunt: Yeah, they have the means to do that. Sometimes it takes multiple tries. But once you get some human involvement or escalate it to a certain level, the company has been able to take down certain offensive accounts. But these sorts of accounts proliferate. The company was, first of all, a bit surprised by the extent of the problem. And from there, despite setting up a task force dedicated to addressing some of these issues, it has not been making progress at the speed it would like in order to actually make meaningful change.

Jamali: So what changes have they made? The Wall Street Journal reported a few months back, as you mentioned, on how Instagram’s algorithms helped connect accounts that were focused on underage sexual content. After that, Meta supposedly stepped up enforcement. Can you talk about what systems the company has put in place?

Blunt: So after that reporting, Meta organized a task force internally to focus specifically on these problems. And there’s been, almost on a manual level, large-scale removal of problematic accounts. That’s not to say it fully solved the problem. Probably most significant, though, is that the company has been expanding the use of technology that is meant to be much more attuned to when a certain account is exhibiting suspicious behavior. The technology is able to score users based on the activity they engage in. And if they’re engaging with child accounts in a certain way, or looking at groups that have been flagged as problematic, it may be able to flag them earlier so that they can be removed earlier.

Jamali: Yeah. Either removed or, in some cases, it sounds like they’re also hiding certain groups from public view, not pushing them out.

Blunt: Yep. There are a large number of groups that no longer appear in search functions if you go looking for them.

Jamali: As well as taking down some hashtags related to pedophilia, we should mention that too.

Blunt: That’s important, yes. There are a number of hashtags that are known to be associated with problematic content. The company has been removing a number of them, but they can repopulate with minor variations. That’s an ongoing challenge.

Jamali: Well, a very recent development is that Meta has started fully encrypting messages on Facebook and its Messenger app. Privacy advocates are very happy about this. But there’s also huge concern from the community worried about predation on children, about how encryption could enable it. How is Meta responding to those concerns?

Blunt: Yeah, it’s really significant. You know, a child predator could find and message a child on Facebook and the associated messaging app, and when those messages are encrypted, not even Meta’s staff will have ready access to them. With outsiders unable to view those communications, there’s a lot of concern that it will really hamstring law enforcement’s efforts to get to the bottom of tips and reports as they come in.


The team

Daisy Palacios, Senior Producer
Daniel Shin, Producer
Jesús Alvarado, Associate Producer
Rosie Hughes, Assistant Producer