The ins and outs of reporting on Facebook
Jan 3, 2024

Jeff Horwitz, author of the new book "Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets," details his experience covering the social media giant, including its efforts to control access to information.

For reporters covering Facebook, getting the real story has only become harder since the release of “The Facebook Files” in 2021.

The Wall Street Journal series, based on documents provided by whistleblower Frances Haugen, exposed the inner workings of the company now known as Meta, from its lax rules for VIPs to internal research on Instagram’s impact on teens.

Wall Street Journal reporter Jeff Horwitz writes about the challenge of covering the company in his new book, “Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets.” Marketplace’s Lily Jamali spoke with Horwitz about his coverage of Facebook and Meta and how the company’s platforms have changed over the years.

The following is an edited transcript of their conversation.

Jeff Horwitz: The products that Meta offers bear no resemblance to Instagram and Facebook a decade ago. These are just completely revamped features. This isn’t just posting things online and following your friends, this is much more mechanical. It is heavily driven by recommendations and what the company calls “unconnected content,” namely things that it chooses to show you rather than things that you choose to see. That has changed not just how we use social media, but how we interact online in general, how we consume news, how we talk to each other about things like politics. The ramifications are so much bigger than just they changed social media in a way that makes things better or worse.

Lily Jamali: Your book is this very comprehensive history of the last decade at Facebook, but it’s also your account of how hard it is to expose the company’s secrets. I want to talk a little bit about this turning point in your efforts as a reporter in 2019, in the thick of the [Donald] Trump administration, we’re going into the next election, when [Meta CEO] Mark Zuckerberg spoke at Georgetown University about free expression. He said, “In times of social tension, our impulse is often to pull back on free expression because we want the progress that comes from free expression, but we don’t want the tension.”

You write about what you saw in the comments section during that speech. What occurred to you as you were watching?  

Horwitz: This was a kind of full-throated defense of internet freedom of a previous era, right? What was happening while Mark Zuckerberg was saying those words was that tens of thousands of Facebook users were commenting on the speech, and a lot of them were heckling the hell out of him. The thing is, none of those comments ever appeared. You literally had to scrape all 40,000-odd comments to see them because the only ones that Facebook was selecting to show were ones that involved the words “thank you,” “love” and “congratulations.” Like, we’re talking some extremely crude filtering here.

Jamali: Those were very obviously not authentic comments?

Horwitz: Yeah. I think one of them was something like, “Thank you, big boy Mark. I love you.” And so, for me it was a really important moment in sort of forcing me to recognize that the thing that we’d all been focused on, which was “What does Facebook moderate?” and “What did they remove?” wasn’t really the right problem. It was actually an amplification problem. In other words, what voices are getting picked up? And why are they getting picked up? What are the mechanics of the system that is built to do that job, and whose interests does it serve?

Jamali: Yeah, there’s a lot of irony in that passage of the book. Then you get to the COVID-19 pandemic. How did the pandemic initially affect Facebook, which was coming off of a couple years of really bad press that included the 2016 election and the Cambridge Analytica scandal?

Horwitz: I think people there understood this to be every bit as much of a tragic and horrific world event as we all did. What I will say is that I think from a company level, this was just awesome. I mean, everything about COVID from the fact that people were at home, spending a ton of time online, to the fact that Mark Zuckerberg happened to employ one of the world’s foremost coronavirus experts, to the way the company responded by sending its employees home right at the very beginning for their safety and then sort of keeping the ship running in this kind of manic process. This was all amazing for the company. I mean, it truly was a point when the value of a worldwide social network really shone, that there was a lot of value to be had here.

Jamali: At the time, the 2020 election was on the horizon, and we were in those first few months of lockdown. You write about this moment when former President Trump published this incendiary tweet that was cross-posted from Twitter to Facebook. Twitter took it down, but Facebook did not. You position this moment as something of a turning point for workers at Facebook.

Horwitz: Things were already sort of getting a little less cheery, but then came the “looting shooting” tweet from Trump, in which he echoed a Miami police chief who had used very bloody tactics to shut down protests. “When the looting starts, the shooting starts” was the phrase, basically. It was this somewhat menacing, if not outright inciting, comment. And this was, I think, a really big deal for employees. It was seen as an instance in which the company was directly kowtowing to political pressure in a way that it had, at least publicly, claimed it would not do. Facebook is supposed to be this sort of revolutionary Fifth Estate, as Mark Zuckerberg put it, a check on the branches of government and even on the press, and instead it was bowing to someone who held the country’s highest office. That was the concern.

Jamali: You write that outside the executive ranks, the mood among employees was as close to mutinous as it had ever been. You talk about how, four days after the tweet, there’s a virtual walkout staged by workers. They’re a lot more open to airing their concerns outside of Facebook, and there are more leaks happening at this point.

Horwitz: This is where I think they lose a lot of employees. I think the company does start treating its employees a little bit more like a fifth column at this point and they start trying to lock down information because things are just leaking right and left. I think at that point there was also just a recognition that the election is really getting out of control. Even as the company is sort of fighting its employees on some of the activism fronts, it is also preparing for what would be a crash landing for the 2020 election on its platforms.

Jamali: My takeaway was that there were a lot of people inside of Facebook who wanted to do the right thing and were trying to, but because of the person leading them, they couldn’t always. Is that the conclusion you want readers to draw?  

Horwitz: I think yes, that is certainly true. That many of the things that were kind of described as intractable issues of social media aren’t intractable, they are in fact eminently solvable, or at least heavily mitigatable. But the platform right now represents the desires of the people who control it, and one person more than anyone else. And if there are issues with that, these are the direct results of product and design choices and choices about where to invest. Moderation doesn’t solve everything, but in recent years, the company has been cutting back on safety stuff, even though it runs a 40% operating margin.

Jamali: Do you think that your reporting in this book will change the work that has to happen to expose what’s going on inside of Facebook?

Horwitz: In the wake of the book, they cracked down heavily on internal information access, and this is something I think that I always suspected would happen. It’s been really disheartening, I think, for a lot of researchers that things are not as internally open as they once were. Even by the time that the Facebook Files came to be with Frances Haugen, they were already trying to restrict access, and things have just gotten tighter since. And so, I think there is a question about how this research gets done, and there is more of a focus on trying to force transparency, to say that Facebook’s own reports about its moderation systems and its effects aren’t good enough, and there’s gonna need to be some sort of more standardized information disclosure.


The team

Daisy Palacios Senior Producer
Daniel Shin Producer
Jesús Alvarado Associate Producer
Rosie Hughes Assistant Producer