Facebook Fallout

Cambridge Analytica, Facebook and the new data war

Molly Wood Mar 19, 2018
Daniel Leal-Olivas/AFP/Getty Images

In the age of social media, we should add a coda to the saying that knowledge is power: Data is a weapon.

No one has claimed to weaponize data better than Cambridge Analytica, a political data firm that acts as the American arm of the U.K. company SCL Group. Cambridge Analytica counts conservative billionaire Robert Mercer as a primary investor and Steve Bannon as a former board member. Its goal is to use data to build detailed, deeply personal psychological profiles of people and then target them with what is essentially emotional manipulation in the form of ads.

The practice relies, obviously, on a lot of data: from social media, data brokers, voting records, loyalty programs and all kinds of other sources. When the data harvesting got too difficult and expensive, Cambridge Analytica partnered with an outside academic to hit the Federal Reserve of personal information: Facebook.

The researchers paid about 270,000 Facebook users (all recruited through Amazon’s Mechanical Turk crowdsourcing service) to install a quiz app on their profiles. Facebook was told the app was for academic research, and the small investment yielded large results, according to stories this weekend in the Observer of London and The New York Times — some 50 million Facebook profiles in all.

That’s possible because of Facebook’s data policies at the time. If a developer got permission to access one user’s profile, it also got access to that person’s friends — and most likely to friends of friends — in the form of their likes or comments or tagged posts.

Facebook said it has changed its policies about what apps can gather about users; it also said that the work of researcher Aleksandr Kogan and his company, called Global Science Research, violated its terms of service in two ways. First, it gathered data under the guise of academic research, but then sold it to Cambridge Analytica. And second, it didn’t get permission to use the data of all the friends and friends of friends who were swept up in the data collection. For their part, Cambridge Analytica and GSR said everything they did to collect user data was permissible on the platform. 

Also worth noting: Facebook only banned Cambridge Analytica and its researchers on Friday, when news reports about the incursion hit The New York Times and The Observer. As for the data, although Facebook said it asked the companies to delete it, the Times reported that its reporters had seen some of the raw profiles during its investigation.

And here’s the thing. We’re used to leaky data at this point — after all, it’s routinely stolen from the likes of Target or Yahoo or Nordstrom or Equifax.

But the use of that data is evolving far beyond what most users can even imagine. Cambridge Analytica says that it can psychologically profile users and target them so specifically that it can emotionally manipulate them — and that all of this psychological warfare has a specific goal: to affect political outcomes, and to affect them in the direction of a specific ideology.

But does it work? Can such hyper-targeting, say, convince people to stay home instead of going out to vote or change their votes from one candidate to another based on messaging that’s so precisely targeted it feels telepathic? It’s unclear, although Cambridge Analytica and Facebook are currently under investigation in the U.K. for potentially playing a role in manipulating the Brexit vote. Cambridge Analytica was consulting for the Leave campaign.

As for the U.S. presidential election, it seems as though Cambridge Analytica’s consulting was a relatively small part of Donald Trump’s 2016 campaign. The company also did consulting for Ted Cruz. But the story is still developing, and there will be many more questions. Massachusetts Attorney General Maura Healey is launching an investigation into both Facebook and Cambridge Analytica.

Lawmakers from both sides of the aisle are once again calling on Facebook for answers — but answers may not be enough. You can expect more calls to regulate Facebook as a result of these revelations. And as data keeps getting more and more centralized in a few mega-platforms like Facebook and Google, the question of who owns that data and who can control it will become ever more acute. Europe’s General Data Protection Regulation, which lets consumers request to be forgotten from company databases, could be a blueprint for consumer protection advocates calling for U.S. legislation.

Cambridge Analytica and Global Science Research are the tip of the spear of data weaponization, and whether the techniques work now or not, they’re going to get more sophisticated and manipulative over time. In fact, the techniques are just as likely to come from Facebook itself. After all, it hired one of the people who pioneered them.

Yep. Global Science Research, the company that developed the quiz app that gathered all that data, was founded by two guys: Aleksandr Kogan and Joseph Chancellor. Joseph Chancellor has a Ph.D. in social and personality psychology and did post-doctoral work at the University of Cambridge. He was hired by Facebook in 2015, after Kogan’s quiz app sucked up all those Facebook profiles. Facebook has said it is investigating Chancellor’s hire.

And don’t forget that even before Cambridge Analytica was grabbing user data off Facebook for emotional manipulation, Facebook itself was conducting a secret study during which it deliberately manipulated the emotions of its users using specific posts in their News Feeds.

And as recently as last May, Facebook admitted it had conducted research into how to identify the emotional states of young people using its platform, and two of its Australian executives shared that information with advertisers. The company denied that it would let advertisers target kids on its platform when they were feeling “insecure” or lonely, saying only that advertisers might be interested in that information in the aggregate. (Not significantly less gross, I think we can agree.)

And of course, Facebook has also been accused of letting advertisers exclude people by race or ethnicity — including ads for housing, employment or credit, and of allowing advertisers to target their messages specifically to groups like “Jew haters.”

We’re far past the point of asking whether or what information we should share with Facebook. Even if you remove yourself from the site now, your legacy of interactions lives on in your friends’ profiles, and your data has likely been shared or sold hundreds of times over. Facebook will not regulate its way out of this situation, because this situation is its business model. If data is a weapon, Facebook is the biggest arms dealer in the world. It’s time to approach it with that in mind.

 
