Do fake images need to look convincing to be convincing?
Oct 11, 2019

Short answer? Not really.

Fake news and doctored images make navigating our social media feeds ever trickier. But the potential fakes meant to trick and sway opinion are more than just a hassle. They can threaten national security.

I spoke with Christye Sisson, a professor at Rochester Institute of Technology in New York, who is working with the Department of Defense to build a sophisticated tool that can identify fake images.

She and her students act like the bad guys, doing painstaking work to develop the most convincing fake images they can. They’ve learned a lot about what it takes to fool people, including that maybe they don’t need to work so hard. The following is an edited transcript of our conversation.

Christye Sisson in a lab at RIT. (Photo courtesy of Sisson)

Christye Sisson: That was a revelation, at least for me as a photographer in the photo sciences. I had that starting point that this image has to be — or this video — has to be visually convincing in order for it to even begin to be shared, begin to have the viewer believe what it is I was trying to get them to believe. But, I don’t know, the evidence is pointing to something different. And certainly, the venues are different. This is not law enforcement context; this is not national security type context. This is primarily social media, and the sharing of stories over social media [where] the images have clearly been tampered with and manipulated, but they are shared millions of times.

Jed Kim: That’s got to be heartbreaking. You spend so much time and energy making high-quality fakes, and then you see something that’s just cruddy and people believe it. Are you just like, “Come on!”?

Sisson: Part of it is that for sure, and there was a lot of that sentiment. I think it certainly opened my eyes to this idea that this goes far beyond a technical issue, that this is not something that can be solved with a technical tool. This is a broader question, particularly as it relates to why people share information and what prompts people to disseminate a piece of information or an image that’s sensational. In my very brief and anecdotal research, it’s the things that validate people’s beliefs and resonate with them emotionally.

Kim: Maybe this is something that we shouldn’t be too surprised about. I’m thinking of the blurry images of Bigfoot or the Loch Ness monster, that some people are just like, “Yep, there it is. Proof.”

Sisson: I think the thing that has shifted in the last 15 or 20 years is the sheer volume of imagery that we’re digesting. It’s one thing to see an image of Loch Ness or Bigfoot [and] question that. But it’s another thing to be able to turn around and disseminate that as a potential source of information, or to disseminate as if it were truthful. I think that that changes the dynamic, certainly. I asked my son about this, who’s a teenager, and I said, “What would prompt you to share information? Do you look into what the sources are? How do you determine if something is real on the internet or in social media?” He says, very matter of fact, “Nothing is real on the internet.”

That was a huge shift for me in mindset, because that was never my mindset. Images are truth, or at least our best resemblance of truth, certainly in photography as I learned photography, and what the image represents. That actually did give me hope, as discouraging as it was to see people sharing really poorly done images, the idea that maybe, just maybe, with this constant diet of imagery, that maybe future generations will not necessarily put their faith in images, and they’ll view everything with a skeptical eye.

A woman in Washington, D.C., views a manipulated video on Jan. 24 that changes what is said by President Donald Trump and former President Barack Obama, illustrating how deepfake technology can deceive viewers. (Rob Lever/AFP/Getty Images)

Related links: More insight from Jed Kim

Sisson wrote an article for Nieman Lab about her work and the implications for life today. She included an example of the kind of work her department is taking on — a video of a person speaking, when suddenly the voice swaps out for someone else. And if you’ve never met this person, you’d never know it’s not her voice. It’s messed up. Handy tip: She included a link to a how-to guide for reverse-image searching. She said it is good practice to incorporate until the day tech saves us from other tech.

Want to see a disturbing deepfake in action — like, action-movie action? YouTuber Aldo Jones made one of those videos where you take one famous person’s face and overlay it on footage of another person. He’s put the face of the most recent Spider-Man actor, Tom Holland, onto the body of Tobey Maguire in scenes from “Spider-Man 3.” The result is scarily close to believable.

I have to say, it’s kind of hard to watch though — not because of uncanny valley issues or glitches. I mean, there is a little of that. No, it’s tough to see the video through to the end because it is, after all, “Spider-Man 3.”

https://youtu.be/fNWCwxr2cMg
