Tracking Down Fake Videos

MARY LOUISE KELLY, HOST:

Also in All Tech Considered, we continue our look this month at the many ways tech can be used to influence or undermine democracy. Today - deep fake videos.

AUDIE CORNISH, HOST:

The Defense Department considers them enough of a concern that it's working with outside experts on ways to detect them and prevent them from being made. Hany Farid is a computer science professor at Dartmouth College involved in the project. Welcome to ALL THINGS CONSIDERED.

HANY FARID: It's good to be here.

CORNISH: We're talking about videos on the radio, so I'll need a little explanation. And here is a clip of a video that you've chosen for us. And we'll play it first, and then you can tell us what's going on.

(SOUNDBITE OF VIDEO)

JORDAN PEELE: (As Barack Obama) We're entering an era in which our enemies can make it look like anyone is saying anything at any point in time, even if they would never say those things. So, for instance, they could have me say things like - I don't know - Killmonger was right. Ben Carson is in the sunken place. Or how about this? Simply - President Trump is a total and complete dip-[expletive].

CORNISH: So that sounds a little bit like President Barack Obama. But what were we really looking at?

FARID: And if you watch the video, it very much looks like President Obama. What that is is a very sophisticated and technically new type of fake that we are seeing come out, generated not by a talented artist sitting down and manipulating content frame by frame but by allowing a computer to generate the fake content for you. So that video is actually the actor Jordan Peele doing a very good Obama impersonation. And then the computer synthesizes the video, in particular the mouth of President Obama, to be consistent with what is being said.
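To make the pipeline Farid describes a little more concrete, here is a minimal toy sketch in Python. It only illustrates the shape of an audio-driven reenactment system: the random arrays stand in for real footage and per-frame audio features, and predict_mouth_patch is a hypothetical placeholder for a trained model, not a real library API.

    import numpy as np

    # Toy sketch of audio-driven lip-sync: for each frame of target footage,
    # map the driving audio's features to a synthesized mouth region and
    # composite it back into the frame. Random arrays stand in for real data.
    rng = np.random.default_rng(0)
    num_frames, h, w = 5, 64, 64
    target_frames = rng.random((num_frames, h, w, 3))  # stand-in for real footage
    audio_features = rng.random((num_frames, 13))      # stand-in for per-frame MFCCs

    def predict_mouth_patch(features):
        # Hypothetical stand-in for a trained audio-to-mouth model; a random
        # linear projection keeps the sketch runnable.
        weights = rng.random((features.size, 16 * 32 * 3))
        return (features @ weights).reshape(16, 32, 3) % 1.0

    fake_frames = target_frames.copy()
    for i in range(num_frames):
        # Overwrite only the mouth region, leaving the rest of the face intact.
        fake_frames[i, 40:56, 16:48, :] = predict_mouth_patch(audio_features[i])

    print(fake_frames.shape)  # (5, 64, 64, 3): frames with synthesized mouths

A real system would replace that projection with a model trained on hours of footage of the target, which is the hard part the interview alludes to.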

CORNISH: How much of an expert do you need to be to make a video like this?

FARID: If we were having this conversation two years ago, I would tell you you had to be fairly sophisticated, with very good tools, lots of money and lots of expertise. What's happening is we're simply making it easier and faster. And so today, lots of people who don't have access to the most sophisticated technology can now do that because all of the source code is available for download. And the expectation is that within the next year or two, it's going to be easier and easier. That's the trend. And the output, the videos that we actually see, are going to be more and more sophisticated.

CORNISH: Is this why DARPA, the Defense Advanced Research Projects Agency, is working on this issue? Do they consider it a national security priority?

FARID: I think it's a priority on many levels. And I should mention that DARPA started working on this before these types of what we call deep fakes really emerged on the scene. I think you're absolutely right that this is a national security issue. You can now create a video of the president of any country saying, I've launched nuclear weapons against another country. That content can go viral online almost instantly, and you have a real threat to security. I think it's also a threat to democratic elections when anybody can create video of politicians saying and doing just about anything.

CORNISH: And that you expect to see in upcoming elections.

FARID: I think there's almost no question that we're going to see it in the midterms, and we're already seeing issues in other parts of the world with elections. I think almost certainly we're going to see this unfold in the next two years. There's almost no question about it.

CORNISH: In the meantime, most of us are not digital forensics experts. What can we do to tell the difference?

FARID: Today, I would say that many of the fakes can be detected visually because they have artifacts, but that's not easy, and it's very easy to mistake an authentic video for a fake video.

CORNISH: And you said artifacts, meaning little visual glitches.

FARID: Yeah, like things don't look quite right. But the problem is that all video has glitches in it because of the compression that is already inside the videos. So it's a very tricky business. And in many ways, the consumer of digital content should not rely on simply looking at something and being able to tell if it's real or not. We have to rely on good, old-fashioned fact-checking. We have to do our due diligence until people like me get our act together and are really able to distribute forensic techniques that work at scale, but we're not there today. I don't think we'll be there in the next few years. And in the interim, I think we simply have to change the way we consume digital content and become more critical.
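Farid's point that authentic video already contains compression glitches is easy to see for yourself. Below is a small, self-contained Python sketch (using NumPy and Pillow on a synthetic frame, purely as an illustration, not a forensic method): a frame that was never tampered with still differs from its JPEG-compressed copy at essentially every pixel, so raw artifacts alone cannot separate real from fake.

    import io
    import numpy as np
    from PIL import Image

    # An untouched frame still shows pixel-level "glitches" after lossy
    # compression, which is why eyeballing artifacts is unreliable.
    rng = np.random.default_rng(0)
    frame = (rng.random((64, 64, 3)) * 255).astype(np.uint8)  # synthetic frame

    def jpeg_roundtrip(arr, quality):
        # Encode to JPEG in memory, then decode, to mimic video compression.
        buf = io.BytesIO()
        Image.fromarray(arr).save(buf, format="JPEG", quality=quality)
        return np.asarray(Image.open(buf), dtype=np.int16)

    error = np.abs(jpeg_roundtrip(frame, quality=30) - frame.astype(np.int16))
    print("fraction of pixels altered by compression alone:", (error > 0).mean())
    print("mean per-pixel error:", error.mean())

Roughly speaking, forensic techniques like the ones Farid describes have to model what compression should statistically leave behind and flag deviations from it, which is part of why doing this reliably at scale remains hard.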

CORNISH: Hany Farid, thank you so much for speaking with us.

FARID: It's good to be here. Thank you.

KELLY: Hany Farid is a computer science professor at Dartmouth College. He is working with the Defense Department on ways to stop deep fake videos. Transcript provided by NPR, Copyright NPR.
