Granite Geek: 'Deep Fake' Videos Are Latest Tech Scare

This is All Things Considered on NHPR. I'm Peter Biello. Photos are easy to fake, given how common programs like Photoshop are now. New technology is making it just as easy to fake videos. That is, it is becoming very easy for video editors to graft the image of your face onto someone else's body. And that is problematic, especially if your face ends up on the body of someone doing something offensive or illegal. These videos are called 'deep fakes,' and Granite Geek David Brooks of The Concord Monitor is here to talk about them.

Welcome, David.

Glad to be here.

Tell us about that term, deep fake. What does it refer to?

So 'deep fake' is a fairly new term for a category of videos, so far pornographic videos, in which images of famous people, almost entirely actors, are face-swapped or morphed onto the bodies of people in the video. And you can find them online if you look. They're being made and shared by an unsavory group of people.

How easy would it be at this point for a video editor to, for example, take my image or your image and graft it onto an existing video of someone we've never met doing something we would never do? How easy is that?

Right now it's not possible, because there aren't enough images of you and me, you know, normal human beings. There aren't enough images and videos of us out in the wild for these people to grab and use to build the model. So the reason right now that all these deep fakes involve actors and actresses, or at least the main reason, is that there's lots of video of them online. You can grab it, and the software can analyze the face from all sorts of different angles and in all sorts of different lighting, and can use that analysis to make the swap into the video look realistic. For you and me, there aren't that many images and videos out there right now, so we have what geeks call 'security through obscurity.' That's kind of where we are at the moment.
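
For listeners curious about what "building the model" means in practice, here is a minimal, hypothetical sketch of the shared-encoder, two-decoder autoencoder approach popularized by early face-swap tools. Everything below, the layer sizes, the class names, the toy training loop, is an illustrative assumption rather than any specific tool's implementation.

```python
# Minimal sketch of a face-swap autoencoder: one shared encoder, one decoder
# per identity. Architecture and sizes are illustrative assumptions only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Maps a 64x64 RGB face crop to a compact latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Renders a face crop from the latent code; one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(), # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),# 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

# Random tensors stand in for the large sets of aligned face crops of each
# person -- the "all sorts of angles and lighting" Brooks describes.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

for step in range(100):
    opt.zero_grad()
    # Each decoder is trained only to reconstruct its own identity.
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) \
         + loss_fn(decoder_b(encoder(faces_b)), faces_b)
    loss.backward()
    opt.step()

# The swap: encode person A's face, decode with person B's decoder, yielding
# B's identity in A's pose and lighting.
with torch.no_grad():
    swapped = decoder_b(encoder(faces_a))
```

The design point worth noting: because the single encoder must serve both identities, it learns pose and lighting features common to any face, while each decoder learns to render one specific identity. Feeding person A's encoding through person B's decoder is what produces the swap, and the more varied footage of each face the model sees, the more convincing the result.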

I'm imagining people right now listening to what you're saying and thinking, 'Oh, God, I have hundreds of videos and pictures of myself on my Instagram feed, on Facebook.' Maybe because of that I'm more vulnerable. Is that the case?

That's possible. As I say, I think with the state of the art at the moment, ordinary people like us are safe. But the software is improving quickly, and it's improving invisibly, in the sense that, you know, evil people are doing it. So who knows how long that will be true.

Is it a crime to create one of these deep fake videos?

That's a good question, and a complicated one. That was sort of the essence of my column today. It's certainly a crime to create them about you and me and ordinary people. If you created one of these of a famous person, it may or may not be a crime, depending on a number of different things. So, for example, there's a First Amendment right to do parodies of famous people, even if they're repulsive. That's been upheld. So that might cover it. It's far from clear. As is often the case, the technology is racing way ahead of the law.

The legality of the issue here is informed in part by a case that you wrote about this week in your column. This case took place 13 years ago. A guy was caught swapping the faces of young girls onto the bodies of porn stars. He was initially convicted on child pornography charges, but his conviction was overturned, in part because no children were used in the making of the pornographic material and he didn't intend to share it with anyone. The images were apparently just for him. So what's the relevance here of the intent to keep these deep fake videos private versus sharing them on the Internet?

Well, that was a key to his defense, along with the fact that they weren't actually children. So child pornography is illegal for a couple of reasons. One of them is that children are harmed when you make the pornography. That wasn't the case here. And another is that children can be harmed when you share the pornography. And that wasn't what he tried to do. So that was why the state Supreme Court overturned his conviction.

So if you are trying to share it, that's a problem?

Trying to share it puts you in a whole other legal category, absolutely. And with these deep fakes, a face swap of adults onto adults, it depends on the adult, it depends on what you do with it, whether you share it or not, and frankly it depends on how the law changes over the next few years.

One of the most striking possible consequences of this concerns the reliability of video as, let's say, evidence in court. Will the prevalence of this technology call into question the reliability of things like surveillance video as evidence in a trial?

It sure seems to me it would. And to me that's kind of the scariest thing. As I mentioned in my column, I can't imagine that right now there aren't activists putting together deep fake videos of politicians that they're going to leak at some point. So, you know, a year from now, six months from now, whenever you see a video of a politician doing something illegal or abhorrent, they'll say, 'That's not me, that's a deep fake,' and you won't know. Down the road, it's going to be every video. And to me this holds out the rather alarming possibility that there will be no form of objective evidence that we can trust anymore. You can't trust photos. You can't trust audio files; they're too easy to fake. Well, now you can't trust videos either. You can't really trust anything at all. It's all 'he said, she said.' And that doesn't bode well for our desperate attempts to hold onto civil discourse.

Well, David, we trust your reporting and we thank you very much for coming in and talking to us about this.

About this rather depressing topic.

You know, sometimes we have to. Thank you very much.

You bet.

That's David Brooks. He is a reporter for The Concord Monitor and the writer who could not possibly be video-edited to seem any geekier than he actually is, at GraniteGeek.org.
