Why You Won't Be Able To Trust Anything You See Or Hear Soon

Regardless of where you are on the political hypercube, we can all agree that fake news has become a real problem. We might each have different ideas about which stories count as fake news, but we all agree that they're a danger to democracy and breed sheep like a Nazi New Zealander.

With investigative journalism being pushed out of our lives to make room for whatever Buzzfeed does, it can seem like you can't trust anything but what your own eyes and ears take in. Well, because of recent technological advancements, please don't believe that either.

First, don't believe what you hear. In the 2012 election, one of the things that most damaged then-candidate Mitt Romney was a video of him talking at a private fundraiser. His mouth can't be seen clearly for most of it, but in the audio he clearly says that he's not going to court the votes of the 47 percent of the country who, in his telling, felt "entitled" to things like food and health care while paying no income tax. A lot of the country already viewed Romney as an out-of-touch millionaire, but hearing it in his own voice was the final diamond in his platinum coffin.


What, did you think he was getting buried in a pine box like the rest of us peasants?

But what if that audio had been faked? Or what if Romney could have made a plausible case that the audio had been faked?

In November, Adobe demonstrated an experimental project they've been toying with called Voco. Voco allows you to "Photoshop" speech, changing what the speaker said to whatever you care to type. Based on a 20-minute sample of me speaking, someone could fabricate a pretty damn convincing facsimile of me saying, "I hate children" or "Earth should have a self-destruct button" or maybe even things I've never said.

Much like Photoshop, not only will it create unrealistic standards for young girls to compare their speeches to, it will be used to distort records of what happened. Here's the video of the demonstration, in which an Adobe employee uses it to create an audio clip of Keegan-Michael Key saying he kissed his comedy partner, Jordan Peele:

That clip is terrifying. And not just because gay panic is still being used for laughs. Imagine if you could type whatever you wanted the President of the United States to say and he would say it. You could wield the power of Steve Bannon without looking like a sad muppet who got locked in a wind tunnel.

Adobe claims that it has robust methods for watermarking audio generated by Voco. The idea is to put an inaudible digital signature on any file manipulated by Voco so that people can tell doctored audio from the genuine article. Then again, as I write this, the current version of Adobe Photoshop is the most pirated program on The Pirate Bay, so claims that they can secure their software should be taken with a grain of salt.

People might get around the watermark by running doctored audio through a lossy copy, like recording it to and from a cassette tape. Or Voco might just be hacked and reverse-engineered faster than you can falsify Adobe CEO Shantanu Narayen saying, "I told you so." And thanks to Voco's simple interface, that will be pretty damn fast.
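If you want a feel for how flimsy that kind of protection can be, here's a toy sketch in Python -- emphatically not Adobe's actual, unpublished scheme -- of the simplest sort of inaudible watermark: hiding a short bit pattern in the least significant bits of a 16-bit WAV file. One pass through an MP3 encoder or a tape deck rewrites exactly those low-order bits, and the signature evaporates.

# Toy "inaudible watermark": stash a short bit pattern in the least
# significant bits of a 16-bit PCM WAV file. This is an illustration of the
# idea and of why it's fragile, not any real product's method.
import wave

import numpy as np

MARK = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.int16)  # arbitrary 8-bit tag

def embed(in_path, out_path):
    # Assumes a 16-bit PCM WAV file (sample width of 2 bytes).
    with wave.open(in_path, "rb") as w:
        params = w.getparams()
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16).copy()
    # Overwrite the lowest bit of the first few samples with the tag.
    samples[:len(MARK)] = (samples[:len(MARK)] & ~1) | MARK
    with wave.open(out_path, "wb") as w:
        w.setparams(params)
        w.writeframes(samples.tobytes())

def is_marked(path):
    with wave.open(path, "rb") as w:
        samples = np.frombuffer(w.readframes(w.getnframes()), dtype=np.int16)
    return bool(np.array_equal(samples[:len(MARK)] & 1, MARK))

# embed("doctored.wav", "doctored_marked.wav") makes is_marked(...) return True --
# right up until someone re-encodes the file and those low-order bits get scrambled.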

We're about to enter an age when hearing someone's voice say something does very little to prove they actually said it. So what can we rely on instead? Do we need a video of someone clearly and distinctly saying something directly into the camera? Funny I should mention that...

Well, not "ha ha" funny but rather "oh Jesus, we're all fucked" funny. Face2Face is a new tech that allows you to take existing footage of a person and puppeteer their image with your own movements. You can take basically anyone who's been on CNN and create a new video where they ape whatever you're doing. That is to say, you could wield the power of anyone who actually breaks news.

So, in the video linked above, you'll see people in a lab puppeteering Putin and Trump pretty darn convincingly. This is still a relatively new technique, so there are a few weird artifacts, but we aren't far off from video of a person created out of whole cloth. I don't think I need to spell out what could be done with a combination of Adobe Voco and Face2Face: Someone could have made a good Grand Moff Tarkin in Rogue One.
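For the curious: the researchers behind Face2Face fit a 3D model to both the actor's face and the target's, transfer the actor's expressions, and re-render the target in real time. I can't cram that into an article, but here's a hedged little Python sketch of just the first ingredient -- tracking the driving actor's facial landmarks frame by frame -- using off-the-shelf tools (dlib's stock 68-point landmark model, which you download separately, and a hypothetical clip I'm calling driving_actor.mp4). Everything after this step -- fitting the model, transferring the expression, re-rendering the mouth -- is where the real wizardry happens.

# Minimal sketch of the tracking stage only. Assumes opencv-python and dlib
# are installed, plus the standard shape_predictor_68_face_landmarks.dat file.
import cv2
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

cap = cv2.VideoCapture("driving_actor.mp4")  # hypothetical input clip
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        landmarks = predictor(gray, face)
        # Points 48-67 of the 68-point model outline the mouth -- the motion
        # a reenactment system would transfer onto the target's face.
        for i in range(48, 68):
            p = landmarks.part(i)
            cv2.circle(frame, (p.x, p.y), 2, (0, 255, 0), -1)
    cv2.imshow("driving actor's mouth, tracked", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()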

But that also means this technology will soon be used for less virtuous goals. Someone is going to use these technologies to tank a political candidate. Someone is going to use them to try to incite war. Someone might even use them to fake a video where Bill Murray shows up to somebody's birthday party. We just don't know. If we can't trust audio and we can't trust video how do we continue to have news?

Before we burn Atlanta to the ground screaming, "Technology has killed truth!" keep in mind that as long as there has been evidence, people have been faking evidence. Doctoring photos was a convincing art long before Photoshop came around. There's even a famous doctored photo of Lincoln, where his head was placed on a portrait of Southern leader John Calhoun ...

Library of Congress


Sic semper filters.

... And if you can't trust a photo of Honest Abe, who can you trust a photo of?

In the old days, we had to get information with relatively little help from sources of "unmediated truth" like audiovisual recordings -- "unmediated" in the sense that they are literally media: spliced, framed, composed, and sometimes outright staged. Ironically, we may be headed full circle, treating every report (whether it comes with video or not) as a "trust us on this one."

That's going to be tough, because in your gut, seeing is still believing -- even when you are explicitly told that what you are seeing has been manipulated. In a study on the reliability of eyewitness testimony, participants were asked to gamble alongside a partner. Though the partner never actually cheated in the game, subjects were told that their partner had. Some were simply told that their partner had been caught on tape, while other subjects were actually shown doctored footage of their partner cheating. Subjects who saw the doctored video were three times as likely to sign a statement saying they had witnessed the cheating (which they could not have, because it never happened) as subjects who were merely told that their partner cheated. For some subjects, the effect persisted even after they were told that the tape had been doctored.

Think about that: Doctored videos have a huge impact on people. And most people are gullible enough to think chance encounters on reality TV are real, which I guess means they think strangers walk around town wearing lavalier mics just in case they need to be on camera.

As fake news spreads and methods of creating bullshit get better, we're going to have to get more critical about what we consume. That doesn't have to mean a lot of technical expertise. As masterful as a 'shop might be, you can recognize a picture of Stalin using a smartphone to order an Uber as an obvious fake. Because you know enough about the world to realize that's impossible.

Russian Federation


The Soviet Union would never tolerate Uber's labor practices.

You're going to get a lot of fake bullshit thrown at you from all sides of the political Thunderdome. Some of it will be lazy, sloppy misdirection, some will be outright denial of fact, and soon, some will be slick, difficult-to-detect, seemingly live video. Some of it will bolster your existing worldview and seem to come from "Your Side," which will make it that much harder to detect.

We've got to be cautious about where our information is coming from. We've got to incentivize our news outlets to do their homework on their sources, not just click on whatever the latest unvetted sensational crap is. At the same time ... am I going to risk being second to tweet about a stupid political gaffe? Get real.

Aaron Kheifets used to worry that everyone else had superpowers and was just playing it cool around him so he wouldn't feel bad. You are allowed to follow him on Twitter.
