1 Jul, 2019 15:15

Deepfake technology shows the emperor (or the girl of your dreams) has no clothes

In the latest example of artificial intelligence at work, algorithms can now ‘undress’ a woman, or put false utterances into a politician’s mouth. What’s at stake is nothing less than our perception of reality.

Young readers of comic books in the 1970s will certainly remember the back-page ad section, where everything from Charles Atlas’ muscle-building program to live sea monkeys was shamelessly hawked to unsuspecting adolescents. The most popular item by far (I’m guessing) was the so-called ‘X-ray specs’ – a pair of horn-rimmed glasses designed on “scientific optical principles” that supposedly allowed the credulous wearer to see straight through clothes. How many kids blew their weekly allowance on that ploy is anyone’s guess, but today those same, now-matured consumers have a chance to be fooled once again.

Recently, new software dubbed ‘DeepNude’, perhaps the distant country cousin of ‘Deep Throat’, enjoyed its 15 minutes of fame for a machine-learning algorithm that magically ‘removes’ clothing. Light years ahead of its clunky comic-book predecessor, and sounding no less chauvinistic, the application uses “neural networks” to undress images of women with a mouse click, “making them look realistically nude,” as Motherboard duly reported.
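For readers curious about the mechanics, reports described DeepNude as building on image-to-image translation networks of the pix2pix family. The toy sketch below illustrates only the general shape of that technique – an encoder-decoder generator that maps one image to another – and is emphatically not the app’s actual code; every name and layer size here is an assumption chosen for brevity.

```python
# A minimal sketch of a pix2pix-style image-to-image generator.
# Illustrative only; not DeepNude's code. All sizes are arbitrary.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Encoder-decoder that maps an input image to an output image of the same size."""
    def __init__(self, channels=3, features=32):
        super().__init__()
        # Encoder: downsample the input into a compact representation.
        self.encoder = nn.Sequential(
            nn.Conv2d(channels, features, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(features, features * 2, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to full resolution, producing the "translated" image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(features * 2, features, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(features, channels, 4, stride=2, padding=1),
            nn.Tanh(),  # outputs in [-1, 1], the usual convention for GAN images
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    g = TinyGenerator()
    photo = torch.randn(1, 3, 64, 64)  # stand-in for a real input photo
    fabricated = g(photo)              # the network's wholly invented output
    print(fabricated.shape)            # torch.Size([1, 3, 64, 64])
```

In a real system of this kind, the generator would be trained against a discriminator on paired before-and-after images until its fabrications look plausible; the training loop is omitted here. The point is simply that the output pixels are invented by the network, not ‘seen through’ anything.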

The fun and games, however, came to an abrupt end last week when the creators of DeepNude announced in an apologetic tweet that “the world is not yet ready” for such advanced technology, which opens the door to abuses like ‘revenge porn’ attacks, not to mention the troubling objectification of women’s bodies. However, I’m guessing the real reason DeepNude yanked its product has less to do with moral and ethical considerations than with the possibility of being hit with a massive lawsuit over privacy claims. But I digress.

Although it was refreshing to see DeepNude withdraw its disrobing services, one thing can be said with absolute certainty: we have not seen the end of it. Already it is being reported that altered versions of the app are being sold online, which should shock nobody. As history has proven on numerous occasions, once Pandora’s box is cracked open, it is nearly impossible to return its escaped contents. What we can now look forward to is a slew of images appearing online of women in various stages of undress, which should qualify, at the very least, as a form of cyber-bullying. Many women will be forced to endure untold indignities as a result of this technology, especially in the early stages while the fad is still fresh, and it is not difficult to imagine some victims being driven to suicide by it. Yet that is just the tip of the iceberg as far as ‘deepfake’ technology goes.

As if undressing a woman with an application were not creepy enough, there is yet more technology that allows people to superimpose one person’s face onto another’s body. This brilliant app came to light earlier this year with deepfake productions of Hollywood stars ‘appearing’ in porn films. The end product was, according to Variety, “convincing enough to look like hardcore porn featuring Hollywood’s biggest stars.” Andreas Hronopoulos, the CEO of Naughty America, an adult entertainment company looking to cash in on deepfake productions, proudly told the magazine, “I can put people in your bedroom.” And yes, that’s supposed to be a good thing.

Predictably, however, this horny little app promises to have a shelf life about as long as the Internet’s bygone ‘ice bucket challenge’. People will eventually tire of the novelty of watching Brad Pitt, for example, fornicating with so-and-so’s mother down the road, and the world will turn to some other ephemeral trend for its cheap thrills.

So then what is the big deal? If deepfake videos are just a passing fad that will quickly lose its shock value, then where is the harm? I’m no computer expert, but as a journalist I can foresee this technology eventually having serious implications for the news industry.

First, take a moment to watch the lifelike renditions of historical figures that researchers at the Samsung AI Center in Moscow were able to create. In another five years, with artificial intelligence progressing at an exponential rate, it will likely be impossible to determine whether the people featured in a video are real or computer-generated. That will make for an extremely unsettling situation. Reality itself will be turned on its head, and people, already very cynical about the mainstream media, will become more skeptical about everything they are shown. Attempts at altering reality are already on the rise.

In January, for example, an employee at Fox affiliate Q13 was fired after the station aired doctored footage of President Donald Trump delivering a speech from the White House. The video was altered to make it appear as if Trump were sticking out his tongue as he spoke, and the colors were intensified to make the president’s skin and hair appear orange.

Meanwhile, comedian Jordan Peele last year made a fake video in which he poses as former president Barack Obama calling his successor, Donald Trump, “a total and complete dipshit.” Although it may seem hilarious now, what happens when nothing is believable any longer? When every single story – from government corruption to war crimes – is met with cynical skepticism, as if we were all watching some harmless Hollywood production? What happens when the only things we believe are those we witness with our own eyes?

Finally, there is the question of investigative journalism. Individuals implicated in some sort of criminal act on video will be able to resort to ‘plausible deniability’, claiming the footage was doctored or faked. This will make the presence of reliable (human) witnesses to any criminal act all the more critical.

What all of this suggests is that we need not less human involvement in the collection of news and information, but far more. As artificial intelligence becomes powerful to the point where reality is hard to discern, people will rely on human witnesses to verify events. And what happens when the humans on the ground serving as impartial witnesses can no longer be differentiated from the robots living among us in some dystopian future? Well, that’s another question entirely, for another day.

@Robert_Bridge

The statements, views and opinions expressed in this column are solely those of the author and do not necessarily represent those of RT.
