DeepFakes are coming for ya!

We all expect technology to improve, and these days it’s happening at breakneck speed. While most anticipated technical breakthroughs excite me, the one that scares me half to death is the ever-improving deepfake.

What are deepfakes?

We’re all aware of photoshopped images, which can range from crude, amateur attempts at deception to masterpieces that are undetectable by the naked eye and can only be debunked by close digital inspection. We’ve also grown accustomed to very realistic CGI (Computer Generated Imagery) in movies, which adds extra flair and excitement.

Deepfakes, however, use Artificial Intelligence (A.I.) algorithms to create video, usually of a person we’re familiar with, saying and doing things they’ve never said or done. The technology can place people in situations that never actually happened, and it has gotten to the point where it’s easy enough to use, and realistic enough, that even many discerning viewers will be fooled. The term combines the A.I. concept of “deep learning” with “fake,” and the field is more technically known as “synthetic media technology.”
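For the technically curious, here is a minimal sketch of the shared-encoder, two-decoder trick behind the original face-swap deepfakes: one encoder learns a generic “face code” from footage of both people, each decoder learns to reconstruct one specific person, and swapping decoders performs the face swap. Everything here (layer sizes, image dimensions) is an illustrative assumption, not any real tool’s pipeline:

```python
# Minimal sketch of the classic deepfake autoencoder idea: a SHARED
# encoder learns a generic "face code", and a separate decoder per
# identity learns to reconstruct that person. Swapping decoders at
# inference time produces the face swap. All layer sizes here are
# illustrative assumptions, not a real model.
import torch
import torch.nn as nn

class Encoder(nn.Module):          # shared across both identities
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),                          # latent "face code"
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):          # one of these per identity
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid()  # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # trained on person A / person B

# Training reconstructs each person through the shared encoder, e.g.:
#   loss_a = mse(decoder_a(encoder(faces_a)), faces_a)
#   loss_b = mse(decoder_b(encoder(faces_b)), faces_b)
# The swap: feed person A's face in, decode with person B's decoder.
face_a = torch.rand(1, 3, 64, 64)             # stand-in for a cropped video frame
swapped = decoder_b(encoder(face_a))          # B's features, A's pose and expression
print(swapped.shape)                          # torch.Size([1, 3, 64, 64])
```

The key design point is the shared encoder: because it must describe both faces with the same code, that code ends up capturing pose and expression rather than identity, which is exactly what lets one person’s expressions drive another person’s face.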

Popular examples of how this technology has been used include this video of President Obama in the Oval Office, voiced by comedian Jordan Peele, saying things he never actually said.

Another is this video of Mark Zuckerberg giving a talk he never actually gave, with an actor impersonating his voice.

When closely inspected, it’s apparent that something is off: Obama’s facial expressions and blinks aren’t quite right, and Zuckerberg’s head movements are a little unnatural. The casual viewer, however, wouldn’t even notice, and the person who creates such videos can make the subjects say and do whatever they want.

Ramifications

In this era of fake news, it isn’t difficult to imagine the consequences as the technology gets better, cheaper, and more accessible. Your political opponent can be made to say or do whatever you’d like, giving you a clear advantage with voters. Imagine someone publishing a clip of a president declaring war, causing civilians to panic. The dissemination of such clips can sow psychological confusion and bring us to the point where seeing is NOT believing, and absolutely nothing is trusted anymore.

And then there’s porn…

There are several examples of realistic deepfakes made by placing someone’s face on a porn actor’s body. All that’s needed is the right software and enough footage of the targeted victim for the A.I. algorithm to seamlessly incorporate their likeness into an existing porn scene. Doing this to someone famous is easy enough, with plenty of footage available online from sites like YouTube and Facebook.

If you’re still not worried because you don’t consider yourself famous, know that the technology is improving to the point where hours of footage aren’t required, and soon a simple picture of you will do. That means you too can be the subject of a deepfake if you’re targeted by the wrong person. What would your employer, friends, or family think if they came across a very believable video of you in a sexual tryst that never actually took place?

The increased use of this fast-improving technology can lead to catastrophic consequences.

Motion capture I did of myself, mapped onto a cartoon character. Something like this, which was unthinkable just a couple of years ago, was done on an iPhone.

Face Swap and Filters – Apps like Snapchat and Impressions Face Swap Videos (on iOS) have taken advantage of advances in phone technology to introduce these fun features.

It is only a matter of time before processing power is good enough to create a convincing deepfake entirely on your phone.

With people always anxious to share the latest and most shocking news, they almost never take the time to verify the source or make sure the story or image is even remotely credible. Imagine what the Russians could have done with this technology when they presumably waged psychological warfare on the 2016 U.S. presidential election. Let’s see what happens during the 2020 election!

Of course, not all is doom and gloom, since the technology also has plenty of beneficial applications. Hollywood is already benefiting by being able to correct an actor’s mistakes, make them speak another language, or cast roles with actors who have passed away. It’s just a matter of time before a film is released with a cast of nothing but A.I.-generated characters who are indistinguishable from real people. The technology will also give artists a broad new technical canvas on which to express themselves.

This Person does not exist!

www.thispersondoesnotexist.com

Go to the website and keep refreshing the page to generate a new person.

None of these people exist.

Although the results aren’t perfect, this website is a prime example of how A.I. can be used to generate realistic human faces. It serves up a completely new, random face roughly every two seconds, and the person in each picture has never existed.
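For the curious, the mechanic behind the site (which is reportedly powered by Nvidia’s StyleGAN) boils down to sampling a random latent vector and mapping it through a trained generator network. The toy generator below only illustrates that plumbing; untrained, it produces noise rather than faces, and all of its dimensions are illustrative assumptions:

```python
# Toy illustration of GAN-based face generation: sample a random latent
# vector and map it through a generator network to pixels. This untrained
# stand-in produces noise, not faces -- the realism of a site like
# thispersondoesnotexist.com comes entirely from training the generator
# on a large dataset of real faces.
import torch
import torch.nn as nn

latent_dim = 512                      # StyleGAN also uses a 512-d latent space

toy_generator = nn.Sequential(        # stand-in for a trained generator
    nn.Linear(latent_dim, 4 * 4 * 128), nn.ReLU(),
    nn.Unflatten(1, (128, 4, 4)),
    nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 4 -> 8
    nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),    # 8 -> 16
)

z = torch.randn(1, latent_dim)        # every page refresh = a new random z
fake_image = toy_generator(z)         # a trained model would emit a face here
print(fake_image.shape)               # torch.Size([1, 3, 16, 16])
```

Every refresh of the website is just a new random draw of z; there is no database of faces to pull from, which is exactly why none of these people exist.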

That’s all fine and dandy, but my main concern is still the nefarious use of the technology.

How can this be detected and stopped?

Currently, it’s fairly easy to spot a deepfake upon close visual scrutiny or programmatic inspection. There is always something that’s just not quite right. In the near future, it may not be so easy.
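One concrete example of programmatic inspection: researchers noticed that early deepfakes blinked far less often than real people, because the photos they’re trained on rarely show subjects with their eyes closed. Below is a minimal sketch of that blink-rate check, assuming the six per-eye landmarks have already been extracted by a facial-landmark detector; the frame trace at the bottom is fabricated stand-in data:

```python
# Sketch of a blink-rate deepfake check: compute the standard eye aspect
# ratio (EAR) per frame; sustained dips mark blinks, and an abnormally
# low blink rate is a red flag. A real pipeline would pull the six
# per-eye landmarks from a face-landmark detector such as dlib; here
# the values are hard-coded stand-ins.
import numpy as np

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmarks ordered around the eye contour."""
    eye = np.asarray(eye, dtype=float)
    v1 = np.linalg.norm(eye[1] - eye[5])   # vertical distance 1
    v2 = np.linalg.norm(eye[2] - eye[4])   # vertical distance 2
    h = np.linalg.norm(eye[0] - eye[3])    # horizontal distance
    return (v1 + v2) / (2.0 * h)           # drops toward 0 as the eye closes

def blink_count(ear_per_frame, threshold=0.2, min_frames=2):
    """Count dips of EAR below threshold lasting >= min_frames frames."""
    blinks, run = 0, 0
    for ear in ear_per_frame:
        if ear < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    return blinks + (1 if run >= min_frames else 0)

# Fabricated 10-second trace at 30 fps with a single blink around frame 100.
ears = [0.30] * 100 + [0.12] * 4 + [0.30] * 196
rate = blink_count(ears) / (len(ears) / 30) * 60
print(f"{rate:.0f} blinks/min")   # humans average roughly 15-20; near 0 is suspicious
```

Checks like this are an arms race, though: once a telltale flaw is published, the next generation of fakes learns to avoid it, which is why such detection may not stay easy for long.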

Right now, people are talking about changing copyright laws and raising public awareness.

Human nature being what it is, I don’t think any of this will work; there will always be some person, group, or entity willing to break the rules and exploit the situation.

My favorite solution involves the creation of digital authentication standards, possibly built on one or more trusted blockchain ledgers, which journalists, social media platforms, and the general public could turn to in order to authenticate digital content.
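As a rough sketch of how such a standard could work: hash the footage at capture time, record the fingerprint on a trusted, append-only ledger, and let anyone who receives a clip re-hash it and check for a match. The in-memory list below is a stand-in for a real blockchain or notary service, which this sketch deliberately does not implement:

```python
# Sketch of ledger-based content authentication: register a SHA-256
# fingerprint of the original footage, then verify any clip by
# re-hashing it and looking for a matching ledger entry. The `ledger`
# list is a stand-in for a real append-only blockchain or notary.
import hashlib
import json
import time

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

ledger = []   # stand-in for a tamper-evident, append-only ledger

def register(data: bytes, source: str) -> dict:
    entry = {
        "sha256": fingerprint(data),
        "source": source,
        "timestamp": time.time(),
    }
    ledger.append(entry)          # a real system would anchor this on-chain
    return entry

def verify(data: bytes) -> bool:
    h = fingerprint(data)
    return any(entry["sha256"] == h for entry in ledger)

original = b"...raw video bytes straight from the camera..."
register(original, source="press-pool-camera-3")   # hypothetical source label

tampered = original + b" one swapped frame"
print(verify(original))    # True  -- matches the registered fingerprint
print(verify(tampered))    # False -- any edit changes the hash entirely
print(json.dumps(ledger[0], indent=2))
```

The catch, of course, is adoption: a scheme like this can only vouch for content whose creators registered it at the source in the first place.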

In the meantime, don’t believe everything you see.

Pascal Antoine – CoolDigerati
