When people think of harmful Artificial Intelligence, they usually picture a Rise of the Machines scenario in which robots become so intelligent that they take over everything. The truth is that there are many other ways AI can be harmful, and if you’ve heard about deepfakes then you’ll know exactly what I’m talking about.
What are deepfakes?
The word “deepfake” is a combination of “deep learning” and “fake”, the two primary elements of what is known as human image synthesis. Technical terms aside, deepfakes are essentially videos in which one person’s face or image is superimposed onto another person’s body or footage. The people who make these videos use a machine learning technique called a “generative adversarial network” (GAN), in which two neural networks compete: a generator produces fake images while a discriminator tries to tell them apart from real ones, and each improves by trying to beat the other.
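The adversarial setup can be sketched with a toy example. This is a minimal, illustrative 1-D GAN with hand-derived gradients, not a real image model: the “real data” here is just a Gaussian distribution, the generator is a linear map, and the discriminator is a logistic regressor. Real deepfake systems use deep convolutional networks, but the training loop has the same shape.

```python
import numpy as np

# Toy 1-D GAN sketch (illustrative only; real deepfakes use deep conv nets).
# "Real" data: samples from N(4, 1). Generator: x = a*z + b with z ~ N(0, 1).
# Discriminator: D(x) = sigmoid(w*x + c). Gradients are derived by hand.

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

a, b = 1.0, 0.0          # generator parameters (scale, shift)
w, c = 0.1, 0.0          # discriminator parameters
lr, batch = 0.05, 64

for step in range(2000):
    # --- discriminator update: push D(real) -> 1 and D(fake) -> 0 ---
    real = rng.normal(4.0, 1.0, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    grad_w = np.mean(-(1 - d_real) * real + d_fake * fake)
    grad_c = np.mean(-(1 - d_real) + d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # --- generator update: push D(fake) -> 1 (non-saturating loss) ---
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b
    d_fake = sigmoid(w * fake + c)
    grad_a = np.mean(-(1 - d_fake) * w * z)
    grad_b = np.mean(-(1 - d_fake) * w)
    a -= lr * grad_a
    b -= lr * grad_b

samples = a * rng.normal(0.0, 1.0, 1000) + b
print("generated mean:", samples.mean())  # should drift toward the real mean of 4
```

The key point is that neither network is ever shown an explicit recipe for “realness”: the generator only ever sees the discriminator’s gradient, which is exactly why the fakes keep improving as long as the discriminator can still find flaws.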
https://youtu.be/gLoI9hAX9dw
As noted in the video above, deepfakes have been banned from major websites but remain quite easy to make. The ban may not count for much in this day and age, when people can use platforms such as WhatsApp to send whatever they want without any of the responsibility being traced back to them.
How harmful can this get?
Deepfakes have been used to create fake x-rated celebrity videos, as well as adult content made for revenge purposes, and with enough images of a person it can become very difficult to distinguish what’s real from what’s fake. Famous actress Scarlett Johansson recently said that trying to fight deepfakes is a lost cause:
Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body.
In an era when fake news is already rampant, this could be the next big thing that takes fake news into an ungovernable stratosphere. If left unchecked, the damage could be devastating.
Can tech ever be governed before the fact?
The biggest problem with technology and the laws surrounding it is that regulation is usually made after the fact. Something disastrous usually has to happen before action is taken, because most regulators are out of touch with these technologies while they are still building up.
What’s wrong and what’s right?
Another problem that will definitely arise is determining which deepfakes were intentionally meant to cause chaos and which were meant as satire. It’s a blurry line, and one can see how confusing this might get down the road.
How can one spot a deepfake?
At the time of writing, deepfakes have not yet been perfected. The eyes have been the weak link in most of these videos: they tend to look lifeless because they are generated from still images. The eyes don’t blink, and the face generally has some jerky movements.
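The missing-blink cue can even be checked programmatically. The sketch below assumes you already have a per-frame eye-aspect-ratio (EAR) series from some facial landmark detector; the EAR series here is simulated, and the threshold and blink rates are illustrative assumptions, not calibrated values. Real people blink several times a minute, so footage whose eyes never close is a red flag.

```python
import numpy as np

# Hypothetical blink-rate check. Assumes a per-frame eye-aspect-ratio (EAR)
# series is already available from a landmark detector; we simulate one here.
# EAR is high (~0.3) for open eyes and drops (~0.05) when the eye closes.

def count_blinks(ear, closed_thresh=0.2):
    """Count open-to-closed transitions in an EAR series."""
    closed = ear < closed_thresh
    return int(np.sum(~closed[:-1] & closed[1:]))

def blinks_per_minute(ear, fps):
    return count_blinks(ear) * 60 / (len(ear) / fps)

fps, seconds = 30, 60
rng = np.random.default_rng(1)

# Simulated real footage: open eyes with a 3-frame blink roughly every 4 s.
real_ear = 0.3 + 0.01 * rng.standard_normal(fps * seconds)
for start in range(10, fps * seconds, 4 * fps):
    real_ear[start:start + 3] = 0.05

# Simulated deepfake built from still photos: the eyes never close.
fake_ear = 0.3 + 0.01 * rng.standard_normal(fps * seconds)

print("real footage:", blinks_per_minute(real_ear, fps), "blinks/min")
print("deepfake:    ", blinks_per_minute(fake_ear, fps), "blinks/min")
```

A blink rate near zero doesn’t prove a video is fake on its own, but combined with the other tells (lifeless eyes, jerky facial movement) it is a cheap first-pass signal.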
According to Komando, a site that covered a deepfake incident involving a 40-year-old woman in Australia, human psychology itself acts as a red flag when it comes to these videos:
Another factor that causes deepfake rendering flaws is human psychology itself. Similar to other animation programs, you can’t just cobble together a large number of snapshots and have software and artificial intelligence perfectly mimic the personality and respective idiosyncrasies of a human being.
Hopefully, a solution can be worked out before the deepfake revolution causes a storm and these irresponsible fake videos pose serious threats to people’s lives and livelihoods…
2 comments
Dark side? Have you not seen Terminator?? These AIs will end us all!!!!! But yeah, this one is out of the archives. From what I came across a little while ago, the main solution that has managed to keep up is, interestingly enough, AI. Researchers have been training neural networks to spot deepfakes with great success. I forget the source now, but I do remember the AI was capable of spotting deepfakes through many successive levels of video compression, very handy in light of social media sharing.
So at least as long as you have the compute time, data and processing power, you can spot fakes reliably. The problem is, it will always be reactionary unless the EU and other like-minded internet tyrants legislate deepfake filtering between everyone and the net.
I can implement the same program. I have it right now on my PC.