Should we worry about DeepFake?

Apr 11, 2023 | News, Security

What can you believe?

For years we’ve told you that you can’t believe every email you read.

But now you can’t believe every photo or video you see either.

Deepfake is the term for a picture, video or audio file that has been altered to put a different person into the content, generally with illicit intent.  This isn’t the obvious, usually humorous edit where you paste the face of one of your friends onto someone else’s body.

Instead, it’s where an image or audio track is deliberately manipulated to deceive. Deep-learning AI tools make it much easier for almost anyone to access the technology and invent a story.

Examples range from politics (although these images of the former US President are pretty obviously fake – AI-generated images of Donald Trump arrest reveal scary future – NZ Herald) to simple fun – Examples of Deepfake Technology That Didn’t Look Very Fake (inspiredelearning.com).  Some of the most alarming uses are extortion, and there have been recent cases involving pornography.

Tools like Reface, Deepswap and FakeApp make it fairly easy for anyone to create these fakes.

At this time, all of the tools we have seen are limited to creating files rather than faking content in real time.  That means the person you are talking to on a video call almost certainly is genuine, but who knows how long it will take before the technology allows real-time faking?

The real concern for business is being scammed.  Can you trust that the person you are seeing or hearing on your screen is really them?  Can you be confident that any instructions given on such a call are legitimate?

The technology is imperfect, and the clues to look for include lighting changes, unusual facial expressions or freezing, poor lip-syncing and so on.  But as with phishing emails, the fakes will only become harder and harder to spot.
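For the technically curious, here is a very rough illustration of the “lighting changes” clue: measure the average brightness of each frame of a video and flag sudden jumps. This is only a toy sketch, assuming the OpenCV (cv2) Python library is installed, and “suspect.mp4” is a placeholder file name; real detection tools are far more sophisticated than this.

import cv2

def lighting_jumps(path, threshold=25.0):
    """Return indices of frames whose average brightness jumps sharply from the previous frame."""
    cap = cv2.VideoCapture(path)
    prev_mean = None
    flagged = []
    index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Convert to greyscale and use the mean pixel value as a crude measure of lighting
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        mean = float(gray.mean())
        if prev_mean is not None and abs(mean - prev_mean) > threshold:
            flagged.append(index)
        prev_mean = mean
        index += 1
    cap.release()
    return flagged

# "suspect.mp4" is a hypothetical file name used for illustration only
print(lighting_jumps("suspect.mp4"))

A genuine video can of course contain real lighting changes, so a check like this only points at frames worth a closer look by a human.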

Deepfake: “Nixon moon landing disaster”

Deepfake: Is this Morgan Freeman?

There are tools to help detect fakes, like the ‘Microsoft Video Authenticator’ that debuted just before the last US election (an obvious catalyst for this technology), but remember that, unfortunately, the very tools that detect fakes can be reused to create them.

For more information, one of the best posts we’ve seen is: What are Deepfakes, Their Threats, and How to Avoid Them? (privacyaffairs.com)