How does AI contribute to “Deepfakes” and what are they?

Category: Blog, Security  |  Friday, July 5th, 2019

Ahh, machine learning…what a beautiful thing: algorithms and statistical models that let computers perform tasks without explicit, step-by-step instruction. Beautiful, yes, but also potentially dangerous, as the world is beginning to discover with the take-off and increasing popularity of "deepfakes."

What are Deepfakes?

Deepfakes are videos and images that have been digitally manipulated, using machine-learning (AI) techniques, to overlay existing images and audio onto source material.

Face-swapping is a good example, and it may be something you've already tried with a friend on Snapchat, where both faces are cut out and transposed. While some of these swaps are innocent and may literally have you on the floor laughing, the development of this technology carries serious implications.
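To make the idea concrete, here is a minimal sketch of that crude "cut out and transpose" style of face swap, written in Python with OpenCV. This is only an illustration of the concept, not a real deepfake pipeline, and the file names face_a.jpg and face_b.jpg are hypothetical placeholders:

```python
# Minimal face-swap sketch (illustrative only, not a deepfake pipeline).
# Assumes OpenCV (opencv-python) is installed and that two local images,
# "face_a.jpg" and "face_b.jpg", exist -- both names are placeholders.
import cv2

# Haar cascade face detector bundled with OpenCV
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def first_face(img):
    """Return (x, y, w, h) of the first detected face, or None."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces[0] if len(faces) else None

img_a = cv2.imread("face_a.jpg")
img_b = cv2.imread("face_b.jpg")

box_a, box_b = first_face(img_a), first_face(img_b)
if box_a is not None and box_b is not None:
    xa, ya, wa, ha = box_a
    xb, yb, wb, hb = box_b
    # Crop face B, resize it to face A's bounding box, and paste it over face A.
    face_b = cv2.resize(img_b[yb:yb + hb, xb:xb + wb], (wa, ha))
    img_a[ya:ya + ha, xa:xa + wa] = face_b
    cv2.imwrite("swapped.jpg", img_a)
```

Real deepfake tools go far beyond this crude paste: they train deep neural networks on many frames of a subject so the swapped face matches expression, lighting, and speech convincingly.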

The future ramifications point to a deeper erosion of trust in media outlets and a threatened ability to determine what is real. Instead of just reading the breaking "fake news" story, we can now watch a deepfake video with superimposed audio and visuals that will only further polarize people's opinions and views (even if those views are entirely incorrect).

While this technology was once used comedically at celebrities' expense, in a politically charged nation the implications take on a new level of seriousness. AI being used to swap political figures' words and images in this palpably tense atmosphere?

Unfortunately, deepfake images don't even need source material; they can be created from scratch with only "modest investments" in machine learning and computing capacity. The repercussions go far beyond defamation of name and character and branch into risks to national security and the U.S. government, or any government for that matter. Even when a video or photo is later "debunked," there are lasting effects.

So how do we "fight the fakes"? One recently developed tool (and many more are in development to combat this outrageous problem) comes from the USC Information Sciences Institute and can detect fakes with up to 96 percent accuracy using subtle head and face movements and what USC calls unique video "artifacts." These artifacts are media distortions caused by compression, but also by video manipulation: a thumbprint, if you will, or at least an indication that the video is in fact fake.
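As a rough illustration of the "artifact" idea, the sketch below recompresses a frame and measures where the error level changes most, in the spirit of Error Level Analysis. This is only a simplified stand-in for research-grade detectors like the USC tool, not a reimplementation of it, and the file names are hypothetical placeholders:

```python
# Simplified artifact check inspired by Error Level Analysis (ELA).
# Regions that were pasted in or re-encoded separately often show a
# different error level than the rest of the frame.
import cv2
import numpy as np

img = cv2.imread("suspect_frame.jpg")  # placeholder file name

# Re-encode at a known JPEG quality and decode again.
ok, buf = cv2.imencode(".jpg", img, [cv2.IMWRITE_JPEG_QUALITY, 90])
recompressed = cv2.imdecode(buf, cv2.IMREAD_COLOR)

# Per-pixel difference between the original and the recompressed frame.
residual = cv2.absdiff(img, recompressed).astype(np.float32)
error_map = residual.mean(axis=2)

print("mean error level:", error_map.mean())
print("max error level:", error_map.max())

# Save a normalized error map; unusually bright patches warrant a closer look.
cv2.imwrite("error_map.png",
            cv2.normalize(error_map, None, 0, 255,
                          cv2.NORM_MINMAX).astype(np.uint8))
```

A bright, localized patch in the error map does not prove manipulation on its own, but it is the kind of compression fingerprint that more sophisticated detectors build on.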

Another form of defense? Use common sense and do your research: don't take a single video or photo as truth. With so many news feeds, outlets, and ways to get our information, there now comes the need to vet your sources. Luckily, because sources are abundant, vetting is a viable possibility, and we recommend it.

We can never be too careful, and it's become increasingly likely you'll stumble across a highly believable deepfake. So we wish you all good luck as we navigate this rough sea of new information; there doesn't seem to be a break in the storm.

If you need assistance with IT Managed Services give EnhancedTECH a call at 714-970-9330 or contact us at [email protected]

–Emmy Seigler

Samantha Keller

Director of Marketing and PR at EnhancedTECH
Samantha Keller (AKA Sam) is a published author, tech-blogger, event-planner, and mother of three fabulous humans. Samantha has worked in the IT field for the last fifteen years, intertwining a freelance writing career with technology sales, events, and marketing. She began working for EnhancedTECH ten years ago after earning her Bachelor's degree from UCLA and attending Fuller Seminary. She is a lover of kickboxing, extra-strong coffee, and Wolfpack football. Her regular blog columns feature upcoming tech trends, cybersecurity tips, and practical solutions geared towards enhancing your business through technology.
