18 April, 2022

The dangers of deepfake technology

By now you’ve probably heard of deepfake technology, or seen it in action without knowing what it’s called. Whether you’ve used an app like Reface to put your own face into random Star Wars scenes, or seen the recent viral video of Tom Cruise leapfrogging Keegan-Michael Key, most of us have experienced this relatively new technology, even if we weren’t aware of how these images and videos were created.

While on the surface this all seems like harmless fun, the implications of this technology as it continues to evolve pose a huge risk. What happens when someone uses it to impersonate world leaders or other people of influence? The dangers are wide-ranging and represent a new threat in the world of cybersecurity. So how can we protect ourselves? How can we tell what’s real and what’s not?

What is deepfake technology?

First off, what even is deepfake technology? Put simply, it’s Photoshop for video. Using a form of artificial intelligence called deep learning, this technology maps out a person’s face and seamlessly places it onto another person. While faking content is nothing new (people have been swapping faces in images for years), deepfakes take it further by using artificial intelligence and machine learning to generate content at scale. With this technology, images and videos can be mass-produced with little manual effort. In the past, manipulating even a single image in Photoshop took some level of skill and effort; today, long-form videos can be manipulated at the click of a button.

This technology has potential beyond manipulating images and video, too. A related technology, called voice cloning or synthetic voice, uses AI to generate a clone of a person’s voice. Using audio recordings of the person you intend to replicate, the AI measures everything from tone and accent to mannerisms and speech patterns, creating a nearly indistinguishable voice clone.

Why is this dangerous?

At first glance, this seems like a fun, even useful tool. Even big-budget television shows like The Mandalorian have used deepfake technology to bring back younger versions of beloved characters. But the technology’s benefits are minimal compared to the dangers it invites. It has already been used for pornography, fake news, hoaxes, bullying, financial fraud, and more.

Recently, a deepfake of Ukrainian President Volodymyr Zelensky calling on his soldiers to lay down their weapons appeared on a hacked Ukrainian news website and went viral before it was exposed as a fake. While the exact damage from this incident remains unknown, it demonstrated one of the most dangerous applications of this technology. If the video hadn’t been quickly flagged as fake, it could have cost lives and changed the course of the conflict.

On a more personal level, this same technology can be used as a form of phishing in the workplace. Phishing is most common in text form, with attackers impersonating your boss or a coworker, but hackers can now hop on a Zoom call with you, impersonate your boss, and convince you to purchase items or give up sensitive information.

What can you do?

The simple answer: always approach content on the internet with a degree of skepticism. While this has been standard advice for many years now, the potential for misinformation spread by deepfake technology has only amplified the need to browse with a general sense of suspicion.

To safeguard yourself against falling victim to a deepfake scam, the same advice we give about phishing scams holds true. First, check the source or sender information. You’re much more likely to find legitimate information from trusted news outlets than from social media sites. The same goes for videos sent to you via email.

Second, be on the lookout for urgent requests or threats of negative consequences. Scams generally rely on people not taking the time to verify their legitimacy, and the best way to get people to skip that step is to instill a sense of urgency.

Finally, when in doubt, call the organization or person you think is being impersonated. If you get a strange video from your bank manager asking you to send banking information, call your bank and verify that it’s real. If your boss asks you to do something that seems off, shoot them an email, call them, or otherwise verify that the request is legitimate.

Learn more about how to spot a phishing scam here!

It’s not all bad

While we want you to be vigilant and prepared in case you’re targeted with deepfake technology, we also want to emphasize that it isn’t just an evil tool used for horrible purposes. As with most technology, it has the potential for misuse, but it also has the potential for some pretty amazing things. One example is Project Revoice, a non-profit aiming to give voices back to people who have lost the ability to speak because of ALS. With this technology, people with ALS can type out what they would like to say and have it played back exactly as they would have said it, ensuring no one is ever fully robbed of their own voice.

Interested in other developing technologies? Have a question about how you can better protect yourself from deepfake or phishing scams? Contact us today to speak to one of our tech experts at Tech Superpowers and learn how we can help protect and prepare your organization for the future.
