Forcepoint 2020 Predictions: Deepfakes will increase the effectiveness of ransomware and be used to interfere with elections

VMblog Predictions 2020 

Industry executives and experts share their predictions for 2020.  Read them in this 12th annual VMblog.com series exclusive.

By Audra Simons, Director of Innovation, Forcepoint X-Labs

Deepfakes will increase the effectiveness of ransomware and be used to interfere with elections

Over the last two years we have seen the popularity of an application that accepts a photograph as input, applies various machine learning algorithms to it and outputs an image showing an aged version of that individual, amongst other filters.  Samsung researchers took this one step further by building the capability to derive a reasonably realistic video from just one still image of a subject. These capabilities demonstrate the power and appeal of the current fascination with human image synthesis.

The term "deepfake" was coined in 2017 and refers to fake videos created using deep learning techniques.  We expect deepfakes to make a notable impact across all aspects of our lives in 2020 as their realism and potential increase.  Our prediction is fourfold:

  1. Ransomware authors will send targeted deepfakes to their victims. Recipients will see realistic videos of themselves in compromising situations and will likely pay the ransom demand to avoid the threat of the video being released into the public domain.
  2. It is well known that Business Email Compromise/Business Email Spoofing has cost businesses billions of dollars as employees fall for the scams and send funds to accounts controlled by cybercriminals.  In 2020, deepfakes will be used to add a further degree of realism to the request to transfer money.
  3. We have already seen deepfakes in the political arena in 2019.  With the United States presidential election due in November 2020, we expect deepfakes to be leveraged as a tool to discredit candidates and push inaccurate political messages to voters via social media.
  4. We will see deepfakes-as-a-service move to the fore in 2020 as deepfakes become widely adopted for both entertainment and malicious purposes.

Scammers will continue to be successful as they adjust their social engineering techniques. It is not realistic to expect every employee or member of the public to recognise a deepfake, especially as their realism advances with the technology.

Incorporating deepfakes into employee cybersecurity awareness programs can help raise the bar that scammers must reach in order to conduct a successful scam. Extra checks at a process level (e.g. for money transfers) can help identify unusual activity associated with Business Email Compromise (BEC) / Business Email Spoofing (BES) scams.  Also consider web security and email security solutions to prevent interaction with initial lures.

##

About the Author

Audra Simons 

Audra Simons is the Director of Forcepoint Innovation Labs. Under Audra's leadership, Forcepoint Innovation Labs produces cutting-edge technology and service innovations. Her business unit's primary purpose is to create products and services that will drive the future of cybersecurity. Audra instills a culture of innovation and partners closely with customers and internal leaders to drive innovations that solve critical business problems. Before joining Forcepoint in 2017, Audra was Head of Innovation at FluidOne, a data delivery network and mobile service provider. Prior to that, she held various roles at global and national service providers, driving and delivering innovation, products, rationalization and best practices.

Published Friday, November 22, 2019 7:41 AM by David Marshall