Virtualization and Cloud executives share their predictions for 2013. Read them in this VMblog.com series exclusive.
Contributed article by Tal Klein, senior director of products at Bromium
Security Predictions 2013 - Malware Cross-Pollination and Next-Generation Virtualization
1. Advanced persistent threats
will adopt antigenic shifts and, akin to the Avian and Swine flus, jump
species. We will see malware that begins life on one platform, or OS, and then
hops on to another.
Syncing data from one device to another has already transcended the backup use case. As
applications move beyond "living" on a single device, the likelihood of
targeted malware taking advantage of syncing in order to move from unprivileged
devices onto privileged ones becomes very real. In this post-Stuxnet world
where malware propagates from Windows PC laptops to Siemens S7-300 manufacturing control systems through exploits
in the Siemens control software, is it that much of a leap to imagine that
malware can propagate from phone to laptop and from laptop to tablet through
operating system and application vulnerabilities?
If enterprises continue to rely
exclusively on technologies that try to detect malware as the mechanism for
removing it, they're dooming themselves. These types of attacks can remain
dormant for months before reaching their intended targets, and removing them
after they strike may be pointless as the damage has already been done.
2. We will begin to hear
industry chatter about "smart data" that will, among other things, keep
in-depth provenance metadata about its interactions and whereabouts, and be
responsible for making dynamic, intelligent decisions for itself regardless of
the device or file system where it resides.
This prediction is similar to my previous one because I don't believe MDM's reach will
ever be all-encompassing. Sooner rather than later, IT teams will awaken to the
fact that sensitive data is what they should be focused on protecting, not
devices, operating systems, or even applications.
Technologies like DLP and MIM are non-starters. They require someone to identify the data as
sensitive via policies that are then propagated via a centralized
infrastructure to components on every end-point and in every app that then
understand and respect the policy. In general, the way these things end up
getting deployed is that more data than necessary is "protected" and it impacts
users' ability to get work done, driving them to find ways around the policy -
and find ways they will.
2013 needs to be the year data begins the journey toward becoming cognizant of its
sensitivity, context and provenance without reliance on connectivity, operating
systems, or administrative action. I think this can be accomplished with
virtualization if we're willing to accept that, just as the first generation of
virtualization abstracted operating systems from hardware, the next generation
of virtualization must abstract data provenance from file and operating systems.
I don't think we "get there" in 2013, but it's when the idea begins to gain significant traction.
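As a thought experiment, the "smart data" idea above can be sketched in a few lines of Python. This is purely illustrative: `SmartData`, its sensitivity labels, and the `.corp` host check are hypothetical names invented for this sketch, not any shipping product or proposed standard.

```python
import hashlib
import time


class SmartData:
    """Toy illustration: a payload that carries its own provenance
    log and sensitivity label wherever it travels, and makes access
    decisions for itself without a central policy server."""

    def __init__(self, payload: bytes, sensitivity: str = "public"):
        self.payload = payload
        self.sensitivity = sensitivity
        self.provenance = []  # append-only log of interactions
        self._record("created", "origin-host")

    def _record(self, action: str, host: str):
        self.provenance.append({
            "action": action,
            "host": host,
            "time": time.time(),
            "digest": hashlib.sha256(self.payload).hexdigest(),
        })

    def read(self, host: str) -> bytes:
        # The data itself decides whether a read is allowed,
        # regardless of which device or file system it sits on.
        if self.sensitivity == "restricted" and not host.endswith(".corp"):
            self._record("read-denied", host)
            raise PermissionError(f"{host} may not read restricted data")
        self._record("read", host)
        return self.payload


doc = SmartData(b"Q3 financials", sensitivity="restricted")
doc.read("laptop-42.corp")       # allowed, and logged in the provenance trail
try:
    doc.read("personal-phone")   # denied, and the denial is logged too
except PermissionError:
    pass
# doc.provenance now holds three events: created, read, read-denied
```

The point of the sketch is the inversion: policy and history travel with the data object itself rather than living in MDM or DLP infrastructure around it.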
3. The end of the "Signature
Era". 2013 will be the year when detection as a mechanism for protection shifts
from commodity to extinction.
The industry's standard method for determining which company's information security technology
is better than another's is to run various tests that measure how good each
program is at detecting malware - it's essentially a signature arms race. That methodology
has expired. It is plainly obvious that next-generation information and
infrastructure attacks are becoming undetectable. Advanced persistent threats
contain multiple payloads, targeting more than one vulnerability and engaging
whitelisted vectors that prey on our org structures and social relationships. As
detection-based tools turn up their sensitivity in a vain attempt to keep up
with these new attacks, they also increase the rate of false positives, as we've
seen recently when more than one vendor misidentified and quarantined essential
applications (in some cases their own agents) as malware.
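The arms race described above can be reduced to a toy sketch: an exact-match signature database catches only the samples already catalogued, and a one-byte mutation walks right past it. `KNOWN_BAD` and `is_flagged` are illustrative names for this sketch, not any vendor's API.

```python
import hashlib

# Hypothetical signature database: SHA-256 hashes of known-bad samples.
KNOWN_BAD = {
    hashlib.sha256(b"malicious-payload-v1").hexdigest(),
}


def is_flagged(sample: bytes) -> bool:
    """Flag a sample only if its hash exactly matches a known signature."""
    return hashlib.sha256(sample).hexdigest() in KNOWN_BAD


original = b"malicious-payload-v1"
mutated = b"malicious-payload-v1 "  # one appended byte

assert is_flagged(original)      # the catalogued sample is caught
assert not is_flagged(mutated)   # a trivial mutation evades the signature
```

Real products use far more than exact hashes, but the underlying dynamic is the same: every heuristic loosened to catch mutated payloads also widens the net for false positives.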
Just as chain mail became obsolete in the age of gunpowder, the enemy's new
weapons will beget a new class of tools that do not rely on detection in order to
protect. At a crime scene, detectives are the people you call after
the crime to determine what happened; they aren't the people you put in place
to prevent the crime before it takes place. To that end, I believe detection
will continue to have utility as a mechanism for attack forensics, but not as a mechanism for protection.
About the Author
Tal Klein is Senior Director of
Products at Bromium. Previously, he managed technical marketing for the Desktops
& Apps Group and integrated product strategy at Citrix where he developed
cross-platform technologies focused on virtualization, autonomic computing and
cloud. Prior to Citrix he led the Technical Marketing team at NetScaler (which
was acquired by Citrix). Tal has also spent over a decade in the hosted
datacenter industry developing managed cloud services. He is the author of several
research papers and pending patents.