An estimated
40 zettabytes (43 trillion gigabytes) of data will be created by 2020 (an
increase of 300 times the amount of data in circulation in 2005), at a rate of
2.5 quintillion bytes per day. "The explosion in big data is both
good news and bad news," says James D'Arezzo, CEO of Condusiv Technologies. "It
will make possible advances in dozens of fields, but it will also present
serious challenges to the economy's already overburdened IT sector." D'Arezzo,
whose company is a world leader in I/O
reduction and SQL
database performance, adds,
"Collecting data won't be much of a problem. The real issue will be storage,
accessibility and performance."
A
major contributor to the boom in data collection and analysis, notes D'Arezzo,
is the Internet of Things (IoT), through which devices such as refrigerators,
automobiles, home security systems, and health monitors communicate with one
another and with computer systems. According to experts in the data center
services industry, the next decade will be an inflection point in digitization,
in which hyperscale cloud providers will mushroom to meet the demand created,
in part, by the IoT.
One
field being heavily impacted by both the IoT and the need for big data
analytics is healthcare. More digital tools are being brought into health IT
ecosystems for both patients and clinicians to use. More medical images, which
take up a tremendous amount of storage space, are also being produced. Human
genome sets alone consist of hundreds of gigabytes, and the amount of sequence
data is doubling every seven to nine months. Meanwhile, many healthcare
institutions cannot afford the cost or lack the space to add enough physical
servers to keep up with the demands of big data.
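To illustrate what a seven-to-nine-month doubling period implies, the rough projection below compounds an assumed starting volume over several years; both the 500 GB starting figure and the eight-month doubling period are illustrative assumptions, not figures from the sources cited above.

    # Rough projection of sequence-data growth under an assumed 8-month
    # doubling period (the midpoint of the 7-9 month range cited above).
    # The 500 GB starting volume is purely illustrative.

    def projected_size_gb(start_gb: float, months: float,
                          doubling_months: float = 8.0) -> float:
        """Exponential growth: the volume doubles once per doubling period."""
        return start_gb * 2 ** (months / doubling_months)

    if __name__ == "__main__":
        start = 500.0  # hypothetical starting volume for one data set, in GB
        for years in (1, 3, 5):
            size = projected_size_gb(start, months=12 * years)
            print(f"After {years} year(s): ~{size:,.0f} GB")

At that pace, a data set grows by roughly a factor of 180 in five years, which is why adding physical servers one at a time quickly stops being a workable answer.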
In the
business sector, a major across-the-board function, marketing, has gone from
being a largely analog and promotional activity to a heavily digitized means of
delivering business growth. In the process, marketing has come to rival, and
may soon surpass, traditional IT as a center for technology spend. A recent
Gartner Group study suggests that marketing leaders now spend the equivalent of
3.24% of total enterprise revenue on technology, just barely behind the current
CIO technology spend of 3.4% of revenue.
Social
media companies, most of whose business models are centered around the sale and
use of data analytics, are both major users of, and investors in,
infrastructure designed to keep up with the burgeoning universe of data. It has
been estimated that Facebook, for example, spends more than $1.5 billion
per month on hosting-related costs.
All
told, it is estimated that 86 million U.S. workers now perform jobs that
require the regular use of a computer. A recent survey of global IT
managers suggests that degraded system performance due to slow processing of
database applications costs each of these workers an average of fifteen minutes
per day; at the current median salary of $58,000, that represents
an annual time loss equivalent to $1,812 per worker, or almost $156 billion
across the economy. Given the pressures of the big data boom, that productivity
loss may soon increase dramatically. But it will not increase evenly, and those
who fail to recoup endangered productivity will soon face a steepening
competitive disadvantage.
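The per-worker and economy-wide figures above follow from a straightforward calculation, sketched below; the 2,080-hour work year and 260 working days are standard assumptions that do not appear in the survey itself.

    # Reproducing the productivity-loss figures cited above. Only the worker
    # count, median salary, and minutes lost per day come from the text; the
    # 260-day, 2,080-hour work year is a standard assumption.

    WORKERS = 86_000_000        # U.S. workers who regularly use a computer
    MEDIAN_SALARY = 58_000      # median salary, USD per year
    MINUTES_LOST_PER_DAY = 15   # time lost to slow database applications
    WORK_DAYS = 260             # assumed working days per year
    WORK_HOURS = 2_080          # assumed paid hours per year (260 x 8)

    hourly_rate = MEDIAN_SALARY / WORK_HOURS                  # ~$27.88 per hour
    loss_per_worker = hourly_rate * (MINUTES_LOST_PER_DAY / 60) * WORK_DAYS
    loss_economy_wide = loss_per_worker * WORKERS

    print(f"Per worker:   ~${loss_per_worker:,.0f} per year")              # ~$1,812
    print(f"Economy-wide: ~${loss_economy_wide / 1e9:.0f} billion per year")  # ~$156 billion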
"To
keep up, not just with the overall demands of big data but with their nimblest
competitors, organizations will have to get the most out of their IT
infrastructure," says D'Arezzo. "Part of the solution may be buying new hardware,
but there are quicker and more cost-effective things IT system managers can do.
One is to implement software that reduces input/output (I/O) operations, which can improve
performance dramatically. We are the world leader in this area, and we have
seen users of our software solutions more than double the I/O capability of
storage and servers, including SQL servers, in their current configurations."