
Industry executives and experts share their predictions for 2018. Read them in this 10th annual VMblog.com series exclusive.
Contributed by Kaladhar Voruganti, VP, Technology Innovation and Senior Fellow at Equinix
8 Predictions for IT
IoT, blockchain,
Artificial Intelligence and data sovereignty are but a few prominent industry
game changers we see proliferating in 2018. We connected with Kaladhar
Voruganti, VP, Technology Innovation and Senior Fellow at Equinix, a company
that stands at the interconnection junction between network providers, cloud
providers and the enterprise, for a unique perspective on IT trends and where
things are headed.
Private blockchain networks take hold and multiply
The public
blockchain at the heart of various cryptocurrencies is very different from the
private blockchain we see appealing to more companies in 2018. The key
difference is that private blockchain networks are "permissioned," rather than
open, so they offer more secure digital identity management, greater levels of
trust and far greater transactional throughput. According to "Top Trends in the Gartner Hype Cycle for Emerging
Technologies, 2017," "of the two types of blockchain - permissionless-public
ledgers and permissioned-public ledgers - enterprises should look
toward the latter option. Permissioned-public ledgers have access controls
owned/managed by rules, but still allow for a community. For commercial transactions,
companies might look to permissionless-public ledgers such as bitcoin, which
allows unknown or untrusted users to access the ledger."
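To make the distinction concrete, here is a minimal Python sketch of a permissioned ledger node, a hypothetical illustration rather than any particular blockchain product: writers must appear on an access-control list before their transactions are appended, whereas a permissionless ledger would accept any well-formed transaction from anyone.

import hashlib
import json
import time

class PermissionedLedger:
    """Toy append-only ledger that enforces a membership (access-control) list."""

    def __init__(self, allowed_members):
        self.allowed_members = set(allowed_members)  # the "permissioned" part
        self.chain = []                              # ordered list of blocks

    def submit(self, member_id, payload):
        # Permissioned: reject any writer that is not a known, vetted member.
        if member_id not in self.allowed_members:
            raise PermissionError(f"{member_id} is not a member of this network")
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        block = {"member": member_id, "payload": payload,
                 "timestamp": time.time(), "prev_hash": prev_hash}
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()
        self.chain.append(block)
        return block

ledger = PermissionedLedger(allowed_members={"bank-a", "bank-b", "supplier-c"})
ledger.submit("bank-a", {"asset": "invoice-42", "action": "transfer"})   # accepted
# ledger.submit("outsider", {"asset": "x"})   # would raise PermissionError

Because every writer is known in advance, permissioned networks can use lighter-weight consensus than public proof-of-work chains, which is a large part of why they can sustain higher transactional throughput.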
Organizations that
provide blockchain networks for various business verticals will need to host
blockchain ledger nodes in multiple distributed locations to ensure low
latency. In addition, enterprises in some private blockchain networks need to
simultaneously learn about the completion of time-sensitive blockchain
transactions. This is enabled by bringing enterprises together inside
developing blockchain ecosystems. Finally, at any given point in time,
enterprises will be involved in multiple blockchain networks (e.g. supply
chain, finance, etc.), and they will want their business systems to be located
close to these different blockchain network nodes.
Artificial Intelligence-based applications surge into the mainstream
Artificial
Intelligence (AI) technology has existed for six decades, but it's just now
going mainstream. Why? The advent of big data, AI-focused processors and deep-learning
algorithms has enabled AI to power advances like smart homes, factories and
cars, which have broad appeal. In September 2017, IDC forecasted that worldwide revenues for cognitive and AI
systems in 2017 would reach $12.0 billion, and we see it making even deeper
inroads into the mainstream in 2018.
But AI systems need
to interpret and fuse data from multiple sources, and they also need to be
distributed, with model building happening in clouds and model deployment happening
at edge locations to satisfy real-time processing requirements. Regulatory
bodies are also becoming more interested in ensuring AI apps comply with
security and data residency laws.
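As a rough sketch of that cloud-to-edge split, the snippet below assumes a generic scikit-learn-style workflow; the data, model choice and file name are illustrative. The model is built centrally where the aggregated training data lives, serialized, and then loaded at an edge location for low-latency inference.

# Cloud side: build the model where the aggregated training data lives.
from sklearn.linear_model import LogisticRegression
import joblib

X_train = [[0.1, 0.9], [0.8, 0.2], [0.2, 0.8], [0.9, 0.1]]   # illustrative training data
y_train = [1, 0, 1, 0]
model = LogisticRegression().fit(X_train, y_train)
joblib.dump(model, "model.joblib")            # ship this artifact out to edge sites

# Edge side: load the pre-built model and serve predictions locally,
# so real-time requests never need a round trip back to the cloud.
edge_model = joblib.load("model.joblib")
print(edge_model.predict([[0.15, 0.85]]))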
Internet of Things apps accelerate computing shift to the digital edge
Gartner, Inc. forecasts that 8.4 billion connected things will be in use
worldwide in 2017, up 31 percent from 2016, and will reach 20.4 billion by
2020. In 2018, as these devices multiply, the computation needs at the
heart of the Internet of Things (IoT) will increasingly shift to the digital
edge.
Maintaining low
latency is one of the main reasons companies are moving large amounts of IoT
device data to the edge, where cloud processing and analytics can happen close to the source. But
placing interconnection at the edge will also save on network costs, as
companies filter out volumes of useless IoT data near the source to gain the
faster access to valuable insights needed by IoT-enabled innovations like smart
hospitals. And in growing numbers of regions, data must be processed at the
edge to comply with data residency requirements.
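A minimal sketch of that edge-filtering idea, with hypothetical field names and thresholds: routine readings are discarded near the source, and only the anomalies that analytics cares about are forwarded upstream.

NORMAL_RANGE = (36.0, 38.0)   # hypothetical "nothing to report" band for a sensor

def filter_at_edge(readings):
    """Keep only readings outside the normal band; the rest never leave the edge."""
    low, high = NORMAL_RANGE
    return [r for r in readings if not (low <= r["value"] <= high)]

readings = [
    {"device": "sensor-1", "value": 36.8},   # routine, dropped at the edge
    {"device": "sensor-2", "value": 39.4},   # anomaly, forwarded to analytics
]
to_forward = filter_at_edge(readings)
print(f"forwarding {len(to_forward)} of {len(readings)} readings upstream")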
A new architecture for subsea cable systems becomes prominent
Subsea cables are a
key component of the internet, as nearly all global data traffic touches one,
and investment in new cables is increasing amid massive projected growth in
that traffic. According to "Supply and demand," from the Global
Bandwidth Research Service, TeleGeography, 2017, global subsea cable construction
costs are expected to exceed $2 billion in 2018 for the third straight year,
after not topping $1 billion since 2012. During this building boom, we see a
new architecture taking hold that will improve costs, deployment agility and
interconnection benefits.
Advances in laser
technology have enabled subsea cables to bypass cable landing stations on the
beach and directly land in retail multi-tenant data centers. For us, this means
customers of subsea systems that land inside Equinix data centers that support
this model get direct, low-latency access to the numerous industry ecosystems
we host. That increases a subsea cable system's appeal to its potential
customers.
SDN/NFV technologies transform wide area networks
A fundamental shift
in enterprise acceptance of software-defined networking (SDN) and network
function virtualization (NFV) technologies is underway, and it's changing how
large enterprises architect their wide area networks (WANs) to improve their
access to cloud services. This shift will gain strength in 2018.
Enterprises can no
longer afford to backhaul traffic on expensive MPLS networks from their branch
offices to a centralized location, where they apply network security policies
on physical networking gear before accessing the cloud. Instead, they are
sending their traffic to regional hubs via more cost-effective SD-WAN
technology. There, they are applying security policies on virtual network and
security appliances that leverage NFV technologies.
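The sketch below shows one way such a policy might be expressed, assuming a hypothetical, vendor-neutral SD-WAN controller; the site names and service chain are illustrative, not any specific product's API. Branch traffic rides inexpensive internet transport to a regional hub, where virtualized security functions are applied before cloud egress.

# Hypothetical, vendor-neutral description of an SD-WAN steering policy.
sdwan_policy = {
    "branch_sites": ["branch-nyc-01", "branch-bos-02"],
    "transport": "broadband-internet",      # instead of backhauling over MPLS
    "regional_hub": "hub-us-east",
    "service_chain": [                      # virtual (NFV) appliances applied at the hub
        "virtual-firewall",
        "virtual-ips",
        "cloud-access-gateway",
    ],
    "cloud_destination": "iaas-us-east",
}

def path_for(site, policy):
    """Return the hops a branch site's cloud-bound traffic takes under the policy."""
    if site not in policy["branch_sites"]:
        raise ValueError(f"no policy defined for {site}")
    return [site, policy["regional_hub"], *policy["service_chain"], policy["cloud_destination"]]

print(" -> ".join(path_for("branch-nyc-01", sdwan_policy)))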
IDC's most recent forecast for the worldwide
Datacenter SDN market noted it will expand at a compound annual growth rate of 25.4%
between 2016 and 2021, when it will amount to nearly $13.8 billion.
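As a quick sanity check on those figures, a compound annual growth rate of r over n years scales a starting value by (1 + r)^n, so the implied 2016 base can be worked backwards from the 2021 number:

cagr, years, value_2021 = 0.254, 5, 13.8         # IDC figures above, in $B
implied_2016_base = value_2021 / (1 + cagr) ** years
print(f"implied 2016 market size: ${implied_2016_base:.1f}B")   # roughly $4.5B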
Companies look for a global data center platform to meet emerging data sovereignty and auditing mandates
Several major data
privacy, security and sovereignty regulations will be enacted in 2018, and all
have major implications for businesses:
- The General
Data Protection Regulation (GDPR) restricts data transfer in the
EU to countries that are GDPR-compliant, which could affect transfers
between EU companies and important international business partners.
- Consolidated
Audit Trail (CAT) reporting in the U.S.
requires companies to log every securities transaction and ensure the
accuracy of timing services at the nanosecond level.
- The Markets
in Financial Instruments Directive (MiFID II) in Europe imposes
new reporting requirements and tests on investment firms.
The enforcement of
GDPR and similar data sovereignty laws across the world will make it necessary
for organizations to have data centers in multiple regions in order to store
data locally. CAT and MiFID II laws also require organizations to log financial
transactions at a granular level. This, in turn, makes it necessary for
organizations to have a finely synchronized internal clock system across
multiple data centers.
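The sketch below illustrates the kind of granular, clock-sensitive record these rules point toward, assuming host clocks across data centers are already disciplined by a common time source (for example via PTP); the field names are illustrative, not the actual CAT or MiFID II reporting schema.

import json
import time

def log_transaction(order_id, venue, region):
    """Record a securities transaction with a nanosecond-resolution timestamp
    and a region tag that drives where the record is stored."""
    record = {
        "order_id": order_id,
        "venue": venue,
        "region": region,                 # used to keep the record in-jurisdiction
        "timestamp_ns": time.time_ns(),   # only meaningful if clocks are tightly synchronized
    }
    return json.dumps(record)

print(log_transaction("ord-0001", "exchange-x", "eu-frankfurt"))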
The multicloud trend leaves companies in need of a hybrid IT platform
In "The Future of
the Data Center in the Cloud Era," from Gartner, June 2015, refreshed Sept.
2016,Gartner says "a multicloud strategy will become the common strategy for
70% of enterprises by 2019, up from less than 10% today." Respondents to an IDC survey two years ago put the percentage even higher - and the
timeframe sooner - with 86% indicating that they would need a multicloud
strategy by 2017 to complement their future business goals. As multicloud
becomes more prevalent, we see a growing need for enterprises to bring
multicloud management under control on a hybrid IT platform.
Companies are distributing
their applications over multiple clouds, based on the best cloud for the job.
In addition, businesses are increasingly relying on redundant clouds to support
business continuity and disaster recovery initiatives. This requires a
multicloud strategy that can be deployed on a hybrid IT (on-premises and cloud)
infrastructure.
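A minimal sketch of that "best cloud for the job, plus a redundant cloud for continuity" idea, using hypothetical workload and provider names:

# Hypothetical placement table: a primary cloud chosen per workload,
# with a secondary cloud kept warm for continuity and disaster recovery.
placement = {
    "analytics":     {"primary": "cloud-a", "secondary": "cloud-b"},
    "ecommerce-web": {"primary": "cloud-b", "secondary": "cloud-c"},
    "ml-training":   {"primary": "cloud-c", "secondary": "cloud-a"},
}

def target_cloud(workload, primary_healthy=True):
    """Return where a workload should run, failing over when its primary cloud is unavailable."""
    entry = placement[workload]
    return entry["primary"] if primary_healthy else entry["secondary"]

print(target_cloud("ecommerce-web"))                          # cloud-b
print(target_cloud("ecommerce-web", primary_healthy=False))   # cloud-c (DR path)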
Digital transformation (DX) platform strategies proliferate across enterprises
IDC describes a digital transformation (DX) platform as enabling
the "rapid creation of externally facing digital products, services and
experiences, while aggressively modernizing the 'intelligent core'
environment." And they predict that by 2020, 60% of all enterprises will
embrace an organization-wide DX platform strategy.
To accelerate this
shift, we see DX platform companies investing more in open application
programming interfaces (APIs) and customer/partner-facing development portals
in 2018. An external-facing, API-enabled DX platform speeds up the development
and integration of digital services and enables more intelligent tools for IT
automation and orchestration. It also promotes innovation and faster
time-to-market for new solutions. On DX platforms, businesses can collaborate
within ecosystems made up of interconnected customers, service providers and
business partners. As companies make services available via an API-driven DX
platform, many will have to satisfy performance, availability and
security-related SLAs for those services.
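As an illustration of what an externally facing, API-enabled service can look like, here is a minimal sketch built on Python's standard library; the endpoint and payload are hypothetical, not any specific DX platform's API.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class PartnerAPI(BaseHTTPRequestHandler):
    """Tiny externally facing endpoint a partner could integrate against."""

    def do_GET(self):
        if self.path == "/v1/shipments/42":
            body = json.dumps({"shipment": "42", "status": "in-transit"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    # Partners and internal teams consume the same documented endpoint,
    # which is what makes integration and reuse fast on a DX platform.
    HTTPServer(("0.0.0.0", 8080), PartnerAPI).serve_forever()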
##
About the Author
Kaladhar Voruganti is currently Vice President, Technology Innovation and Senior Fellow at Equinix. He previously worked at IBM Research and in the NetApp CTO office.
He obtained his BSc in Computer Engineering and PhD in Computing Science from the University of Alberta, Canada. He has 70 patents filed or issued in the area of network and storage systems. At Equinix, he is actively working on distributed AI/analytics and blockchain architectures. He is also in charge of the company-wide employee innovation program.