Virtualization and Cloud executives share their predictions for 2016. Read them in this 8th Annual VMblog.com series exclusive.
Contributed by Ed Franklin, VP of Global Marketing, TmaxSoft, Inc.
Modernizing mainframe data to leverage virtualization and the cloud
In 2016, there will be a new urgency to modernize mainframe
data to leverage virtualization, integrate with today's cloud architectures, and power
innovative big data and analytics initiatives. This can be a scary proposition
for enterprises that have relied on their mainframes for decades, especially in
data-intensive industries such as insurance and financial services. But the
ability to modernize that data is now a business imperative. I predict 2016
will be a tipping point as organizations realize they can no longer afford to be
constrained by the limits of their mainframe and legacy systems.
Mainframes will be left behind
Over the last 25 years, mainframes have been a solid
investment in terms of reliability, performance and security. But the file
structure of mainframe data is simply not compatible with the modern business
paradigm of cloud, analytics, and affordable, high-performance x86 commodity
hardware. Many mainframes run COBOL, PL/1, or other languages that are difficult
to support. They cannot be easily expanded or quickly adapted to serve new
market opportunities. In fact, many users may employ mainframe software and
tools whose vendors no longer exist, so maintenance can be an issue as
well. Mainframes also demand an unreasonable amount of resources in terms of
power and cooling costs, issues that mattered little 25 years ago but matter greatly today.
Making mainframe software a VM and utilizing hybrid cloud
Many companies move to the cloud for increased capacity and
higher ROI from their commodity x86 systems. These advantages will be
significant drivers for modernizing mainframe deployments as well, separating
the database and application tiers and employing modern SQL-based databases. The
result is a legacy application with better scaling, higher reliability, improved
integration with big data analytics, and even up-to-date mobile interfaces.
Formalizing the decommissioning of mainframes
In 2016, organizations will realize that the process of
decommissioning mainframes is just as important as the introduction of new
systems. Smart planning cycles and formalized processes are required. Many
mainframes have been in operation for a quarter of a century, and organizations
may no longer know exactly what is in them. Attempting to rewrite the entire system
and its business logic can be risky. Many
who seek to mitigate that risk look first to simply change the underlying
infrastructure, an approach that preserves architectural consistency while
eliminating the mainframe hardware.
2016 will be the year that mainframe data makes its way to
cloud, mobile and analytics platforms. From both business and IT perspectives,
any delay or denial in moving forward will have serious consequences.
##
About the Author
Ed Franklin is the VP of Global Marketing at TmaxSoft, a multinational software
provider with U.S. headquarters in Chicago, specializing in mission-critical
enterprise infrastructure software solutions. Ed has 20 years of
enterprise software experience, having previously served as a Senior
Manager for Product Marketing at Oracle and as a Senior Product Marketing
Manager for Fujitsu America.