By Brian Wood, Director and Cloud Advocate, Teradata
When it comes to choosing a data architecture, the options are numerous: on-premises, public cloud, private cloud, hybrid, and multi-cloud. However, navigating multiple architectures and cloud providers while trying to understand which infrastructure provides the biggest ROI isn't easy, and it presents a new set of challenges, particularly when determining the total cost of ownership (TCO).
But the bottom line isn't the only consideration. Database administrators and IT staff need not only to accommodate requests quickly and seamlessly by leveraging robust data sources, but also to easily manage and scale their data architectures so they can keep pace with their organization's growth now and in the future.
With choice comes change, whether that means upgrading legacy systems or adopting an entirely different data ecosystem. And while change may be daunting, doing nothing is even more costly. It's essential for CIOs and CTOs to understand the nuances of data architecture - and the cost-benefit of each option - while also being mindful of the expensive pitfalls awaiting an enterprise that opts for an architecture misaligned with the organization's goals and needs.
Examining the Status Quo
Sticking with outdated backend architecture (i.e., the status quo) is the most expensive option. This may seem counterintuitive, but the same principle applies as in the new-vs.-old-car argument: you may think a new car will cost far more than keeping your old one, but innovations in automobile reliability and fuel efficiency make it the less expensive choice once you take TCO into account.
If an organization opts for the status quo, IT teams can expect:
- Missed opportunities. Sticking with a legacy system means
missing out on opportunities to better understand customers, supply chains,
inventory levels, financial risk and more, as outdated technology doesn't
support modern enterprise analytics.
- Lack of support for new data types. Data types and sources have been
multiplying, such as clickstreams, social media feeds, digital twins, and
IoT-based sensor data. Using older software limits the ability to incorporate
these data streams and unlock new insights.
- Constrained scalability. Legacy systems typically have tightly coupled compute and storage, so scaling one requires scaling the other, creating inefficiency. Modern systems separate compute from storage, enabling each to be scaled independently and only when needed.
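To make the coupling point concrete, here is a toy comparison of the cost of adding storage under each model. All unit costs are hypothetical figures chosen purely for illustration, not vendor pricing:

```python
# Toy illustration of coupled vs. decoupled compute/storage scaling.
# All unit costs are hypothetical, purely for illustration.

COMPUTE_UNIT_COST = 1000   # hypothetical cost per compute unit
STORAGE_UNIT_COST = 100    # hypothetical cost per storage unit

def coupled_scale_cost(extra_storage_units: int) -> int:
    """Legacy model: storage and compute grow in a fixed ratio, so adding
    storage forces buying matching compute whether it's needed or not."""
    return extra_storage_units * (STORAGE_UNIT_COST + COMPUTE_UNIT_COST)

def decoupled_scale_cost(extra_storage_units: int, extra_compute_units: int = 0) -> int:
    """Modern model: each dimension scales independently."""
    return (extra_storage_units * STORAGE_UNIT_COST
            + extra_compute_units * COMPUTE_UNIT_COST)

# Growing storage by 10 units with no new compute demand:
print(coupled_scale_cost(10))    # 11000 - compute bought but unused
print(decoupled_scale_cost(10))  # 1000 - pay only for the storage
```

The gap widens with every increment that only one dimension actually needs, which is why decoupled architectures tend to win on TCO as data volumes grow faster than compute demand.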
The Hybrid Approach
Leveraging a hybrid data architecture, which combines cloud and on-premises support, is ideal for organizations in highly regulated sectors because they are tasked with meeting increasingly stringent privacy requirements. These sectors, which include healthcare, financial services, and government, also benefit from a reduced total cost of ownership compared with sticking to outdated legacy systems.
A key aspect to consider when planning a hybrid cloud architecture is data movement between the cloud(s) and on-premises systems. While moving data into the cloud is typically free, moving data out of the cloud can be expensive. In addition, communication in and out of the cloud adds considerable latency. Core analytics should therefore be done on a single cloud or on-premises; it all comes down to keeping the data, its context, and a common analytics tool chain centered in one place.
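A back-of-the-envelope estimate shows why egress matters. The per-GB rate below is a hypothetical placeholder, not any provider's actual pricing:

```python
# Back-of-the-envelope estimate of monthly data-egress cost.
# NOTE: the rate below is a hypothetical placeholder, not actual provider pricing.

EGRESS_RATE_PER_GB = 0.09  # hypothetical per-GB charge for data leaving the cloud
                           # (ingress, by contrast, is typically free)

def monthly_egress_cost(gb_moved_out_per_day: float, days: int = 30) -> float:
    """Estimate the monthly cost of moving data out of the cloud."""
    return gb_moved_out_per_day * days * EGRESS_RATE_PER_GB

# Example: syncing 500 GB/day back on-premises
print(f"${monthly_egress_cost(500):,.2f} per month")
```

Even at modest daily volumes the charge recurs every month, which is why architectures that repeatedly shuttle data out of the cloud erode the savings that motivated the move.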
Many companies use both hybrid and multi-cloud architectures to mitigate risk. By distributing applications, services, and workloads across clouds, an enterprise can architect for resilience. By picking and choosing where workloads live and run, enterprises can more easily tune for performance. Most importantly, a multi-cloud architecture helps enterprises avoid lock-in to a single cloud provider, making them more resilient and agile.
The Hidden Costs of the Cloud
For some enterprises, especially those that prioritize flexibility, speed, and agility, going all in on the cloud may be the preference. However, enterprises must also be vigilant about the "hidden costs" associated with the cloud, including:
- Deceptively low price quotes. Don't be fooled by low-cost options. If all you need is basic, default cloud storage, a low quote might be fine. However, decision makers often underestimate the full investment needed in the cloud as they scale up analytics initiatives, which can result in ballooning, unexpected costs for robust enterprise analytics programs.
- How much does it really cost? Executives should examine all factors related to cloud data warehouses, including data size and node count, the separation of storage and compute, and the impact of concurrency demand, to understand how much capacity and investment is truly needed.
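The factors above can be combined into a rough monthly cost sketch. All rates, sizing figures, and the burst-hours assumption below are hypothetical illustrations, not any vendor's actual pricing model:

```python
# Rough monthly cloud data-warehouse cost model.
# NOTE: all rates and sizing figures are hypothetical illustrations,
# not any vendor's actual pricing.

def monthly_warehouse_cost(
    storage_tb: float,
    node_count: int,
    hours_per_day: float,
    peak_concurrency_nodes: int = 0,
    storage_rate_per_tb: float = 23.0,   # hypothetical $/TB-month
    node_rate_per_hour: float = 4.0,     # hypothetical $/node-hour
    days: int = 30,
) -> dict:
    """Break a monthly estimate into storage vs. compute, including
    extra nodes spun up to absorb concurrency peaks."""
    storage = storage_tb * storage_rate_per_tb
    base_compute = node_count * hours_per_day * days * node_rate_per_hour
    # Assume bursts of concurrent demand add ~2 extra node-hours per day each
    burst_compute = peak_concurrency_nodes * 2 * days * node_rate_per_hour
    return {"storage": storage,
            "compute": base_compute + burst_compute,
            "total": storage + base_compute + burst_compute}

costs = monthly_warehouse_cost(storage_tb=100, node_count=8,
                               hours_per_day=12, peak_concurrency_nodes=4)
print(costs)
```

Even in this simplified sketch, compute driven by runtime hours and concurrency dwarfs the storage line item, which is where low headline quotes most often mislead.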
As IT budgets are squeezed in uncertain times, it is important to understand the true total cost of ownership before ultimately deciding which data architecture is ideal. The best choice is unique to each organization, and business decisions are rarely based on financial metrics alone. Strategy, timing, and other competing priorities must also be considered.
At the end of the day, this technology is the backbone that powers enterprise decision-making, and companies should invest accordingly, albeit wisely.
About the Author
Brian Wood is director and cloud advocate at Teradata.
He has over 15 years' experience leading all areas of technology marketing in
cloud, wireless, IT, software, and data analytics. He earned an MS in
Engineering Management from Stanford, a BS in Electrical Engineering from
Cornell, and served as an F-14 Radar Intercept Officer in the US Navy.