Industry executives and experts share their predictions for 2021. Read them in this 13th annual VMblog.com series exclusive.
In-Memory Computing as a Foundation for Accelerating Digital Transformation
By Nikita Ivanov, Co-founder and CTO of GridGain
Over the last decade, many applications have experienced transactional workload
increases of 10x to 1,000x and a 50x growth in data, all while under constant
pressure to deliver streamlined, responsive customer experiences. Traditional
computing infrastructures are often stretched beyond their limits under those
demands, incapable of scaling effectively to handle the growth in data and the
need for real-time responses.
The global pandemic of 2020 compounded these
issues. Digitalization became an imperative, as the majority of interactions
moved online and businesses were forced to rapidly change their operations.
Some had to scale to support massive influxes of online orders. Others had to
pivot sharply to embrace digital transformation in an effort to remain
efficient when weakened economies hurt their bottom lines.
A growing number of businesses have turned to in-memory computing (IMC) as a sound,
cost-effective foundation for meeting these new needs and delivering the speed,
scale, real-time intelligence, and automation required to compete in today's climate.
Below are my top four in-memory
computing-driven trends that I believe will continue to shape digital
transformations for the coming year.
1. In 2021, more businesses will
need to focus on rapidly accelerating and scaling out applications to meet the
challenges of digital transformation
In
2020, the COVID-19 pandemic drove many businesses to dramatically scale out and
upgrade infrastructure to maintain high application performance in the face of
surges in website visitors, delivery requests, sales transactions, video
streaming and more. This was especially prevalent for companies that provide
food delivery, ecommerce, logistics, and remote access and collaboration
services.
Many
of these businesses found that the fastest approach to maintaining or improving
performance while simultaneously increasing application throughput was to deploy
a distributed in-memory data grid (IMDG) - built using an in-memory computing
platform such as Apache® Ignite® - that can be inserted
between an existing application and its disk-based database without major
modifications to either. The IMDG improves performance by caching application
data in RAM and applying massively parallel processing (MPP) across a
distributed cluster of server nodes. It also provides a simple path to scale
out capacity because the distributed architecture allows the compute power and
RAM of the cluster to be increased simply by adding new nodes.
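As a rough illustration of this pattern, the minimal Java sketch below (assuming the Apache Ignite Java API) shows a node creating a partitioned cache with one backup copy per entry, so reads and writes are served from RAM across the cluster and capacity grows simply by starting additional nodes. The cache name and key/value types are hypothetical, and a production deployment would also configure read-through/write-through against the existing database.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.CacheMode;
import org.apache.ignite.configuration.CacheConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;

public class ImdgSketch {
    public static void main(String[] args) {
        // Start an Ignite node; starting more nodes with the same
        // configuration grows the cluster's RAM and compute capacity.
        IgniteConfiguration cfg = new IgniteConfiguration();

        try (Ignite ignite = Ignition.start(cfg)) {
            // Hypothetical cache fronting an existing "customers" table.
            CacheConfiguration<Long, String> cacheCfg =
                new CacheConfiguration<>("customerCache");
            cacheCfg.setCacheMode(CacheMode.PARTITIONED); // data spread across nodes
            cacheCfg.setBackups(1);                        // one backup copy per entry

            IgniteCache<Long, String> cache = ignite.getOrCreateCache(cacheCfg);

            // Reads and writes are served from RAM across the cluster.
            cache.put(42L, "Acme Corp");
            System.out.println(cache.get(42L));
        }
    }
}
```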
In
2021, IMC platforms will become easier to use and the number of knowledgeable
IMC practitioners will continue to grow rapidly. This will enable IMC adoption
to spread across more industries and to a wider pool of companies. As a result,
more businesses will be positioned to take advantage of IMC for rapid
application acceleration, not just in response to the demands of COVID-19, but
also to meet new strategic and competitive demands as the pandemic threat
abates.
2. Digital integration hubs will
increasingly power digital transformations
In
2020, enterprises of all sizes continued to push forward with - and even
accelerate - their digital transformations. In 2021, many of these businesses
will want to leverage their expanding digital infrastructure to drive real-time
business processes powered by 360-degree views of their customers and operations.
This demand will lead to widespread adoption of real-time digital integration
hubs (DIHs), also known as API platforms, smart data hubs, or smart operational
datastores.
Powered by an IMDG, a DIH creates a high-performance data access
layer for aggregating a subset of data from multiple source systems. These
source systems may include relational and NoSQL databases, data warehouses,
data lakes and streaming data that may reside in public or private clouds,
on-premises data centers, mainframes, or SaaS applications. The aggregated data
in the DIH can be simultaneously accessed by any number of business
applications at in-memory speeds. The IMDG powering the DIH supports a range of
APIs, including key-value and SQL, and some IMDGs also offer ACID transaction
support. A synchronization layer, or change
data capture layer, between the data sources and the IMDG ensures that the data
in the in-memory cache is constantly updated as changes are made to the
underlying datastores.
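To make the idea concrete, here is a minimal, hypothetical sketch of how an application might query an Ignite-backed DIH through both the key-value and SQL APIs. The cache name, table, and fields are illustrative assumptions, and the CDC/synchronization pipeline that keeps the cache in step with the source systems is not shown.

```java
import java.util.List;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.SqlFieldsQuery;

public class DihAccessSketch {
    public static void main(String[] args) {
        // Join the already-running DIH cluster as a client node.
        Ignition.setClientMode(true);

        try (Ignite ignite = Ignition.start()) {
            // Hypothetical cache populated by the CDC/synchronization layer
            // from source databases, data lakes, and SaaS feeds.
            IgniteCache<Long, String> customers = ignite.cache("customer360");

            // Key-value access at in-memory speed.
            System.out.println(customers.get(1001L));

            // SQL access over the same aggregated data (assumes the cache was
            // created with query entities exposing a CUSTOMER360 table).
            SqlFieldsQuery qry = new SqlFieldsQuery(
                "SELECT id, lastOrderDate FROM CUSTOMER360 WHERE region = ?")
                .setArgs("EMEA");

            List<List<?>> rows = customers.query(qry).getAll();
            rows.forEach(r -> System.out.println(r.get(0) + " -> " + r.get(1)));
        }
    }
}
```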
With
an IMC-powered DIH, businesses can launch a variety of critical real-time
initiatives, from predictive analysis and recommendation engines to automated
business decision making. Whether in financial services, healthcare, logistics
or a range of other industries, these initiatives can instantly react to
real-time, cross-organization data views.
3. Persistent memory will make its
way into business applications
Persistent memory (PM) solutions such as Intel Optane ensure that data held in memory
is not lost when servers restart. Instead, data residing in persistent memory is
immediately available for processing at in-memory speeds following a reboot. Some
persistent memory solutions also offer higher capacity per server than traditional
DRAM, enabling in-memory speeds for much larger datasets on a single machine.
Combining persistent memory technologies with an in-memory computing platform, such
as Apache Ignite, can provide a comprehensive infrastructure for achieving high
performance and massive scalability in production applications, for both existing
and greenfield use cases. This will be particularly useful for 24x7 applications in
industries such as financial services, where high performance, cost-effectiveness,
and immediate recovery from a server restart are crucial.
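As a minimal Java sketch of the operational benefit, the configuration below enables Apache Ignite's native persistence for the default data region, used here as a stand-in for the behavior described above: data remains available after a restart without a full cache reload. It illustrates the general durable-memory pattern only; Optane-specific provisioning and tuning are outside its scope, and the API shown reflects recent Ignite releases.

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.cluster.ClusterState;
import org.apache.ignite.configuration.DataStorageConfiguration;
import org.apache.ignite.configuration.IgniteConfiguration;

public class PersistenceSketch {
    public static void main(String[] args) {
        IgniteConfiguration cfg = new IgniteConfiguration();

        // Enable persistence for the default data region so that data
        // survives a restart instead of having to be reloaded from scratch.
        DataStorageConfiguration storageCfg = new DataStorageConfiguration();
        storageCfg.getDefaultDataRegionConfiguration().setPersistenceEnabled(true);
        cfg.setDataStorageConfiguration(storageCfg);

        Ignite ignite = Ignition.start(cfg);

        // With persistence enabled, the cluster must be activated explicitly.
        ignite.cluster().state(ClusterState.ACTIVE);
    }
}
```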
In 2021, we will see a significant acceleration in the adoption of persistent memory
for business applications. This will take the form of code updates to business
applications and the adoption of persistent memory as a standard option in an
increasing number of third-party solutions.
4. In-process HTAP will experience
growing adoption
HTAP
- or hybrid transactional/analytical processing - enables simultaneous
transaction and analytics processing on the same dataset. HTAP eliminates
time-consuming ETL processes required to move transactional data into an
analytical datastore and can power real-time digital business models across a
range of verticals, including financial services, e-commerce, healthcare,
transportation and many more.
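A minimal sketch of the idea, assuming an Ignite cache configured elsewhere with SQL query entities for a hypothetical PAYMENTS table: transactional writes and analytical SQL run against the same in-memory dataset, with no ETL hop into a separate analytics store.

```java
import java.util.List;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.SqlFieldsQuery;

public class HtapSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Hypothetical cache holding live payment records, created
            // elsewhere with query entities exposing a PAYMENTS table.
            IgniteCache<Long, Payment> payments = ignite.cache("payments");

            // Transactional side: record a new payment as it arrives.
            payments.put(7001L, new Payment(7001L, "card-123", 250.0));

            // Analytical side: aggregate over the very same dataset.
            List<List<?>> totals = payments.query(new SqlFieldsQuery(
                "SELECT cardId, SUM(amount) FROM PAYMENTS GROUP BY cardId")).getAll();

            totals.forEach(row ->
                System.out.println(row.get(0) + " spent " + row.get(1)));
        }
    }

    // Minimal value class for the sketch.
    static class Payment {
        long id; String cardId; double amount;
        Payment(long id, String cardId, double amount) {
            this.id = id; this.cardId = cardId; this.amount = amount;
        }
    }
}
```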
"In-process
HTAP," a term created by Gartner, is an HTAP system which can apply real-time
machine learning to business processes. In-process HTAP combines HTAP with
machine learning (ML) models that are trained continuously using the system
data. Powered by an IMDG, such continuous learning environments allow real-time
updates to the machine learning model used in the associated business
applications. For example, to minimize loan fraud, a bank could continuously update
its machine learning model of what indicates a fraudulent application based on
real-time data from incoming loan applications. It can then seamlessly push the
updated model into its production business applications.
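As a purely illustrative sketch of such a continuous-learning loop, the snippet below uses Ignite's ContinuousQuery API to stream newly arriving loan applications to a listener. The loanApplications cache and the updateFraudModel hook are hypothetical stand-ins for the incremental training and model-publishing logic a real system would provide.

```java
import javax.cache.event.CacheEntryEvent;
import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteCache;
import org.apache.ignite.Ignition;
import org.apache.ignite.cache.query.ContinuousQuery;

public class ContinuousLearningSketch {
    public static void main(String[] args) {
        try (Ignite ignite = Ignition.start()) {
            // Hypothetical cache receiving new loan applications in real time.
            IgniteCache<Long, String> applications = ignite.cache("loanApplications");

            ContinuousQuery<Long, String> qry = new ContinuousQuery<>();

            // Each new or updated application is pushed to this listener,
            // which would feed an incremental model-training step.
            qry.setLocalListener(events -> {
                for (CacheEntryEvent<? extends Long, ? extends String> e : events)
                    updateFraudModel(e.getKey(), e.getValue()); // hypothetical hook
            });

            // Registers the continuous query; in a real application the
            // cursor (and the node) would stay open so updates keep streaming.
            applications.query(qry);
        }
    }

    // Placeholder for the incremental training / model-refresh logic.
    static void updateFraudModel(Long applicationId, String applicationJson) {
        // e.g., extract features, update the online model, publish new version.
    }
}
```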
The maturing of in-memory computing platforms in
2020, combined with soaring demand to roll out applications that can update
their machine learning models in real-time, will drive an increasing rate of
adoption of in-process HTAP in 2021.
About the Author
Nikita Ivanov, co-founder and CTO of GridGain Systems, has
led GridGain in developing advanced and distributed in-memory data processing
technologies. Nikita has more than 20 years of
experience in software application development, building HPC and middleware
platforms and contributing to the efforts of other startups and notable
companies, including Adaptec, Visa and BEA Systems. In 1996, he was a pioneer
in using Java technology for server-side middleware development while working
for one of Europe's largest system integrators. Nikita
is an active member of the Java middleware community and a contributor to the
Java specification. He is also a frequent international speaker with more than
50 talks at various developer conferences in the last 5 years.