Industry executives and experts share their predictions for 2020. Read them in this 12th annual VMblog.com series exclusive.
By Kevin Woods, Head of Product Marketing at Scalyr
Emergence of Performance Intelligence Platforms
Improving performance, perpetually and
predictably, is what modern digital business is all about. Or at least it
should be. Understanding performance, in real time, is the difference between
the winners and the rest. Insight into the performance of digital systems, products, users, capacity needs, and more is what creates competitive advantage, user satisfaction, and new product innovation.
Gathering and analyzing performance data allows us to understand, measure, and improve every aspect of system, product, and business performance. Performance intelligence is about collecting that performance information and making it easily available to those who can benefit from it. Many businesses today don't have a performance intelligence platform (PIP) that provides affordable, real-time access to performance data, let alone one that is accessible to most of the people who could make sense of and use that data.
This means they may be underperforming relative to their full potential, and that there is considerable upside to capture across the business.
The next year will see the emergence of performance intelligence platforms with real-time insight into business, system, and user performance. The PIP is a system or platform that collects, stores, and provides access to real-time and historical performance data quickly and affordably, and makes it easy for people to use that data. Platforms, in the truest sense of the word, are the foundations upon which other things can be built. This means an easy and intuitive interface, fast results, and low cost. Anyone should be able to use the data to gain insight, just as we've seen with the rise of business intelligence platforms.
Unlike its predecessors, which siloed performance information into separate and limited databases, the PIP will converge tools and data into a single ecosystem, without today's painful tradeoffs in cost and performance.
An integrated data store, an architectural necessity for the PIP, brings a number of benefits to users. First, users of the various tools will be looking at the same underlying data to evaluate performance and solve problems. For example, if an SRE notices a user response-time problem in the metrics, the engineers who then read traces and logs to identify the root cause will be working from the same source log data. This single context brings consistency and fosters better collaboration between teams.
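To make the shared-data idea concrete, here is a minimal sketch in Python. It is purely illustrative and not Scalyr's API; the event fields and helper functions are hypothetical. The point is that the SRE's latency metric and the engineer's slow-trace search are both answered from the same raw events, so both teams are reasoning about the same incident.

```python
from dataclasses import dataclass
from statistics import quantiles

# Hypothetical raw events in a shared store; field names are illustrative only.
@dataclass
class Event:
    timestamp: float   # epoch seconds
    endpoint: str
    latency_ms: float
    trace_id: str
    message: str

EVENTS = [
    Event(1.0, "/checkout", 120.0, "t-001", "request completed"),
    Event(2.0, "/checkout", 3400.0, "t-002", "db connection pool exhausted"),
    Event(3.0, "/checkout", 140.0, "t-003", "request completed"),
]

def p95_latency(events):
    """SRE view: a response-time metric derived from the raw events."""
    latencies = sorted(e.latency_ms for e in events)
    return quantiles(latencies, n=20)[-1]  # approximate 95th percentile

def slow_traces(events, threshold_ms=1000.0):
    """Engineer view: the trace IDs and log lines behind the slow requests."""
    return [(e.trace_id, e.message) for e in events if e.latency_ms >= threshold_ms]

# Both views are answered from the same underlying data store.
print("p95 latency (ms):", p95_latency(EVENTS))
print("slow traces:", slow_traces(EVENTS))
```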
Second, using a PIP allows DevOps teams to choose the right tool for the job. SREs looking at response times may favor a different tool, even from a different vendor, than software engineers trying to debug code. A common data store removes the need to buy an integrated solution from a single vendor who may, for example, have strong monitoring capabilities but a weaker log management offering.
Last but not least, machine learning algorithms can be better tuned when they work from comprehensive, granular data drawn from as many sources as possible. When users have to shorten retention periods or filter data to save costs, AIOps tools are impeded in doing their job. A PIP will be the best foundation for future ML/AI algorithms.
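As a hedged illustration of why granularity matters, the short Python sketch below (with made-up numbers, not real telemetry) flags a latency spike using a simple z-score test on raw per-request samples; once the same data is averaged into coarse buckets, as a cost-saving filter might store it, the spike is no longer detectable.

```python
from statistics import mean, pstdev

# Hypothetical per-request latencies (ms); one short spike hides in the middle.
raw = [100, 102, 98, 101, 99, 950, 103, 100, 97, 102, 101, 99]

def zscore_anomalies(samples, threshold=2.0):
    """Flag samples more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(samples), pstdev(samples)
    return [x for x in samples if sigma and abs(x - mu) / sigma > threshold]

# Full-granularity data: the 950 ms spike stands out clearly.
print(zscore_anomalies(raw))       # -> [950]

# Downsampled data (4-sample averages), as a cost-saving filter might keep it:
buckets = [mean(raw[i:i + 4]) for i in range(0, len(raw), 4)]
print(zscore_anomalies(buckets))   # -> [] : the spike is averaged away
```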
Why Now?
The costs and complexities of managing performance data today inhibit its use, which means organizations are underperforming their full potential. There is nothing in the laws of physics, or of business, that keeps these problems from being solved.
If you face competition for customers in your industry, then the cost of doing business without a PIP is high. You may underperform across every aspect of the customer's experience, spend more than necessary to deliver that experience, and take longer to correct problems, eroding customer loyalty, trust, usability, and brand value. A leap ahead of the competition comes from insight into customer needs, system behaviors, capacity requirements, and trouble spots, creating new business advantages in the new year.
It is now possible to ingest performance data, store it cost-effectively, and access and search it instantly, in real time. Collecting and searching performance data has been too slow and expensive because first-generation architectures are limited in scale and speed. The rise of second-generation architectures provides affordable scale, and a reason to celebrate as we roar into the 2020s.
##
About the Author
Kevin Woods is Head of Product Marketing at Scalyr,
the Log Management and Observability company. He is fascinated with how
emerging technologies impact people's jobs and skill sets. Kevin has been working
on cutting-edge software in the IT Infrastructure and management industry for
more than 25 years, including at well-known companies such as Cisco, Brocade
and Avaya.