Industry executives and experts share their predictions for 2025. Read them in this 17th annual VMblog.com series exclusive. By Jeremy
Burton, CEO of Observe
2024 was the year of observability, with
enterprises understanding that they needed to think beyond the traditional
framework and adopt a holistic approach to their observability solutions. Now, headed into 2025, Observe CEO Jeremy
Burton shares his top predictions on what enterprise leaders can expect in the
coming year.
Read on for seven predictions on agentic AI momentum, innovation in observability, the "on-the-fly" UI, and more:
End Of The Model Wars - The past two years have seen big
players and some well-funded startups compete to bring ever-larger models to
market. But funding is a finite resource. When asked earlier this year about
the cost of training foundation models, OpenAI CEO Sam Altman admitted it is more than $100 million. Such
costs aren't sustainable for anyone other than the hyperscalers and
infrastructure giants, particularly since there is no proven way to monetize
them. The end of the model wars will ultimately be a good thing. Less choice
means third-party ecosystem players will spend fewer dollars placing multiple
bets and can concentrate their investments on fewer alternatives.
The AI Investment Frenzy Comes Down To Earth - This was a
banner year for AI investment, with 35% of startup dollars going to AI companies
compared to 15% in 2021. The party won't end in 2025, but the cover charge will
get a lot higher. With less funding being thrown at foundation models, more money will target use cases built on top of those models that deliver demonstrable value and drive rapid growth in annual recurring revenue. Venture capitalists are already
becoming more demanding; the median time lapse between Series A and Series B
rounds in 2024 was 28 months, the longest in over a decade. While
we aren't looking at a dot-com bubble-like collapse just yet, we can assume
that business models with solid unit economics will get the lion's share of
venture dollars next year.
The Rise Of AI Agents In The Enterprise - Much has been made of AI's potential game-changing impact on consumer search, but perhaps
some of the biggest short-term opportunities in AI may lie in addressing the
woeful inefficiency of many day-to-day office tasks. ERP and CRM systems may be
the backbone of the enterprise, but most of the work still gets done in emails
and spreadsheets.
Agentic AI promises to do what robotic
process automation didn't: scour through that mess, eliminate ROT (redundant,
obsolete and trivial) data and make what's left part of actionable workflows.
That isn't sexy stuff, but it will give birth to many more successful
businesses over the next five years than training ever-larger LLMs.
Say Hi To The "On-The-Fly" UI - Enterprise software users
have long had to deal with the complexities of a feature-rich but confusing
user interface. Even the revered
"Business Dashboard" is often a dizzying array of numbers and charts. A new breed
of AI-native interfaces will drive a completely new interaction model between
humans and software. User interfaces will be constructed "on-demand", in response to the user's inputs - showing just enough information, but no more. Every user will be comfortable with AI-native interfaces immediately; gone are the days of training courses and mountains of documentation.
Exploding Costs Drive The Great Observability Shakeout -
Incumbent vendors have built their offerings on a mish-mash of proprietary
agents, proprietary databases and query languages. The result? As data volumes have skyrocketed, so too have customers' bills and - worse - those customers are locked in.
OpenTelemetry, Object Stores (such as
AWS S3) and Apache Iceberg have provided a new breed of vendors with a
standard, low-cost way to collect and store data. These products - Observe
among them - are based on a completely new architecture built on open standards
and priced in a way that doesn't result in the usual overage bill each month.
Legacy vendors will inevitably try to make it difficult for customers to
switch, but those barriers will fall. Customers should plan now for the coming
migration opportunities.
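To make the switching-cost point concrete, here is a minimal sketch of what open-standards instrumentation looks like in practice, assuming the Python OpenTelemetry SDK and a hypothetical local OTLP collector endpoint; it illustrates the general pattern, not Observe's product or any specific vendor's setup.

```python
# Minimal sketch (assumed setup, not any vendor's implementation): the application
# emits traces via the OpenTelemetry SDK to an OTLP endpoint, so changing
# observability backends is a configuration change, not a re-instrumentation project.
from opentelemetry import trace
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

# Describe the service; any OTLP-compatible backend can interpret this resource.
provider = TracerProvider(resource=Resource.create({"service.name": "checkout"}))

# Point the exporter at a collector; the endpoint below is a hypothetical local
# address and is the only thing that changes when the backend changes.
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4317", insecure=True))
)
trace.set_tracer_provider(provider)

# Instrument application code against the vendor-neutral API.
tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("process-order"):
    pass  # application logic goes here
```

Because the application only speaks the open protocol, the collection layer stops being a proprietary barrier to switching.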
Bloom Comes Off The LLM Rose - Despite the excitement around AI, many will begin to realize the
limitations of large language models. While impressive at summarizing,
translating and regurgitating well-known information, these models are clearly
not the foundation for artificial general intelligence (AGI). In fact, LLMs
have arguably siphoned investment dollars away from approaches that may have
had a greater chance of success.
It's not that LLMs are useless - quite
the opposite - they are a better way for humans to interact with all kinds of
software, devices and systems and will fundamentally change many industries.
But the AI super-intelligence - the kind that interacts with and reasons about its knowledge, surroundings and people in the same way humans do - that's still many, many years away.
Knowledge Work Will Be Disrupted - If your job involves digging
through mounds of information and synthesizing it into concise summaries, or
maybe writing in a way that only professionals of a certain level in a
particular industry would understand - then now is a good time to look into another
line of work. LLMs are clearly better than humans at distilling information -
they can absorb more, retain more and synthesize more than any human. They do
not get bamboozled by technical terms, legalese or foreign languages; they can
read and generate images and voice as well as text; and they never get tired, so they can work 24x7x365.
That potentially makes them a threat to everyone from law clerks, management consultants and translators right through to interior designers and architects. However, one human trait that is very difficult to 'generate' is creativity and original thought - there will always be a need for the spark of innovation, a thesis and logical reasoning.
As observability steps firmly into its own
consolidated and expansive market in 2025, I am excited to see how the industry
shakes out.
##
ABOUT THE AUTHOR
Jeremy Burton is the chief executive
officer of Observe, Inc. Prior to Observe, Jeremy was Executive Vice President,
Marketing & Corporate Development of Dell Technologies, and served as
President of Products, overseeing EMC's $15 billion business. Jeremy joined EMC
from Serena Software, where he was President and CEO. Before Serena, he
led Symantec's Enterprise Security product line as Group President of Security
and Data Management. Jeremy also served as Veritas' Executive Vice President of
Data Management Group and Chief Marketing Officer. Earlier in his career, he
spent nearly a decade at Oracle as Senior Vice President of Product and
Services Marketing. Jeremy has been a member of the board of directors at
Snowflake since 2015, and maintains a seat on the advisory board at McLaren
Racing.