Industry executives and experts share their predictions for 2024. Read them in this 16th annual VMblog.com series exclusive.
AI Adoption in 2024: From Grade School Math to Enterprise Grade Solutions, From Research to ROI
While LLMs face increasing headwinds in 2024, AI adoption isn't slowing down - it will just look different, with real ROI and productivity gains.
By
Jeff Schumann, CEO & Co-Founder of Aware
2023 was the year of AI enterprise adoption,
with 55% of organizations adopting AI into their
workflows according to a recent report from McKinsey & Co. This adoption
has been led by Large Language Models (LLMs) that promised to fulfill numerous
use cases across the digital workplace, from research to drafting entire
deliverables. However, the failure of LLMs to meet the needs of enterprises - and perhaps even to live up to their reputation - will be the story of 2024.
Over the past year, many companies have
experimented with incorporating AI into their workflows, but long-term success
relies on the ability of AI to solve specific business use cases that deliver tangible results. Simply put, LLMs are failing to meet those expectations, with benchmarks as rudimentary as how quickly they can solve grade school math problems. There are growing concerns around the quality, accuracy, and security of these models, to the extent that companies are already prohibiting their employees from using ChatGPT to shield their data, and organizations across the broader market are filing lawsuits to prevent the use of their data for model training.
These ever-growing concerns call into question
the long-term sustainability and financial viability of LLMs, which take
billions of tokens to train. Without a steady influx of good, clean and cheap
data, it will become increasingly difficult and expensive to build, deploy, and
refresh models. Adding to this pressure is the ongoing GPU shortage, impacting
the computing capabilities and running costs of AI models, with some companies
having to wait almost a year to access these chips. Combine that with the challenges of hallucinations, data privacy, ethics, data traceability, and responsible AI, and you have a perfect storm of headwinds facing LLMs going into 2024.
Will AI adoption slow as a result of these
challenges? At Aware, we predict the opposite, with 2024 being the year of more
innovation and new approaches. AI is here to stay, and it will look different.
Enterprises' unrelenting drive for increased productivity and ROI will lead
companies to pursue a hybrid strategy - a coexistence of open-source,
closed-source and custom, targeted language models trained on very specific internal data sets for very specific use cases.
Each of these AI approaches has its own benefits:
- Open-source MLOps tools have
gained significant traction due to their flexibility, community support, and
adaptability to various workflows.
- Closed-source platforms often
provide enterprise-grade features, enhanced security, and dedicated user
support.
- Custom, targeted language models, built on the unique data of an enterprise, are able to provide the best solutions for an enterprise and its specific processes.
Open-source options are like Formula One race cars: fast and agile, but they require technical expertise. Closed-source platforms are like luxury sedans: powerful, secure, and delivered with concierge service. Targeted Language Models are like classic car restorations: built from unique knowledge and a one-of-a-kind set of experiences.
This hybrid approach will be especially
important when a company's proprietary or sensitive data requires stricter
controls to meet compliance and legal obligations. Targeted models can help
teams develop intellectual property around machine learning as a competitive
advantage, training them on closely curated datasets while reducing reliance on
large engineering teams or GPU instances that can add cost and complexity. Combined with the
judicious use of larger AI models when appropriate, businesses can invest in
solutions that fulfill their specific needs.
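As a rough illustration of how such a hybrid setup might be wired together, the sketch below routes each request either to a small, internally hosted targeted model or to a larger general-purpose model, depending on whether the prompt touches sensitive internal topics. The keyword heuristic, client classes, and their method names are all hypothetical placeholders, not a description of any specific product or API.

```python
# Hypothetical sketch of a hybrid routing layer: sensitive or domain-specific
# prompts go to a small targeted model hosted inside the company's perimeter,
# while general-purpose questions are forwarded to a larger external LLM.
# All names (TargetedModelClient, GeneralLLMClient, keywords) are illustrative.

SENSITIVE_KEYWORDS = {"customer", "contract", "salary", "incident", "roadmap"}


def is_sensitive(prompt: str) -> bool:
    """Crude heuristic: treat prompts mentioning internal topics as sensitive."""
    words = {w.strip(".,!?").lower() for w in prompt.split()}
    return bool(words & SENSITIVE_KEYWORDS)


class TargetedModelClient:
    """Placeholder for a small model trained on closely curated internal data."""

    def complete(self, prompt: str) -> str:
        return f"[internal targeted model] response to: {prompt}"


class GeneralLLMClient:
    """Placeholder for a large, general-purpose (open- or closed-source) LLM."""

    def complete(self, prompt: str) -> str:
        return f"[general-purpose LLM] response to: {prompt}"


def route(prompt: str, targeted: TargetedModelClient, general: GeneralLLMClient) -> str:
    """Send sensitive prompts to the targeted model, everything else to the LLM."""
    client = targeted if is_sensitive(prompt) else general
    return client.complete(prompt)


if __name__ == "__main__":
    targeted, general = TargetedModelClient(), GeneralLLMClient()
    print(route("Summarize the customer contract backlog", targeted, general))
    print(route("Explain what a transformer model is", targeted, general))
```

In a real deployment the routing decision would rest on data classification policy rather than keywords, but the shape is the same: keep sensitive work on the targeted model, and reserve the larger model for general-purpose tasks.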
In 2024, companies will also come to appreciate the immense value locked within their own data - and they will have more tools to unlock that value. Data has long been recognized as valuable, but a new gold rush will emerge as companies look for solutions to manage unstructured data. Data serves as fuel for LLMs, traditionally sourced from end-user prompts, books, articles, social media sites and more. This method of training models provides the broad base of knowledge LLMs are known for, but it raises data leakage and security concerns. Failure to holistically manage how this data is used can create blind spots that bad actors can exploit and that can jeopardize a company's place in the market. These blind spots can be found in almost any internal data reservoir. Worst of all, this accessible data could house troves of personally identifiable information, leading to serious compliance issues down the road. To address these concerns, as many as 75% of businesses worldwide are beginning to prohibit the use of LLM solutions like ChatGPT, in hopes of identifying solutions that can better protect their ingested data.
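As a minimal sketch of what managing that exposure might look like in practice, the snippet below scans free-form text for a few common PII patterns and redacts them before the text is handed to any model. The patterns and the redact_pii helper are illustrative assumptions, not a description of Aware's product or any specific compliance tool.

```python
import re

# Illustrative PII patterns only; a real deployment would use far more robust
# detection (named-entity recognition, locale-aware formats, audit logging, etc.).
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\b(?:\+?1[-. ]?)?\(?\d{3}\)?[-. ]?\d{3}[-. ]?\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}


def redact_pii(text: str) -> str:
    """Replace matched PII spans with labeled placeholders before model ingestion."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text


if __name__ == "__main__":
    ticket = "Customer Jane Doe (jane.doe@example.com, 555-867-5309) reported an outage."
    print(redact_pii(ticket))
    # -> "Customer Jane Doe ([REDACTED-EMAIL], [REDACTED-PHONE]) reported an outage."
```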
This emerging drive to secure proprietary data
has brought the sheer volume of enterprise data to the forefront. This data
exhaust, originating from sources as varied as collaboration data, support tickets and customer surveys, holds deep insight into the risks and opportunities
that sit within a business. Recognizing the value of the data they hold,
companies will seek to secure it by taking a "hybrid cloud by design" approach,
rather than "hybrid cloud by default."
Ultimately, data protection will emerge as a key pillar in a successful
2024 AI strategy, and companies will move towards prioritizing AI solutions
that are trustworthy and responsible. By adopting a hybrid approach across both AI platforms and clouds, enterprises will avoid putting all of their data eggs in one basket.
Beyond internal data troves, companies in 2024
will begin using AI-powered tools to more proactively analyze external sources
in order to gauge customer, employee and general public sentiment. This
data is begging to be harnessed, and companies will be looking for an
opportunity to extract these insights.
By analyzing content on public external platforms - Reddit, for example - companies could be made aware of issues months, even years, in advance. The same is true of competitors, who will be able to harness this publicly available data to gain a leg up on how their biggest rivals are perceived.
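As a minimal sketch of this kind of sentiment monitoring, assuming the Hugging Face transformers library and a handful of hypothetical public posts, the snippet below tallies positive versus negative mentions. In practice the posts would come from a platform API or data provider, subject to its terms of use.

```python
from collections import Counter

from transformers import pipeline

# Hypothetical posts mentioning a product; real data would be fetched elsewhere.
posts = [
    "The new release fixed the sync bug - huge improvement!",
    "Support tickets have gone unanswered for a week, really frustrating.",
    "Pricing changes make this hard to justify for small teams.",
]

# Off-the-shelf sentiment classifier; downloads a default model on first use.
sentiment = pipeline("sentiment-analysis")
results = sentiment(posts)

# Tally the labels to get a rough read on public perception.
tally = Counter(r["label"] for r in results)
print(tally)  # e.g. Counter({'NEGATIVE': 2, 'POSITIVE': 1})
```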
Flexibility is the word for tech in 2024.
Flexibility in AI platforms, in data usage, and most importantly in data
sources. By adopting this mindset, enterprises in 2024 will harness insights
they could only dream of this year.
##
ABOUT THE AUTHOR
As the Chief Executive Officer and
co-founder of Aware, Jeff pushes his
team to think beyond the possible. Forward-thinking and often labeled a
visionary, Jeff successfully built more than 10 companies throughout his career
- starting in middle school, continuing through college and even while holding
a highly prominent position at a Fortune 100 company. His work has been
recognized by Forbes, CIO Magazine, Fortune, the Harvard Business Review,
Gartner and the Wall Street Journal.