Industry executives and experts share their predictions for 2024. Read them in this 16th annual VMblog.com series exclusive.
5 Generative AI Predictions
By Greg Benson, Professor of Computer Science at the
University of San Francisco and Chief Scientist at SnapLogic
This year, ChatGPT and related generative AI technologies took the world by storm. We've watched the technology accomplish tasks such as synthesizing existing information, debugging code, and generating data pipelines far more efficiently than is possible through web searching and manually collecting information. And we're only at the tip of the iceberg when it comes to how this technology will revolutionize the industry and impact modern organizations.
Here are the Generative AI predictions that I believe will come to the forefront of
the industry in the coming year.
Prediction 1 - GenAI won't take your job, but it might change it.
In 2024, GenAI won't lead to mass job displacements and redundancies, as many early sensationalist reports suggested. In fact, GenAI won't completely replace experts in any field: although models have access to a vast amount of information, users still have to articulate concepts well enough to get the right answers. Expertise and human input, and more importantly human review, will always be necessary.
Collaboration with GenAI is a trend we can expect to continue in 2024 as businesses look to capitalize further on its benefits, reaping the rewards of increased productivity and higher-quality content creation. This means adopting more GenAI tools and encouraging wider use, with the goal not of replacing workers but of assisting them in what they do.
Prediction 2 - Now that we've invented GenAI, the next step is understanding it.
Next year, we can expect to see businesses attempt to improve the consistency of
output from GenAI. Currently, there is no set rule book for achieving great
results with GenAI; there are tips and tricks you can deploy for better or
faster results, of course, but overall the process is largely trial and error.
Interacting with GenAI in its current iteration is like a science experiment: you come up with a hypothesis and keep testing different prompts until the model produces the result you're looking for. In the future, the focus of experimentation will shift to figuring out how we evaluate the responses it gives us and using that data to further inform prompts. We will see companies providing tools and services to help run such experiments.
Companies that want to apply GenAI to their products will need to think about how to
rigorously assess generative quality and put systems in place to monitor and
continuously improve their generative results.
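To make the idea concrete, here is a minimal sketch of the kind of prompt-evaluation harness such experiments might use. It assumes a hypothetical call_llm() function standing in for whatever model API a team actually uses, a tiny hand-labeled test set, and a deliberately simple exact-match score; real evaluation pipelines would use richer metrics and human review.

```python
# Minimal sketch of a prompt-evaluation experiment (illustrative only).
import re

def call_llm(prompt: str) -> str:
    """Stand-in for a real model API call. This trivial stub matches known
    currency codes so the harness runs end to end; replace it with your
    provider's client."""
    match = re.search(r"\b(EUR|USD|JPY|GBP)\b", prompt)
    return match.group(1) if match else ""

# Prompt variants we want to compare against each other.
PROMPT_TEMPLATES = {
    "terse": "Extract the currency code from: {text}",
    "instructive": (
        "You are a data-cleaning assistant. Return only the ISO 4217 "
        "currency code mentioned in the following text: {text}"
    ),
}

# Tiny hand-labeled test set: (input text, expected answer).
TEST_CASES = [
    ("The invoice total is 1,200 EUR.", "EUR"),
    ("Paid $45.00 in USD via wire.", "USD"),
    ("Amount due: 9,800 JPY", "JPY"),
]

def score(response: str, expected: str) -> bool:
    """Illustrative metric: exact match after trimming whitespace."""
    return response.strip() == expected

def run_experiment() -> None:
    """Run every prompt variant over the test set and report accuracy."""
    for name, template in PROMPT_TEMPLATES.items():
        correct = sum(
            score(call_llm(template.format(text=text)), expected)
            for text, expected in TEST_CASES
        )
        print(f"{name}: {correct}/{len(TEST_CASES)} correct")

if __name__ == "__main__":
    run_experiment()
```

Tracking these scores over time is what turns prompt trial and error into the kind of monitored, continuously improving system described above.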
Prediction 3 - GenAI regulation is essential to adoption.
Regulating GenAI will be a huge focus for governing bodies and business leaders in 2024. The EU Parliament has taken the lead with the provisional agreement on the Artificial Intelligence Act. This needs to be fleshed out further, but other governments will take notice and develop their own AI regulations.
Earlier this year, numerous visionaries called for a pause in AI development, but this isn't realistic, as the fundamental technology is increasingly available through open-source models on Hugging Face and elsewhere. Rather than focusing on halting development, creating clear regulations, guidelines, and best practices will be necessary to ensure that our partnership with AI moves forward in a safe and secure way.
As with any other technology, defining boundaries with safety in mind will allow us to leverage the benefits without sacrificing progress. We can liken this to all manner of tools and equipment that need to be regulated; for example, we don't stop ourselves from building cars that go really fast, but we do put speed limits in place to ensure safety. Internationally, governments will first direct their attention to the areas of regulation with the greatest impact on citizens, including frontier AI.
From an industry perspective, the most helpful GenAI applications will emerge as front runners for wider business use cases. Understanding the risks, challenges, and security issues these tools potentially introduce will be vital for businesses to determine exactly when and how they need to be regulated internally. Likewise, companies hoping to leverage GenAI will have to communicate to customers exactly how it's used and how it complies with current and future regulatory requirements.
Prediction 4 - GenAI and legacy technology: Why the key to modernization may reside in GenAI tools.
After a year of GenAI practice, legacy businesses are starting to understand that interest in GenAI is not just driven by 'hype' and that the technology could be truly transformative for their sector. Therefore, in 2024, we can expect even more traditional businesses to deploy it to help evolve legacy systems and modernize their technology stacks.
Typically, traditional companies are not amenable to change or agile enough to adopt the latest technology. Many are tied to legacy software due to a combination of outdated procurement processes, familiarity, and concerns about data loss or disruption, making modernization seem out of reach. The key here is that GenAI can assist with migrating old code bases and technology stacks to modern programming languages and platforms. There is great potential for such automated migration to help companies reduce costs by moving off legacy systems.
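As an illustration of what such assisted migration might look like, here is a minimal sketch that asks a model to translate a small legacy routine into Python. The COBOL snippet, the migration prompt, and the call_llm() stand-in are assumptions made for this example rather than any particular vendor's tooling, and the generated code would still need human review and regression tests before replacing the original.

```python
# Minimal sketch of LLM-assisted legacy code migration (illustrative only).

LEGACY_SNIPPET = """
       IDENTIFICATION DIVISION.
       PROGRAM-ID. ADD-TAX.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01 PRICE  PIC 9(5)V99 VALUE 100.00.
       01 TOTAL  PIC 9(5)V99.
       PROCEDURE DIVISION.
           COMPUTE TOTAL = PRICE * 1.08.
           DISPLAY TOTAL.
           STOP RUN.
"""

MIGRATION_PROMPT = """You are a code migration assistant.
Translate the following COBOL routine into an equivalent, idiomatic Python
function. Preserve the business logic exactly, add type hints, and include
a short docstring describing what the routine does.

COBOL source:
{source}
"""

def call_llm(prompt: str) -> str:
    """Stand-in for a real model API call. Returns a canned translation so
    the sketch runs; replace it with your provider's client."""
    return (
        "def add_tax(price: float) -> float:\n"
        '    """Apply an 8% tax, mirroring the COBOL ADD-TAX routine."""\n'
        "    return price * 1.08\n"
    )

def translate_legacy(source: str) -> str:
    """Ask the model for a candidate translation of one legacy routine.

    The result is a starting point only: it should be code-reviewed and run
    against the legacy system's test cases before anything is retired.
    """
    return call_llm(MIGRATION_PROMPT.format(source=source))

if __name__ == "__main__":
    print(translate_legacy(LEGACY_SNIPPET))
```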
Prediction 5 - Universities will begin to teach prompt engineering
In 2024, universities will teach prompt engineering as a minor field of study and through certificate programs. While GenAI has created a bit of a firestorm in higher education because students can use it to get answers to their homework problems, it is also an opportunity for colleges and universities to help shape how students engage with the technology and use it productively and responsibly.
Prompt engineering for GenAI is a skill that is already augmenting domain experts, much as computing has augmented other domains. The successful use of large language models (LLMs) relies heavily on giving the models the right prompts. Filling the role of a prompt engineer means finding a domain expert who can formulate a question with examples from a specific domain, a skill today's IT professionals must refine to successfully implement LLM applications. Given this, universities will introduce both general and discipline-specific prompt engineering to address the growing demand for professionals with the skills required to build the next generation of GenAI applications.
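As a small illustration of the discipline-specific prompting such programs would teach, here is a sketch of a few-shot prompt in which a domain expert supplies worked examples to steer the model. The data-integration task, the example mappings, and the call_llm() stand-in are assumptions for illustration, not a prescribed curriculum or a specific product's API.

```python
# Minimal sketch of a few-shot, domain-specific prompt (illustrative only).

# A domain expert supplies worked examples that show the model exactly
# what a good answer looks like for this task.
FEW_SHOT_PROMPT = """You are assisting a data integration team.
Map each source column name to the canonical field name in the customer schema.

Examples:
Source column: "cust_fname"
Canonical field: first_name

Source column: "CUSTOMER_SURNAME"
Canonical field: last_name

Source column: "emailAddr"
Canonical field: email

Now map the following column in the same way.
Source column: "{column}"
Canonical field:"""

def call_llm(prompt: str) -> str:
    """Stand-in for a real model API call. Returns a canned answer so the
    sketch runs; replace it with your provider's client."""
    return "phone_number"

def map_column(column: str) -> str:
    """Build the few-shot prompt for one column name and return the model's
    answer, trimmed to a single line."""
    response = call_llm(FEW_SHOT_PROMPT.format(column=column))
    return response.strip().splitlines()[0]

if __name__ == "__main__":
    print(map_column("cust_phone_nbr"))
```

The expertise lives in choosing representative examples and phrasing the task precisely, which is exactly the skill that general and discipline-specific prompt engineering courses would aim to develop.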
##
ABOUT THE AUTHOR
Greg is the Chief Scientist at SnapLogic and leads forward-looking research and innovation projects. He is also a Professor of Computer Science at the University of San Francisco. Greg has published several papers in the fields of operating systems, distributed systems, and programming languages. He is one of the original architects of the SnapLogic programming model and cloud platform. Greg also led the machine learning research work that resulted in SnapLogic's Iris AI.