Virtualization Technology News and Information
Couchbase 2024 Predictions: Developers Will Uncover the Power of AI and Large Language Models, While Investments in New Buzzworthy Tech Prevails


Industry executives and experts share their predictions for 2024.  Read them in this 16th annual VMblog.com series exclusive.

Developers Will Uncover the Power of AI and Large Language Models, While Investments in New Buzzworthy Tech Prevails

By Laurent Doguin, Director of Developer Relations and Strategy, Couchbase

AI and large language models (LLMs) have the power to change the game for developers - they can help developers write code faster, automate debugging and testing, and increase overall productivity. But the hype won't pay off overnight. Developers need to ensure AI-based solutions meet their needs, and this will require fine-tuning to personalize LLMs. Low-code and no-code tools, along with a decentralized approach to training AI models, can also address organization-wide needs around AI, such as privacy and security. Additionally, in the midst of the AI storm, investments in other buzzworthy technologies will grow.

Developers will brush up their Python skills to keep up with LLMs

While it was fairly easy to skip the Kubernetes hype and other infrastructure-related trends, AI and the focus on LLMs will be nearly impossible for developers to ignore.

Improving developer productivity is critical to the success of any technology organization. Each developer has different tasks, different stacks to work with, and different requirements of the various LLMs available. As a result, developers will need to start fine-tuning LLMs to their specific needs, and Python is the default programming language for doing so.
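To make the idea concrete, here is a toy sketch of what a fine-tuning loop looks like in Python - a two-parameter linear regressor standing in for an LLM, with hypothetical "pretrained" weights nudged toward a small task-specific dataset. Real fine-tuning would use a deep learning framework, but the loop has the same shape:

```python
# Toy illustration of fine-tuning: start from "pretrained" parameters and
# nudge them toward a small task-specific dataset with gradient descent.
# An LLM fine-tune follows the same pattern, just with a transformer and
# an optimizer from a deep learning framework.

def fine_tune(weight, bias, data, lr=0.05, epochs=2000):
    """Gradient descent on mean squared error for y = weight * x + bias."""
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in data:
            err = (weight * x + bias) - y
            grad_w += 2 * err * x / len(data)
            grad_b += 2 * err / len(data)
        weight -= lr * grad_w
        bias -= lr * grad_b
    return weight, bias

# "Pretrained" parameters, then a handful of task-specific examples of y = 2x + 1.
pretrained_w, pretrained_b = 1.0, 0.0
task_data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = fine_tune(pretrained_w, pretrained_b, task_data)
print(round(w, 2), round(b, 2))  # parameters shift from (1.0, 0.0) toward w=2, b=1
```

The point is not the toy model but the workflow: a base model, a small private dataset, and a short training loop that specializes the former to the latter.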

Generating code with AI will be easily accessible through low-code/no-code

Everybody's talking about generative AI and LLMs, and for good reason. However, there are still many unanswered questions around regulation, privacy, security, intellectual property and more.

Developers' natural response to these challenges often mirrors a curious, entrepreneurial approach - many take the "I'm going to build my own thing to solve this problem" route. But as developers' interest grows around a particular area, the general level of knowledge rises, tooling improves and abstractions move up, so it becomes easier to tackle that problem.

If we look at the most effective ways to fine-tune models, the work is often well-suited to a chain of tasks, and a chain of tasks can be represented graphically with low-/no-code tooling. As such, we can expect low-/no-code and generative AI to work hand in hand in boosting everyone's productivity.
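As a rough sketch of the kind of pipeline such tooling generates behind its visual canvas, the chain below wires a few plain Python functions into a sequence. The step names are hypothetical and the model call is a stand-in, not a real LLM API:

```python
# A "chain of tasks" as a low-/no-code tool might wire it up visually:
# each node is a plain function, and the chain runs them in order,
# feeding each step's output into the next.

def clean_text(text):
    # Normalize whitespace and casing before the model sees the text.
    return " ".join(text.split()).lower()

def build_prompt(text):
    # Wrap the cleaned text in a task instruction.
    return f"Summarize: {text}"

def call_model(prompt):
    # Stand-in for an actual LLM call.
    return f"[model output for: {prompt}]"

def run_chain(steps, value):
    for step in steps:
        value = step(value)
    return value

result = run_chain([clean_text, build_prompt, call_model], "  Federated   Learning  ")
print(result)  # [model output for: Summarize: federated learning]
```

Dragging boxes onto a canvas and drawing arrows between them is, in effect, building the `steps` list without writing the code by hand.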

Federated learning will be a key player in the future of AI, especially in a privacy-challenged world

While it's true that we may have exhausted the traditional training datasets for AI, there are still many untapped sources of data. Much of what we do on our devices syncs to a server, where it can be used to train LLMs or fine-tune them for a specific usage.

That's where federated learning comes in. With the recent popularity of generative AI, there has been more buzz around adopting a decentralized approach to training AI models - also known as federated learning.

By having the ability to secure training models and support privacy-sensitive applications, federated learning will be a critical player in unlocking the future of AI, while addressing crucial concerns around data privacy and security.
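A minimal sketch of the idea, assuming a toy one-parameter model: each client trains on its own private data, and only the resulting weights - never the raw data - are sent to the server, which averages them (the federated averaging pattern):

```python
# Minimal sketch of federated averaging: clients train locally on private
# data; the server only ever sees model parameters, which it averages.
# Toy one-parameter model (y = weight * x), purely for illustration.

def local_train(weight, data, lr=0.01, steps=50):
    """Each client fits y = weight * x on its own private data."""
    for _ in range(steps):
        grad = sum(2 * ((weight * x) - y) * x for x, y in data) / len(data)
        weight -= lr * grad
    return weight

def federated_round(global_weight, client_datasets):
    """Server averages locally trained weights; raw data never leaves the device."""
    local_weights = [local_train(global_weight, data) for data in client_datasets]
    return sum(local_weights) / len(local_weights)

# Two clients whose private data both follow y = 3x.
clients = [[(1, 3), (2, 6)], [(3, 9), (4, 12)]]
w = 0.0
for _ in range(5):
    w = federated_round(w, clients)
print(round(w, 3))  # converges toward 3.0 without pooling the data
```

The privacy win is structural: the server learns a useful global model, yet no client dataset ever crosses the network.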

Investments in WebAssembly will continue to grow

While it gets less attention than the popular and trending large language models, WebAssembly (WASM) is one to look out for. WASM is still evolving and has seen recent advancements, with companies continuing to invest in the future of the technology alongside the Bytecode Alliance.

WASM is making its way into databases, cloud platforms, edge devices, various SaaS technologies and more. New technologies are being developed to increase WASM interoperability and create even more value around it. As WASM continues to mature and more companies invest in it, we will see usage increase both in and out of the browser, especially as new runtimes and compilers are developed.

Fine-tuning LLMs with Low-code/No-code and Federated Learning Will Lead to More AI-based Innovation in 2024

2024 is poised to be a transformative year for the tech landscape, with noteworthy trends shaping the future of innovation. Developers, recognizing the indispensability of LLMs, are honing their skills to adapt these models to their applications, ushering in a new era of personalized AI solutions. I look forward to a future of even more modernized cloud-based and edge applications, and to developer-led creative outcomes with AI in the toolset.

##

ABOUT THE AUTHOR

Laurent Doguin 

Laurent Doguin is the Director of Developer Relations and Strategy at Couchbase, the cloud database platform company. Previously, he was a Developer Advocate at Couchbase where he focused on helping Java developers. Prior to Couchbase, Laurent held developer roles at Clever Cloud and Nuxeo.

Published Thursday, November 30, 2023 10:02 AM by David Marshall