A KubeCon Q&A with Briana Frank, Vice President of
Product & Design for IBM Cloud
1. How has the developer role
evolved in the last two years?
As
more enterprises adopt AI technologies, we've seen a shift in the role of
developers. For one, more enterprises are offloading some of the more
operational tasks that developers were responsible for to AI solutions. offloading the operational tasks is
allowing developers to be more entrepreneurial and inventing new ways to
harness the power of AI to delight users. At the same time, many developers are
shifting to become "prompt engineers" - helping to guide AI and LLMs to produce
more accurate outputs. There is a huge opportunity here for developers to not
only become more familiar with the AI they're using but help their enterprises
use AI more efficiently.
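The "prompt engineer" role described above can be made concrete with a small sketch. This is an illustrative example, not a specific IBM tool: the `TEMPLATE` text and `build_prompt` helper are hypothetical, showing how a developer constrains an LLM toward more accurate outputs by grounding it in known facts and restricting its answer format.

```python
# Hypothetical prompt-engineering helper: wrap a user question in a
# template that grounds the model in supplied facts and constrains
# what it may say when the facts are insufficient.

TEMPLATE = """You are a support assistant. Answer using ONLY the facts below.
If the facts do not contain the answer, reply exactly: "I don't know."

Facts:
{facts}

Question: {question}
Answer:"""

def build_prompt(question: str, facts: list[str]) -> str:
    """Assemble a grounded, constrained prompt for an LLM."""
    fact_lines = "\n".join(f"- {f}" for f in facts)
    return TEMPLATE.format(facts=fact_lines, question=question)

prompt = build_prompt(
    "Which regions is the service available in?",
    ["The service is available in us-south and eu-de."],
)
print(prompt)
```

The point is not the template wording but the discipline: the developer decides what context the model sees and what it is allowed to claim, which is where much of the accuracy gain comes from.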
2. What are 2-3 skills that
are essential for developers today to prepare for the rise of Gen AI and
cloud technologies?
For
organizations, Gen AI is creating numerous possibilities, but it has also shined
a light on the new skills developers need in order to adopt this technology and
execute tasks with it. One of the most important skills is knowing how to
integrate Gen AI into existing applications while also understanding
how to select the appropriate new tools and models for your organization.
Another
crucial skill, driven by the rise in Gen AI adoption,
is understanding the ethical implications of AI. If AI is not
deployed responsibly and transparently, it could have real-world consequences -
especially in sensitive, safety-critical areas and within regulated industries.
Developers need solutions that can help them manage risk, enhance
transparency, and prepare for compliance with future AI-related regulations.
3. How are developers driving
the next wave of Generative AI with RAG and open source technologies?
Generative AI holds great promise, but it can be flawed - AI chatbots, for
example, can produce inconsistent results because of the large, uncurated
stores of data they draw on. The next wave of generative AI
requires grounding and customizing foundation models, which Retrieval-Augmented
Generation (RAG) and open-source technologies support. RAG, for example,
has emerged as a promising way to ground large language models (LLMs) in
the most accurate, up-to-date information, allowing them to produce increasingly
accurate results and enabling better generative AI outcomes for organizations
across industries using LLMs.
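The RAG pattern described above can be sketched in a few lines. This is a toy illustration, not a specific IBM product API: the keyword-overlap `score` function stands in for a real vector search, and `build_grounded_prompt` is a hypothetical helper showing how retrieved context is injected into the prompt before the LLM is called.

```python
# Toy RAG pipeline: retrieve the most relevant documents for a query,
# then assemble a prompt grounded in that retrieved context.

def score(query: str, doc: str) -> int:
    """Count words shared by query and document (toy relevance measure)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents with the highest overlap score."""
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_grounded_prompt(query: str, docs: list[str]) -> str:
    """Ground the model in retrieved context rather than its training data."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return (f"Use only this context to answer.\n"
            f"Context:\n{context}\n\nQuestion: {query}\nAnswer:")

docs = [
    "Code Engine autoscales containers on IBM Cloud.",
    "RAG grounds LLM answers in retrieved documents.",
    "Salt Lake City hosts KubeCon North America 2024.",
]
print(build_grounded_prompt("How does RAG ground LLM answers?", docs))
```

In production the overlap score would be replaced by embedding similarity over a vector index, but the shape of the pipeline - retrieve, then ground, then generate - is the same.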
InstructLab
is an example of a revolutionary open source technology that puts LLM
development into the hands of the open-source developer community, allowing
them to collectively contribute new skills and knowledge to any LLM. Developers
can rapidly iterate and merge skill contributions, enabling one base
model to be improved by the entire community and, in turn, to
produce more consistent and accurate results.
4. Why are more developers
adopting serverless and low-code technologies to support AI initiatives?
AI initiatives and solutions have brought on
a daunting influx of data that developers are tasked with managing, especially
when they must manage the underlying infrastructure as well. Some developer
teams are spending too much time worrying about their surrounding
infrastructure when deploying AI applications on Kubernetes, taking valuable
time away from where their focus should be: driving business outcomes and
innovation through these AI applications. One way to think about this is
driving a car - you want to be able to drive without worrying
about the engine under the hood. Serverless and low-code technologies support AI
initiatives by making sure that engine keeps running smoothly no matter what.
With IBM Cloud Code Engine serverless
computing, IBM deploys, manages, and autoscales our clients' Kubernetes clusters.
This increases productivity while decreasing time to deployment.
Developers gain access to infrastructure where they can run containers in
the cloud without owning or managing their environments.
Overall, serverless and low-code technologies help combat the challenges
developers face with Kubernetes and managing AI workloads.
5. What are some concerns that
developers should be aware of regarding data management in AI
technologies, especially when it comes to international data exchange?
The
rise in data privacy regulations - both in the U.S. and abroad - has created a
hyper-focused lens on security and compliance when it comes to data management.
This focus will only continue as international laws continue to evolve and
security risks become more sophisticated. This is especially true for
developers working in highly regulated industries, such as financial services
and healthcare.
In
response to the evolving regulatory landscape, federal banking agencies have enacted
rigorous data retention requirements. Developers need solutions that make
retained data unalterable at the storage layer, safeguarding it from
any tampering or manipulation. We
offer a solution called Cloud Object Storage that addresses this issue head on.
We also offer clients the IBM Security and Compliance Center, which is an
integrated solutions suite that has an "always on" approach to security and
compliance. It implements controls for secure data and workload deployment so
developers can ensure all their security and compliance goals are being met.
6. What are some 2025
predictions for developer and enterprise AI trends?
One
trend that is already starting to impact developers is the emergence of new
compliance and security regulations related to AI and data management. We expect these
regulations to grow in the years to come. Enterprises need to prepare for them
now by adopting solutions with the right technology and
controls in place so that compliance doesn't slow down their development
velocity.
Another
trend we can anticipate seeing is that AI is going to be more accessible to a
broader pool of IT talent. You no longer need to be a developer to train a
Large Language Model (LLM). Open source projects like InstructLab are enabling
contributed updates to existing LLMs in an accessible way, allowing more
individuals to contribute to AI projects even if they don't have a deep
skillset in AI/ML technologies.
Lastly,
we are seeing more enterprises bring
on platform engineers to support their evolving AI needs. Platform engineers
are responsible for building and maintaining the platforms used for software
development, which can include anything from administering deployable
architectures with set configurations to managing deployments and reporting on
the success of their configurations and application environments.
To learn more about Kubernetes and the cloud
native ecosystem, join us at KubeCon + CloudNativeCon North
America, in Salt Lake City, Utah, on November 12-15,
2024.
##
Briana Frank, VP Product & Design
IBM Cloud at IBM
Briana leads the Product Management and
Design organizations within IBM Cloud. Briana strives to make the world work
better for her clients by creating intuitive cloud user experiences through
product management best practices, design thinking and operational rigor.
Briana is an entrepreneur with a proven track record of building high-growth,
award-winning products by listening to clients' needs and following through
with experiences that delight users.
Briana is also a strong advocate for diversity and inclusion in the tech
industry.
In addition to her work at IBM, Briana is also an active member of the tech
community and a patented inventor. Briana volunteers her time by mentoring
small business owners in underrepresented groups. She frequently speaks at
industry events and has published numerous articles on the topics of digital
transformation and cloud computing.