CircleCI
announced it has implemented a gen2 GPU resource class built on Amazon
Elastic Compute Cloud (Amazon EC2) G5 instances, which offer the latest
generation of NVIDIA GPUs, along with new images tailored for artificial
intelligence/machine learning (AI/ML) workflows. These under-the-hood
enhancements put
cost-effective, powerful resources at developers' fingertips to accelerate AI
innovation. Additionally, the company launched key new features for teams
building Large Language Model (LLM)-powered applications:
inbound webhooks, including support for tools like Hugging Face, and an
integration with evaluation platform
LangSmith. The
company also released a
CircleCI Orb for Amazon
SageMaker to help software teams deploy
and monitor ML models at scale.
As organizations race to build, deploy, and
manage AI-enabled applications on a commercial scale, they're grappling with
how to get started. With its new inbound webhooks and evaluation platform
integrations, CircleCI is redefining what CI/CD tools do to manage the novel
complexity that AI/ML introduces so that teams can confidently go from idea to
innovation.
"AI's infusion into applications increases the
complexity of risk and trust issues with the addition of autonomous software.
Traditional testing tools don't deal with probabilistic or uncertain testing
outcomes," wrote Diego Lo Giudice, Vice President and Principal Analyst at
Forrester. "Because testing AI-enabled applications is a key element of gaining
users' trust in AI-based systems, developing a systematic and industrialized
approach to testing AI-infused applications is now table stakes."
CircleCI's latest features are pivotal in
helping teams manage the complexity of developing AI-powered applications by
providing a structured, automated pipeline that spans building, testing,
training, and monitoring. Its new inbound webhooks are adaptable to any
source of change, making CircleCI the most change-agnostic CI/CD tool on the
market. This marks a necessary departure from a version-control-centric
approach: users can now trigger pipelines from a variety of sources. This
shift is crucial because AI-powered applications live outside
the repository, with code, data, and the LLM all interacting to drive novel
product experiences for end consumers. Therefore, engineering teams must
rethink how they test, release, and retrain their applications.
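To make the idea concrete, the sketch below shows how any external system
could trigger a pipeline through an inbound webhook. It is a minimal
illustration rather than CircleCI's documented API: the webhook URL, shared
secret, and payload fields are placeholders for values that would come from
the webhook trigger configured in a project's settings.

```python
# Minimal sketch: any event source (a model registry, a data pipeline, a scheduled
# job) POSTs a JSON payload to an inbound webhook URL to kick off a CircleCI pipeline.
# WEBHOOK_URL and WEBHOOK_SECRET are placeholders taken from project settings.
import requests

WEBHOOK_URL = "https://example.invalid/circleci-inbound-webhook"  # placeholder
WEBHOOK_SECRET = "shared-secret-from-project-settings"            # placeholder

payload = {
    # Illustrative fields only; an inbound webhook can accept an arbitrary JSON body.
    "event": "training-data-refreshed",
    "dataset_version": "2024-01-15",
}

response = requests.post(
    WEBHOOK_URL,
    json=payload,
    params={"secret": WEBHOOK_SECRET},  # assumption: secret passed as a query parameter
    timeout=30,
)
response.raise_for_status()
print("Pipeline trigger accepted:", response.status_code)
```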
The custom Hugging Face integration is a new
trigger to kick off pipelines. Developers can now run automated workflows any
time a model on Hugging Face changes, so engineering teams can be confident
that their application continues to behave as expected. LangSmith is the
first of many evaluation platforms to be supported on CircleCI, enabling
robust testing of non-deterministic outcomes. Testing
AI-enabled applications is new territory for professional software teams, and
these new capabilities from CircleCI will dramatically up-level developer
confidence when building and validating LLM-powered software.
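As an illustration of what such an evaluation step might look like inside a
pipeline, the following sketch uses the LangSmith Python SDK's evaluate
helper. The dataset name, the target function, and the grading rule are
hypothetical, and a LangSmith API key is assumed to be available in the
environment; it is meant only to show the shape of a check on
non-deterministic output running in CI.

```python
# Minimal sketch of a LangSmith evaluation run executed as a CI step.
# Assumes LANGCHAIN_API_KEY is set in the environment and that a dataset named
# "prod-summaries-regression" exists in LangSmith (both are illustrative).
from langsmith.evaluation import evaluate


def summarize(inputs: dict) -> dict:
    """Stand-in for the LLM-powered component under test (hypothetical)."""
    # A real pipeline would call the application's summarization chain here.
    return {"summary": inputs["document"][:200]}


def length_ok(run, example) -> dict:
    """Deterministic guardrail on a non-deterministic output: the summary must
    be non-empty and at most 500 characters."""
    summary = (run.outputs or {}).get("summary", "")
    return {"key": "length_ok", "score": int(0 < len(summary) <= 500)}


results = evaluate(
    summarize,                          # target under test
    data="prod-summaries-regression",   # hypothetical dataset name
    evaluators=[length_ok],
    experiment_prefix="ci-run",
)
```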
"Software teams are building the next wave of
AI-powered applications that solve specific customer pain points," said Rob
Zuber, CTO at CircleCI. "While many teams find it difficult to get started, at
the end of the day, we're still building software. You already have 95% of the
tools needed to do it. By supporting AI product builders with CircleCI's
comprehensive CI/CD tooling, engineering teams can confidently build upon years
of key learnings while also addressing the novel changes AI introduces."
CircleCI also introduced its new Orb for Amazon
SageMaker to help teams ship their models to production. The Orb enables
basic deployment to Amazon SageMaker, monitors deployments for problems, and
quickly rolls back endpoints should something go awry. It also supports
different deployment strategies, namely canary and blue/green deployments.
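The Orb itself is configured in a project's CircleCI config, but the
underlying mechanics can be sketched against the Amazon SageMaker API. The
example below is a hedged illustration rather than the Orb's actual
implementation: it uses boto3 to roll out a new endpoint configuration
blue/green with a canary traffic shift and automatic rollback on a CloudWatch
alarm. The endpoint, endpoint-config, and alarm names are placeholders.

```python
# Illustrative sketch of a canary-style blue/green endpoint update with automatic
# rollback, using the Amazon SageMaker UpdateEndpoint API via boto3. Names are
# placeholders; this is not the Orb's actual implementation.
import boto3

sagemaker = boto3.client("sagemaker")

sagemaker.update_endpoint(
    EndpointName="llm-app-endpoint",                  # placeholder
    EndpointConfigName="llm-app-endpoint-config-v2",  # placeholder (new model version)
    DeploymentConfig={
        "BlueGreenUpdatePolicy": {
            "TrafficRoutingConfiguration": {
                # Shift 10% of capacity to the new fleet, wait, then shift the rest.
                "Type": "CANARY",
                "CanarySize": {"Type": "CAPACITY_PERCENT", "Value": 10},
                "WaitIntervalInSeconds": 300,
            },
            "TerminationWaitInSeconds": 120,
            "MaximumExecutionTimeoutInSeconds": 1800,
        },
        "AutoRollbackConfiguration": {
            # If this CloudWatch alarm fires during the rollout, SageMaker rolls the
            # endpoint back to the previous configuration automatically.
            "Alarms": [{"AlarmName": "llm-app-endpoint-5xx-alarm"}],  # placeholder
        },
    },
)
```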