F5 announced BIG-IP Next Cloud-Native Network
Functions (CNFs) deployed on NVIDIA BlueField-3 DPUs, deepening the
companies' technology collaboration.
This solution offers F5's proven network infrastructure capabilities,
such as edge firewall, DNS, and DDoS protection, as lightweight
cloud-native functions accelerated with NVIDIA BlueField-3 DPUs to
deliver optimized performance in Kubernetes environments and support
emerging edge AI use cases.
The F5 Application Delivery and Security Platform
powers a majority of the world's Tier-1 5G, mobile, and fixed-line
telco networks. Service providers recognize the challenges of scaling AI
applications across distributed environments, particularly as legacy
infrastructures in the network core often lack the processing power
required to make AI inferencing practical.
F5 CNFs running on NVIDIA DPUs can now be embedded in edge and far edge
infrastructures to optimize computing resources, dramatically reduce
power consumption per Gbps, and limit overall operating expenses.
Using edge environments to add functionality and AI capabilities to
subscriber services also brings added security requirements, which F5
and NVIDIA BlueField technologies deliver alongside advanced traffic
management while minimizing latency.
Deploying CNFs at the edge puts applications closer to users and their
data, promoting data sovereignty, improving user experience, and
reducing costs related to power, space, and cooling. Low latency
remains essential for AI applications and capabilities such as:
- Immediate decision making, supporting autonomous vehicles and fraud detection.
- Real-time user interaction, including NLP tools and AR/VR experiences.
- Continuous monitoring and response, required for healthcare devices and manufacturing robotics.
Including CNFs on BlueField-3 DPUs expands on F5's previously introduced BIG-IP Next for Kubernetes deployed on NVIDIA DPUs. F5 continues to leverage the NVIDIA DOCA software framework
to seamlessly integrate its solutions with NVIDIA BlueField DPUs. This
comprehensive development framework provides F5 with a robust set of
APIs, libraries, and tools to harness the hardware acceleration
capabilities of NVIDIA BlueField DPUs. By utilizing DOCA, F5 achieves
rapid integration and high performance across various networking and
security offloads while maintaining forward and backward compatibility
across generations of BlueField DPUs. Further, accelerating F5 CNFs with
NVIDIA BlueField-3 frees up CPU resources that can be used to run
other applications.
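In practice, CNFs like these are consumed as ordinary Kubernetes workloads that are scheduled onto DPU-equipped nodes. As a rough illustration only (the resource name, node label, and image below are hypothetical placeholders, not F5's or NVIDIA's actual identifiers), a CNF deployment that requests DPU hardware through a Kubernetes device plugin might look like:

```yaml
# Hypothetical sketch: deploying a DPU-accelerated CNF on Kubernetes.
# Resource names, labels, and images are illustrative placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: cnf-edge-firewall          # placeholder CNF name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: cnf-edge-firewall
  template:
    metadata:
      labels:
        app: cnf-edge-firewall
    spec:
      nodeSelector:
        example.com/dpu: "true"    # hypothetical label marking BlueField-equipped nodes
      containers:
      - name: firewall
        image: registry.example.com/cnf/edge-firewall:latest  # placeholder image
        resources:
          limits:
            example.com/bluefield-dpu: 1  # hypothetical device-plugin resource
```

The key idea is that the device-plugin resource request and node selector let the scheduler place CNF pods only on nodes whose DPUs can take over the networking and security offloads, leaving host CPUs free for other workloads.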
Edge deployments open up key opportunities for service providers,
including distributed N6-LAN capabilities for UPFs, and edge security
services to support Distributed Access Architecture (DAA) and Private
5G. In addition, AI-RAN is gaining momentum, with SoftBank recently showcasing its production environment with NVIDIA.
Unlocking the potential of AI-RAN with NVIDIA and F5
AI-RAN
seeks to transform mobile networks into multi-purpose infrastructures
that maximize resource utilization, create new revenue streams through
hosted AI services, and improve cost efficiency. Enabling mobile
providers to support distributed AI computing with reliable, secure, and
optimized connectivity, AI-RAN strengthens edge infrastructure
capabilities by taking advantage of otherwise dormant processing power.
Together, BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs will accelerate
AI-RAN deployments with streamlined traffic management for both AI and
RAN workloads, as well as provide enhanced firewall and DDoS
protections. Multi-tenancy and tenant isolation for workloads tied to
essential capabilities will be natively integrated into the solution.
With F5 and NVIDIA, mobile providers can intelligently leverage the same
RAN compute infrastructure to power AI offerings alongside existing RAN
services, driving significant cost savings and revenue potential
through enhanced user offerings.