Oracle announced the first zettascale cloud computing clusters accelerated by the NVIDIA Blackwell platform. Oracle Cloud Infrastructure (OCI) is now taking orders for the largest AI supercomputer in the cloud, available with up to 131,072 NVIDIA Blackwell GPUs.
"We have one of the broadest AI infrastructure offerings and are
supporting customers that are running some of the most demanding AI
workloads in the cloud," said Mahesh Thiagarajan, executive vice
president, Oracle Cloud Infrastructure. "With Oracle's distributed
cloud, customers have the flexibility to deploy cloud and AI services
wherever they choose while preserving the highest levels of data and AI
sovereignty."
World's first zettascale computing cluster
OCI is now taking orders for the largest AI supercomputer in the
cloud, available with up to 131,072 NVIDIA Blackwell GPUs, delivering an
unprecedented 2.4 zettaFLOPS of peak performance. At maximum scale, OCI
Supercluster offers more than three times as many GPUs as the Frontier
supercomputer and more than six times as many as other hyperscalers. OCI
Supercluster includes OCI Compute Bare Metal, ultra-low-latency RoCEv2
networking with ConnectX-7 NICs and ConnectX-8 SuperNICs or NVIDIA
Quantum-2 InfiniBand-based networking, and a choice of HPC storage.
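As a rough check, the headline 2.4 zettaFLOPS is consistent with the full 131,072-GPU count multiplied by a low-precision per-GPU peak. The per-GPU figure below is an assumption, since the release does not state the precision or per-GPU performance.

```python
# Back-of-envelope check of the 2.4 zettaFLOPS peak claim.
# ASSUMPTION: ~18.3 petaFLOPS of low-precision (sparse FP4) peak per
# Blackwell GPU; the release does not state the per-GPU figure.
gpus = 131_072
assumed_pflops_per_gpu = 18.3                       # petaFLOPS, assumed
cluster_peak_zettaflops = gpus * assumed_pflops_per_gpu / 1_000_000
print(f"{cluster_peak_zettaflops:.2f} zettaFLOPS")  # ~2.40
```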
OCI Superclusters are orderable with OCI Compute powered by NVIDIA H100
Tensor Core GPUs, NVIDIA H200 Tensor Core GPUs, or NVIDIA Blackwell GPUs.
OCI Superclusters with H100 GPUs can scale up to 16,384 GPUs with up to
65 exaFLOPS of performance and 13 Pb/s of aggregated network throughput.
OCI Superclusters with H200 GPUs will scale to 65,536 GPUs with up to 260
exaFLOPS of performance and 52 Pb/s of aggregated network throughput, and
will be available later this year. OCI Superclusters with NVIDIA GB200 NVL72
liquid-cooled bare-metal instances will use NVLink and NVLink Switch to
enable up to 72 Blackwell GPUs to communicate with each other at an
aggregate bandwidth of 129.6 TB/s in a single NVLink domain. NVIDIA
Blackwell GPUs, available in the first half of 2025, will combine
fifth-generation NVLink, NVLink Switch, and cluster networking to
enable seamless GPU-to-GPU communication in a single cluster.
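The per-generation figures also line up with commonly published per-GPU peaks, used here as assumptions: roughly 3.96 petaFLOPS of FP8-with-sparsity peak for H100/H200 and 1.8 TB/s of fifth-generation NVLink bandwidth per Blackwell GPU.

```python
# Sanity checks on the quoted cluster figures; per-GPU peaks are assumptions
# based on commonly published specifications, not figures from this release.
H100_FP8_SPARSE_PFLOPS = 3.958   # assumed H100/H200 FP8-with-sparsity peak
NVLINK5_TBPS_PER_GPU = 1.8       # assumed fifth-gen NVLink bandwidth per GPU

print(16_384 * H100_FP8_SPARSE_PFLOPS / 1_000)  # ~64.8  exaFLOPS (quoted: 65)
print(65_536 * H100_FP8_SPARSE_PFLOPS / 1_000)  # ~259.4 exaFLOPS (quoted: 260)
print(72 * NVLINK5_TBPS_PER_GPU)                # 129.6 TB/s NVLink domain aggregate
```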
"As businesses, researchers and nations race to innovate using AI,
access to powerful computing clusters and AI software is critical," said
Ian Buck, vice president of Hyperscale and High Performance Computing,
NVIDIA. "NVIDIA's full-stack AI computing platform on Oracle's broadly
distributed cloud will deliver AI compute capabilities at unprecedented
scale to advance AI efforts globally and help organizations everywhere
accelerate research, development and deployment."
Customers such as WideLabs and Zoom are leveraging OCI's
high-performance AI infrastructure with powerful security and sovereignty
controls.
WideLabs trains one of the largest Portuguese LLMs on OCI
WideLabs, an applied AI startup in Brazil, is training one of
Brazil's largest LLMs, Amazonia IA, on OCI. The company developed bAIgrapher,
an application that uses its LLM to generate biographical content based
on data collected from patients with Alzheimer's disease to help them
preserve important memories.
WideLabs uses the Oracle Cloud São Paulo Region to run its AI
workloads, ensuring that sensitive data remains within country borders.
This enables WideLabs to adhere to Brazilian AI sovereignty requirements
by being able to control where its AI technology is deployed and
operated. WideLabs uses OCI AI infrastructure with NVIDIA H100 GPUs to
train its LLMs, as well as Oracle Kubernetes Engine
to provision, manage, and operate GPU-accelerated containers across an
OCI Supercluster consisting of OCI Compute connected with OCI's
RDMA-based cluster networking.
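For readers unfamiliar with how GPU-accelerated containers are provisioned on a Kubernetes service such as Oracle Kubernetes Engine, the sketch below shows the general pattern of requesting GPUs through the nvidia.com/gpu extended resource via the Kubernetes Python client. It is illustrative only, not WideLabs' configuration; the image name, namespace, and GPU count are placeholders.

```python
# Minimal sketch: scheduling a GPU-accelerated training container on a
# Kubernetes cluster (e.g., Oracle Kubernetes Engine). Illustrative only.
from kubernetes import client, config

config.load_kube_config()  # use the cluster's kubeconfig

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="llm-train-worker"),
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="trainer",
                image="example.ocir.io/llm-train:latest",  # hypothetical image
                command=["python", "train.py"],
                resources=client.V1ResourceRequirements(
                    # The NVIDIA device plugin exposes GPUs as the extended
                    # resource nvidia.com/gpu; the count here is a placeholder.
                    limits={"nvidia.com/gpu": "8"},
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```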
"OCI AI infrastructure offers us the most efficiency for training and
running our LLMs," said Nelson Leoni, CEO, WideLabs. "OCI's scale and
flexibility are invaluable as we continue to innovate in the healthcare
space and other key sectors."
Zoom uses OCI's sovereignty capabilities for its generative AI assistant
Zoom, a leading AI-first collaboration platform, is using OCI to
provide inference for Zoom AI Companion, the company's AI personal
assistant available at no additional cost. Zoom AI Companion helps users
draft emails and chat messages, summarize meetings and chat threads,
generate ideas during brainstorms with colleagues, and more. OCI's data
and AI sovereignty capabilities will help Zoom keep customer data
in-region and support AI sovereignty requirements in Saudi
Arabia, where OCI's solution is being rolled out initially.
"Zoom AI Companion is revolutionizing the way organizations work,
with cutting-edge generative AI capabilities available at no additional
cost with customers' paid accounts," said Bo Yan, head of AI, Zoom. "By
harnessing OCI's AI inference capabilities, Zoom is able to deliver
accurate results at low latency, empowering users to collaborate
seamlessly, communicate effortlessly, and boost productivity,
efficiency, and potential like never before."