NVIDIA 2021 Predictions: From AI to 5G, How Global Trends Will Transform Data Centers in 2021


Industry executives and experts share their predictions for 2021.  Read them in this 13th annual VMblog.com series exclusive.

From AI to 5G, How Global Trends Will Transform Data Centers in 2021

After COVID-related slowdown in investments, data centers poised to undergo rapid change as businesses adjust to accelerated digitization of information

By Kevin Deierling, Senior Vice President of NVIDIA Networking


The first thing an executive might ask about the data center is "What could change?" 

The answer: A lot. The entire data center architecture is set to undergo massive change after a year of relatively stagnant investment brought on by the COVID-19 pandemic. In 2021, servers, racks and cooling units will start to look vastly different thanks to GPU- and DPU-accelerated computing, smart NICs and AI-enabled software.

Open networking will become more valuable as the trillions of terabytes of data generated by consumers and businesses grow exponentially, driven by the proliferation of IoT devices and by the COVID-19 pandemic, which has accelerated online activity from shopping to remote work.

Here are some of the key trends we expect in 2021:

Accelerating Change in the Data Center

Accelerated applications will be offloaded from CPUs onto GPUs, and SmartNICs based on programmable data processing units (DPUs) will accelerate data center infrastructure services such as networking, storage and security. The addition of these GPUs and DPUs will deliver expanded application acceleration to all enterprise workloads and provide an extra layer of security. Virtualization and scaling will be faster, while CPUs will be freed up to run traditional applications faster and to offer accelerated services.

The new data center architecture will leverage software-defined, hardware-accelerated virtualization, which provides manageability, security and flexibility. It will also support containers, which ease adoption and management of AI frameworks.
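
To make the offload model concrete, here is a minimal sketch (not taken from any NVIDIA platform) of moving a computation from the CPU onto a GPU using the CuPy library; the library choice, array sizes and arithmetic are illustrative assumptions.

```python
# Minimal sketch of CPU-to-GPU offload using CuPy (illustrative; assumes an
# NVIDIA GPU and the cupy package are available).
import numpy as np
import cupy as cp

# Prepare data on the host (CPU).
host_a = np.random.rand(1_000_000).astype(np.float32)
host_b = np.random.rand(1_000_000).astype(np.float32)

# Offload: copy the arrays to GPU memory and run the computation there.
gpu_a = cp.asarray(host_a)
gpu_b = cp.asarray(host_b)
gpu_result = cp.sqrt(gpu_a * gpu_b + 1.0)   # executes on the GPU

# Bring the result back to the host only when the CPU needs it.
host_result = cp.asnumpy(gpu_result)
print(host_result[:5])
```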

A DPU is a new class of programmable processor: a system on a chip, or SoC, that combines three key elements:

  • An industry standard, high-performance, software programmable, multi-core CPU, typically based on the widely-used Arm architecture, tightly coupled to the other SOC components
  • A high-performance network interface capable of parsing, processing, and efficiently transferring data at line rate to GPUs and CPUs
  • A rich set of flexible and programmable acceleration engines that offload and improve application performance for AI and machine learning, security, telecommunications, and storage, among others.

All these DPU capabilities are critical to enable isolated, bare-metal, cloud-native computing that will define the next generation of cloud-scale computing.

NVIDIA and VMware engineering teams are working together to deliver an end-to-end enterprise platform for AI. This new platform will integrate AI software available from NVIDIA NGC (NVIDIA's AI-optimized container platform) with VMware vSphere, VMware Cloud Foundation and VMware Tanzu, making it easier to deploy and manage AI. Industries from financial services to healthcare to manufacturing will be able to deploy AI workloads using containers and virtual machines on the same platform.
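
As a rough illustration of that container-based deployment model (not the NVIDIA/VMware integration itself), the sketch below uses the Docker Python SDK to pull and run a GPU-enabled container from NGC; the image tag and the use of the Docker SDK are assumptions made for the example.

```python
# Illustrative sketch: launching an NGC AI container with GPU access via the
# Docker Python SDK. Assumes Docker, the NVIDIA Container Toolkit, and the
# docker Python package are installed; the image tag is an example only.
import docker

client = docker.from_env()

# Pull an AI framework container from NVIDIA NGC (nvcr.io).
image = "nvcr.io/nvidia/pytorch:20.12-py3"  # example tag
client.images.pull(image)

# Run a quick check that the container can see the GPUs.
output = client.containers.run(
    image,
    command="nvidia-smi",
    device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
    remove=True,
)
print(output.decode())
```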

End-user spending on global data center infrastructure is projected to climb to $200 billion in 2021, up 6 percent from 2020, according to the latest forecast from Gartner. GPUs and DPUs will be a part of this data center infrastructure, and will result in significant performance, efficiency, and security improvements.

AI as a Service

Companies that are reluctant to spend time and resources investing in AI, whether for financial reasons or otherwise, will begin turning to third-party providers to achieve rapid time to market. The broad ecosystem of AI companies developing on the NVIDIA CUDA framework and AI platforms will become key partners by providing access to software, infrastructure and solutions.

Transformational 5G

Companies will begin defining what "the edge" is. Autonomous driving is essentially a data center in the car, allowing AI to make instantaneous decisions while also sending data back for model training that will improve the in-car inference decisions. Similarly, with robots in the warehouse and the workplace, there will be inference at the edge and training in the core. Just as 4G spawned transformational change in transportation with Lyft and Uber, 5G will bring transformational new capabilities and business opportunities. It won't happen all at once, but you'll start to see the beginnings of companies seeking to take advantage of the confluence of AI, 5G and new computing platforms. The important attribute of these GPU- and DPU-accelerated AI platforms is that they are fully software defined. This allows businesses to quickly adapt to evolving technologies and rapidly deploy new services and business models.
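
To picture the split between inference at the edge and training in the core, here is a hedged Python sketch that runs a local ONNX model on incoming samples and queues low-confidence cases for upload to a central training service; the model file, confidence threshold and upload_to_core() helper are hypothetical.

```python
# Illustrative edge-inference loop: run a local model on each sample and send
# low-confidence samples back to the core for retraining. Assumes onnxruntime
# is installed; model path, threshold, and upload_to_core() are hypothetical.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("edge_model.onnx")   # hypothetical model file
input_name = session.get_inputs()[0].name

def upload_to_core(sample: np.ndarray) -> None:
    """Placeholder for shipping a sample back to the central training cluster."""
    pass

def handle_sample(sample: np.ndarray, confidence_threshold: float = 0.8):
    # Inference happens locally, at the edge, for an instantaneous decision.
    scores = session.run(None, {input_name: sample[np.newaxis, :]})[0][0]
    decision = int(np.argmax(scores))
    confidence = float(np.max(scores))

    # Hard cases are sent back to the core, where models are retrained.
    if confidence < confidence_threshold:
        upload_to_core(sample)
    return decision, confidence
```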

Hybrid Cloud

In addition to moving certain workloads into the public cloud, companies are also designing their own private cloud in an on-premises data center, controlling a dedicated private network and virtualized infrastructure. As a result, modern data centers are expected to provide features such as low-latency networking, built-in virtualization and container platforms, or even native support for databases and other advanced applications.  

These software-defined, hardware-accelerated stacks will take advantage of DPUs to accelerate networking, storage, security, and management applications. 

DPUs are an essential element of modern data centers. In this model the data center is the new unit of computing in which CPUs, GPUs and DPUs combine into a single computing unit that's fully programmable, AI-enabled, and can deliver greater levels of security, performance, and efficiency. 

This transforms the entire data center into a massive software-programmable unit of computing that can be provisioned and operated as a service. The modern data center can then be used not only for high performance computing and deep learning but for data analytics, remote workstations, and application virtualization.

"Someday, trillions of computers running AI will create a new internet - the internet-of-things - thousands of times bigger than today's internet-of-people,' says Jensen Huang, NVIDIA's CEO.  

It's already begun.

##

About the Author

Kevin Deierling is Senior Vice President of NVIDIA Networking


Kevin is an entrepreneur, innovator, and technology executive with a proven track record of creating profitable businesses in highly competitive markets.

Kevin has been a founder or senior executive at five startups that have achieved positive outcomes (three IPOs, two acquisitions). Combining technical and business expertise, he has served variously as chief officer of technology, architecture and marketing at these companies, where he led the development of strategy and products across a broad range of disciplines, including networking, security, cloud, Big Data, machine learning, virtualization, storage, smart energy, bio-sensors and DNA sequencing.

Kevin has over 25 patents in the fields of networking, wireless, security, error correction, video compression, smart energy, bio-electronics, and DNA sequencing technologies.

When not driving new technology, he finds time for fly-fishing, cycling, beekeeping, and organic farming.

Published Monday, December 14, 2020 8:00 AM by David Marshall