Secure AI Data Centers at Scale: Next-Gen DGX SuperPOD Opens Era of Cloud-Native Supercomputing


As enterprises extend the power of AI and data science to every developer, IT needs to deliver seamless, scalable access to supercomputing with cloud-like simplicity and security.

At GTC21, we introduced the latest NVIDIA DGX SuperPOD, which gives business, IT and their users a platform for securing and scaling AI across the enterprise, with the necessary software to manage it as well as a white-glove services experience to help operationalize it.

Solving AI Challenges of Every Size, at Massive Scale

Since its introduction, DGX SuperPOD has enabled enterprises to scale their development on infrastructure that can tackle problems of a size and complexity that were previously unsolvable in a practical amount of time. It's AI infrastructure built and managed the way NVIDIA does its own.

As AI becomes infused into nearly every aspect of modern business, the need to deliver almost limitless access to the computational resources that power development has been scaling exponentially. This escalation in demand is exemplified by business-critical applications like natural language processing, recommender systems and clinical research.

Organizations typically tap into the power of DGX SuperPOD in two ways. Some use it to solve huge, monolithic problems such as conversational AI, where the computational power of an entire DGX SuperPOD is brought to bear to accelerate the training of complex natural language processing models.

Others use DGX SuperPOD to serve an entire company, giving multiple teams access to the system to support fluctuating needs across a wide variety of projects. In this mode, enterprise IT often acts as a service provider, managing this AI infrastructure-as-a-service, with many users (perhaps even adversarial ones) who need and expect complete isolation of each other's work and data.

DGX SuperPOD with BlueField DPU

Increasingly, enterprises want to bring the world of high-performance AI supercomputing into an operational mode where many developers can be assured their work is secure and isolated, just as it is in the cloud, and where IT can manage the environment much like a private cloud, with the ability to deliver resources to jobs, right-sized to the task, in a secure, multi-tenant environment.

This is called cloud-native supercomputing, and it's enabled by NVIDIA BlueField-2 DPUs, which bring accelerated, software-defined data center networking, storage, security and management services to AI infrastructure.

With a data processing unit optimized for enterprise deployment and 200 Gbps network connectivity, enterprises get state-of-the-art, accelerated, fully programmable networking that implements zero trust security to protect against breaches and isolate users and data, with bare-metal performance.

Every DGX SuperPOD now has this capability with the integration of two NVIDIA BlueField-2 DPUs in each DGX A100 node. IT administrators can use the offload, accelerate and isolate capabilities of NVIDIA BlueField DPUs to implement secure multi-tenancy for shared AI infrastructure without impacting the AI performance of the DGX SuperPOD.
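As a rough sketch of what multi-tenant partitioning of a shared pod can look like from the administrator's side (the node names, tenant labels and allocation logic below are illustrative assumptions, not the BlueField, DOCA or DGX SuperPOD management APIs), tooling might dedicate whole nodes to a tenant and never let two tenants share one:

# Hypothetical sketch of multi-tenant partitioning bookkeeping for a shared
# AI cluster. Node and tenant names are illustrative; this is not the
# BlueField/DOCA or DGX SuperPOD management API.
from collections import defaultdict


class TenantPartitioner:
    """Tracks which DGX nodes are dedicated to which tenant."""

    def __init__(self, nodes):
        self.free_nodes = set(nodes)
        self.assignments = defaultdict(set)  # tenant -> set of node names

    def allocate(self, tenant, count):
        """Dedicate `count` free nodes to `tenant`; nodes are never shared."""
        if count > len(self.free_nodes):
            raise RuntimeError(f"only {len(self.free_nodes)} nodes free")
        granted = {self.free_nodes.pop() for _ in range(count)}
        self.assignments[tenant] |= granted
        return granted

    def release(self, tenant):
        """Return all of a tenant's nodes to the free pool."""
        self.free_nodes |= self.assignments.pop(tenant, set())


if __name__ == "__main__":
    pod = TenantPartitioner([f"dgx-a100-{i:02d}" for i in range(20)])
    team_a = pod.allocate("research", 8)
    team_b = pod.allocate("recsys", 4)
    # No node appears in both allocations, so the tenants stay isolated.
    assert not (team_a & team_b)
    print("research:", sorted(team_a))
    print("recsys:", sorted(team_b))

In practice the isolation itself is enforced below the host, in the DPU's networking and storage paths, so even a compromised node OS cannot reach another tenant's traffic or data; the sketch above only illustrates the scheduling-side bookkeeping.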

Infrastructure Management with Base Command Manager

Every week, NVIDIA manages thousands of AI workloads executed on our internal DGX SATURNV infrastructure, which includes over 2,000 DGX systems. To date, we've run more than 1.2 million jobs on it, supporting about 2,500 developers across more than 200 teams. We've also been developing state-of-the-art infrastructure management software that ensures every NVIDIA developer is fully productive as they conduct their research and develop our autonomous systems technology, robotics, simulations and more.

The software supports all this work, simplifies and streamlines management, and lets our IT team monitor health, utilization, performance and more. We're adding this same software, called NVIDIA Base Command Manager, to DGX SuperPOD so businesses can run their environments the way we do. We'll continually improve Base Command Manager, delivering the latest innovations to customers immediately.
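To give a flavor of the kind of fleet-wide health and utilization reporting described above, here is a minimal sketch; the metric fields, thresholds and node names are assumptions for illustration and do not reflect the actual Base Command Manager interfaces:

# Minimal, hypothetical sketch of cluster health/utilization reporting.
# The metric fields and thresholds are assumptions for illustration only;
# they do not reflect the actual NVIDIA Base Command Manager interfaces.
from dataclasses import dataclass
from statistics import mean


@dataclass
class NodeMetrics:
    name: str
    gpu_util_pct: float   # average GPU utilization over the sample window
    xid_errors: int       # count of GPU driver (XID) errors observed


def summarize(nodes, idle_threshold_pct=5.0):
    """Report fleet-wide utilization and list nodes needing attention."""
    unhealthy = [n.name for n in nodes if n.xid_errors > 0]
    idle = [n.name for n in nodes if n.gpu_util_pct < idle_threshold_pct]
    return {
        "avg_gpu_util_pct": round(mean(n.gpu_util_pct for n in nodes), 1),
        "unhealthy_nodes": unhealthy,
        "idle_nodes": idle,
    }


if __name__ == "__main__":
    sample = [
        NodeMetrics("dgx-a100-01", 92.5, 0),
        NodeMetrics("dgx-a100-02", 3.1, 0),
        NodeMetrics("dgx-a100-03", 88.0, 2),
    ]
    print(summarize(sample))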

White-Glove Services

Deploying AI infrastructure is more than just installing servers and storage in data center racks. When a business decides to scale AI, it needs a hand-in-glove experience that guides it from design to deployment to operationalization, without burdening its IT team to figure out how to run it once the "keys" are handed over.

With DGX SuperPOD White Glove Services, customers enjoy a full lifecycle services experience that's backed by proven expertise from installation to operations. Customers benefit from pre-delivery performance certified on NVIDIA's own acceptance cluster, which validates that the deployed system is running at specification before it's handed off.
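As a simplified illustration of the spirit of such pre-delivery validation (the benchmark names and target values below are placeholders, not NVIDIA's actual acceptance criteria), an acceptance script might compare measured results against specification thresholds before sign-off:

# Hypothetical acceptance-test sketch: compare measured benchmark results
# against specification thresholds before a system is handed off.
# Benchmark names and target values are placeholders, not NVIDIA's criteria.

SPEC = {
    # benchmark name -> minimum acceptable result (higher is better)
    "allreduce_bus_bandwidth_GBps": 180.0,
    "resnet50_images_per_sec": 25000.0,
}


def validate(measured):
    """Return True only if every benchmark meets or beats its spec."""
    ok = True
    for name, minimum in SPEC.items():
        value = measured.get(name)
        if value is None or value < minimum:
            print(f"FAIL {name}: got {value}, need >= {minimum}")
            ok = False
        else:
            print(f"PASS {name}: {value} >= {minimum}")
    return ok


if __name__ == "__main__":
    results = {"allreduce_bus_bandwidth_GBps": 185.2,
               "resnet50_images_per_sec": 26710.0}
    raise SystemExit(0 if validate(results) else 1)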

White Glove Services also include a dedicated, multidisciplinary NVIDIA team that covers everything from installation to infrastructure management to workflow to addressing performance-impacting bottlenecks and optimizations. The services are designed to give IT leaders peace of mind and confidence as they entrust their business to DGX SuperPOD.

DGX SuperPOD at GTC21

To learn more about DGX SuperPOD and how you can consolidate AI infrastructure and centralize development across your organization, check out the sessions presented by Charlie Boyle, vice president and general manager of DGX Systems, who will cover our DGX SuperPOD news and more in two separate sessions at GTC.

Register for GTC, which runs through April 16, for free.

