The AI supercomputing service first announced in March is now generally available. The same type of hardware underpinned ChatGPT.
NVIDIA’s DGX Cloud infrastructure, which lets organizations lease space on supercomputing hardware suitable for training generative AI models, is now generally available. First announced in March, the $36,999-per-instance-per-month service competes with NVIDIA’s own $200,000 DGX server. It runs on Oracle Cloud Infrastructure and on NVIDIA hardware located in the United States and the United Kingdom.
What does NVIDIA DGX Cloud do?
DGX Cloud provides remote access to NVIDIA’s AI supercomputing hardware, including thousands of NVIDIA GPUs hosted on Oracle Cloud Infrastructure.
The DGX AI system is the hardware ChatGPT was originally trained on, so NVIDIA has the right pedigree for organizations that want to spin up their own generative AI models. When training ChatGPT, Microsoft linked together tens of thousands of NVIDIA’s A100 GPUs to get the power it needed; now, NVIDIA wants to make that process much easier by offering AI training as a service.
Pharmaceutical companies, manufacturers and financial institutions using natural language processing and AI chatbots are among DGX Cloud’s existing customers, NVIDIA said.
Organizations interested in DGX Cloud can apply to sign up.
SEE: ChatGPT is now available as an Android app (TechRepublic).
What makes the NVIDIA DGX Cloud for AI platform work?
Key to the success of the DGX Cloud for AI platform is a high-performance, low-latency fabric that allows workloads to scale across clusters of interconnected systems, enabling multiple instances to perform as one massive GPU.
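The article stops at the description, but a minimal sketch can make the idea concrete. Assuming a PyTorch environment on the rented instances (the article doesn’t specify the software stack), this is roughly how a training job spreads across many GPUs while behaving like a single accelerator, with one process per GPU synchronizing gradients over the fabric:

```python
# Minimal multi-GPU training sketch using PyTorch DistributedDataParallel (DDP).
# Assumes it is launched with torchrun, which sets RANK, LOCAL_RANK and WORLD_SIZE;
# the model, data and hyperparameters are illustrative placeholders.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # Join the process group that spans every GPU in the job (one process per GPU).
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Wrapping an ordinary model in DDP synchronizes gradients across all GPUs,
    # which is what lets the cluster train as if it were one large accelerator.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for _ in range(100):
        batch = torch.randn(32, 1024, device=f"cuda:{local_rank}")  # placeholder data
        loss = model(batch).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()  # gradient all-reduce travels over the interconnect fabric here
        optimizer.step()

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

On a single eight-GPU node, a job like this would be launched with a command such as torchrun --nproc_per_node=8 train.py; the low-latency fabric the article describes is what keeps that gradient synchronization from becoming the bottleneck as more nodes join.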
The $36,999 per instance per month subscription gives an organization a dedicated instance of eight NVIDIA 80GB Tensor Core GPUs, for a total of 640GB of GPU memory per node, all accessible through a web browser. Customers can manage and monitor training workloads through the NVIDIA Base Command Platform software dashboard.
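For a quick sanity check of those specifications from inside an instance, a short script like the following (again a sketch, assuming PyTorch is available) reports the visible GPUs and their memory, which on a DGX Cloud node should come to eight devices and roughly 640GB in total:

```python
# Report the GPUs visible to this node; on a DGX Cloud instance this should show
# eight devices with roughly 80GB of memory each, about 640GB per node.
import torch

assert torch.cuda.is_available(), "No CUDA devices visible"
total_gb = 0.0
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    mem_gb = props.total_memory / 1024**3
    total_gb += mem_gb
    print(f"GPU {i}: {props.name}, {mem_gb:.0f} GB")
print(f"{torch.cuda.device_count()} GPUs, {total_gb:.0f} GB total GPU memory")
```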
“The DGX Cloud user interface (NVIDIA Base Command Platform) lets enterprises rapidly execute and manage model development without having to worry about the underlying infrastructure,” Tony Paikeday, senior director, DGX Platforms at NVIDIA, noted in an email to TechRepublic.
From there, organizations can use NVIDIA AI Enterprise, the software portion of the platform. It provides a library of over 100 end-to-end AI frameworks and pre-trained models, making the development and deployment of production AI relatively straightforward.
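The article doesn’t detail NVIDIA AI Enterprise’s catalog or APIs, so the sketch below uses the open-source Hugging Face Transformers library purely as a stand-in to show the kind of workflow the platform targets: pulling a pre-trained model and running a fine-tuning step on an organization’s own text. The model name and sample data are placeholders, not part of the NVIDIA stack.

```python
# Illustrative only: load a pre-trained language model and take one fine-tuning step.
# Hugging Face Transformers is used as a stand-in; NVIDIA AI Enterprise ships its own
# frameworks and pre-trained models, which the article does not enumerate.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; a real project would choose a domain-appropriate model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Stand-in for a document from the organization's private corpus.
text = "Quarterly results exceeded expectations across all regions."
batch = tokenizer(text, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
outputs = model(**batch, labels=batch["input_ids"])  # causal LM loss on the sample
outputs.loss.backward()
optimizer.step()
print(f"one fine-tuning step done, loss = {outputs.loss.item():.3f}")
```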
Paikeday said customers already using DGX Cloud typically chose it because traditional computing doesn’t provide the dedicated resources their workloads need.
Customers want “computational scale and network fabric interconnect that lets them parallelize these very large workloads over many co-resident compute instances operating as a single massive supercomputer,” he said.
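To put that quote in concrete terms (again as a sketch, since the article doesn’t show DGX Cloud’s tooling), “many co-resident compute instances operating as a single massive supercomputer” means each node runs the same entry point and all GPU processes join one shared process group. With PyTorch’s standard launcher that looks roughly like this, with hypothetical node counts and addresses:

```python
# Sketch of a multi-node entry point. Each participating node would launch it with
# something like (hypothetical node count and rendezvous address):
#   torchrun --nnodes=4 --nproc_per_node=8 \
#            --rdzv_backend=c10d --rdzv_endpoint=host0.internal:29500 train.py
# so 4 x 8 = 32 GPU processes join a single training job.
import os
import torch
import torch.distributed as dist

def report_topology():
    dist.init_process_group(backend="nccl")
    torch.cuda.set_device(int(os.environ.get("LOCAL_RANK", 0)))
    rank = dist.get_rank()          # this process's index across the whole job
    world = dist.get_world_size()   # total GPU processes across all nodes
    if rank == 0:
        print(f"job spans {world} GPU processes acting as one logical machine")
    dist.destroy_process_group()

if __name__ == "__main__":
    report_topology()
```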
How access to AI computing is changing
As generative AI becomes more common, demand is shifting in how AI is used: from publicly trained powerhouses like GPT-4 to private instances in which organizations can apply their own data and develop proprietary use cases. Access to the heavy-duty computing power this requires will change accordingly.
“The availability of NVIDIA DGX Cloud provides a new pool of AI supercomputing resources, with nearly instantaneous access,” said Pat Moorhead, chief analyst at Moor Insights & Strategy, in a press release from NVIDIA.
“Generative AI has made the rapid adoption of AI a business imperative for leading companies in every industry, driving many enterprises to seek more accelerated computing infrastructure,” he said.
“We are at the iPhone moment of AI. Startups are racing to build disruptive products and business models, and incumbents are looking to respond,” said Jensen Huang, founder and CEO of NVIDIA, at the time of the original announcement in March. “DGX Cloud gives customers instant access to NVIDIA AI supercomputing in global-scale clouds.”