Nvidia’s DGX GH200 could build ‘giant’ AI more powerful than ChatGPT and GPT-4

Recently revealed by Jensen Huang at this year’s Computex, Nvidia’s DGX GH200 supercomputer could be the technological advancement that brings us generative AI models even more powerful than GPT-4.

Until now, Nvidia has offered the A100 and H100 GPUs as its primary hardware for training AI models, but the new system is pitched as going much further, “providing developers with nearly 500x more memory to build giant models.”

The Nvidia DGX GH200 is composed of 256 Nvidia Grace Hopper Superchips connected by Nvidia’s new NVLink Switch System, which lets all 256 chips work together as a single giant GPU and sharply reduces the communication overhead between hardware components. According to the official Nvidia Data Center product reveal, the GH200 design increases bandwidth by 7x while significantly cutting power consumption.

As mentioned above, the GH200’s significantly increased performance and technical headroom are aimed squarely at building ‘giant’ AI models. Perhaps in anticipation of Microsoft’s upcoming AI supercomputer, or Google’s PaLM 2, the GH200 is readying itself for battle with the world’s most powerful AI systems.

In Nvidia’s GH200 press release, Mark Lohmeyer, VP of Compute at Google Cloud, says the Grace Hopper Superchips will power Google Cloud’s “generative AI initiatives,” while Girish Bablani, Corporate VP of Azure Infrastructure at Microsoft, says the technology will be used for “training large AI models … at accelerated speeds.”

Nvidia and OpenAI have a strong relationship, and each can attribute recent successes to the other. The DGX GH200 AI supercomputer could be the hardware that catalyses the training of the GPT-5 language model, which is tipped for release in the coming years. It’s also possible that, with 144TB of shared memory and 1 exaflop of performance on tap, we could see GPT-5 finally achieve artificial general intelligence.

Either way, there’s one thing we know for sure: Nvidia’s leaps and bounds in AI hardware make it a force to be reckoned with, and we could be seeing AI models more powerful than GPT-4 in the near future.

About the Author

Amaar Chowdhury

Amaar is a gaming journalist with an interest in covering the industry's corporations. Aside from that, he has a hankering for retro games that few people care about anymore.