The data centers that make generative AI products like ChatGPT possible will soon reach size limits, according to Microsoft Azure Chief Technology Officer Mark Russinovich, necessitating a new way of linking multiple data centers for future generations of the technology.
The most advanced AI models today need to be trained inside a single building where tens (and soon hundreds) of thousands of AI processors, such as Nvidia’s H100s, can be connected so they act as one computer.
But as Microsoft and its rivals compete to build the world's most powerful AI models, several factors, including America's aging energy grid, will create a de facto cap on the size of a single data center, which could soon consume multiple gigawatts of power, equivalent to the electricity use of hundreds of thousands of homes.