Microsoft creates ChatGPT supercomputer for hundreds of millions of dollars
According to a Bloomberg report, Microsoft has spent hundreds of millions of dollars to build a supercomputer for OpenAI's ChatGPT chatbot. In accompanying blog posts, the company explains how it created the powerful Azure AI infrastructure that OpenAI uses, and how it is making those systems even more robust.
Here's What We Know
To build the supercomputer, Microsoft linked together thousands of Nvidia GPUs on its Azure platform. This allowed OpenAI to train increasingly powerful models and "unlocked the power of AI" in tools such as ChatGPT and Bing.
Scott Guthrie, Microsoft's executive vice president of cloud and AI, said that the company spent hundreds of millions of dollars on the project. While that may seem like a drop in the bucket for a company of Microsoft's size, it recently extended its multi-billion dollar investment in OpenAI, which demonstrates its willingness to keep pushing into this area.
Microsoft is already working to make Azure's AI capabilities even more powerful with the launch of new virtual machines that use Nvidia H100 and A100 Tensor Core GPUs and Nvidia Quantum-2 InfiniBand networking, a project the two companies announced last year. According to Microsoft, this should allow OpenAI and other companies that rely on Azure to train larger and more complex AI models.
Source: The Verge