Microsoft Announces Azure-Hosted OpenAI Supercomputer

At Microsoft’s Build 2020 developer conference, the company debuted a supercomputer built on Azure in collaboration with, and exclusively for, OpenAI. It is the result of an agreement under which Microsoft invested $1 billion in OpenAI to develop new technologies for Microsoft Azure and extend AI capabilities. In return, OpenAI agreed to license some of its IP to Microsoft, which would then sell it to partners, and to train and run AI models on Azure. Microsoft stated that the supercomputer is the fifth most powerful in the world.

VentureBeat reports that Microsoft’s claim is based on the TOP500, “a project that benchmarks and details the 500 top-performing supercomputers.” That list ranks Microsoft’s supercomputer “behind the China National Supercomputer Center’s Tianhe-2A and ahead of the Texas Advanced Computing Center’s Frontera,” which means it “can perform somewhere between 38.7 and 100.7 quadrillion floating point operations per second (i.e., petaflops) at peak.”

OpenAI’s co-founders and supporters, among them “Greg Brockman, chief scientist Ilya Sutskever, Elon Musk, Reid Hoffman, and former Y Combinator president Sam Altman,” believe that “immense computational horsepower is a necessary step on the road to AGI [artificial general intelligence], or AI that can learn any task a human can.”

The new supercomputer — which “contains over 285,000 processor cores, 10,000 graphics cards, and 400 gigabits per second of connectivity for each graphics card server” — is “designed to train single massive AI models” that learn by “ingesting billions of pages of text” from a variety of sources. Nvidia’s natural language processing (NLP) model, for example, “contains 8.3 billion parameters, or configurable variables internal to the model whose values are used in making predictions.”
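To put the 8.3-billion-parameter figure in perspective, here is a back-of-the-envelope sketch (not from the article) estimating the raw memory needed just to store a model's weights at common numeric precisions; the byte sizes assumed are the standard widths for 32-bit and 16-bit floating point.

```python
# Rough scale illustration (an assumption-laden sketch, not Microsoft's
# or Nvidia's published figures): storage for model weights alone,
# ignoring activations, gradients, and optimizer state.

def model_weight_gb(num_params: int, bytes_per_param: int) -> float:
    """Return weight storage in decimal gigabytes for a given precision."""
    return num_params * bytes_per_param / 1e9

PARAMS = 8_300_000_000  # parameters, per the figure quoted above

for label, nbytes in [("float32", 4), ("float16", 2)]:
    print(f"{label}: ~{model_weight_gb(PARAMS, nbytes):.1f} GB")
```

At 32-bit precision the weights alone occupy roughly 33 GB — more than any single consumer GPU of the era could hold, which is one reason training at this scale requires clusters with fast interconnects between GPU servers.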

OpenAI chief executive Altman noted that “Microsoft was able to build” what OpenAI had imagined as its “dream system.” Although it isn’t clear whether the new supercomputer can “achieve anything close to AGI,” Brockman stated that OpenAI “expects to spend the whole of Microsoft’s $1 billion investment by 2025 building a system that can run ‘a human brain-sized AI model’.”

Engadget reports that, “while we’ve seen many AI implementations focused on single tasks, like recognizing specific objects in images or translating languages, a new wave of research focuses on massive models that can perform multiple tasks at once.” That’s the aim of the Azure-hosted OpenAI supercomputer, whose applications, “as Microsoft notes … can include moderating game streams or potentially generating code after exploring GitHub.”

Engadget adds that, “realistically, these large-scale models can actually make AI a lot more useful for consumers and developers alike.”

Related:
Microsoft’s Quantum-Computing Services Attract New Customers, The Wall Street Journal, 5/19/20