Microsoft Pairs Azure Cloud Platform, Graphcore AI Chips

Microsoft will begin providing customers of its Azure cloud platform with chips made by U.K. startup Graphcore, with the goal of speeding up computations for artificial intelligence projects. Graphcore, founded in Bristol in 2016, has attracted several hundred million dollars in investment and the attention of many AI researchers. Microsoft invested in Graphcore last December, with the hope of making its cloud services more compelling. Graphcore’s chips have not previously been publicly available.

Wired reports that “unlike most chips used for AI, Graphcore’s processors were designed from scratch to support the calculations that help machines to recognize faces, understand speech, parse language, drive cars, and train robots.” Microsoft and Graphcore “published benchmarks that suggest the chip matches or exceeds the performance of the top AI chips from Nvidia and Google using algorithms written for those rival platforms.”

The two companies also said that they “were able to train a popular AI model for language processing, called BERT, at rates matching those of any other existing hardware.” Microsoft currently uses Graphcore’s chips for “internal AI research projects involving natural language processing.”

Graphcore chief executive Nigel Toon said his company began working with Microsoft Research Cambridge a year after its launch, and that its chips are especially well-suited to “tasks that involve very large AI models or temporal data.” Among the companies using the combined offering are “Citadel, which will use the chips to analyze financial data, and Qwant, a European search engine that wants the hardware to run an image-recognition algorithm known as ResNeXt.”

Graphcore also “created a software framework called Poplar, which allows existing AI programs to be ported to its hardware.” Prominent AI researchers invested in Graphcore include DeepMind co-founder Demis Hassabis, University of Cambridge professor Zoubin Ghahramani, who heads Uber’s AI lab, and UC Berkeley professor Pieter Abbeel, who specializes in AI and robotics.

Moor Insights analyst Karl Freund, who tracks the AI chip market, said: “Good performance in both training and inference is something they’ve always said they would do, but it is really, really hard.” He added that although “the chip may well be superior to existing hardware for some applications … [its] benchmarks are not eye-popping enough to lure companies and researchers away from the hardware and software they are already comfortable using.”

Wired notes that “Google’s TensorFlow AI software framework has become the de facto standard for AI programs in recent years, and it was written specifically for Nvidia and Google chips.” Nvidia is due to release a new AI chip next year. Amazon and Facebook are also working on their own chips, as are many startups, some of them “optimized for specific applications such as autonomous driving or surveillance cameras.”