Microsoft Claims Brainwave Pushes Bing’s AI 10 Times Faster

Microsoft’s Brainwave system, “specialized hardware for AI computation,” was able to “get more than 10 times faster performance for a machine learning model that powers functionality of its Bing search engine,” reports VentureBeat. Brainwave is designed to run trained neural networks as quickly as possible, with minimal latency, with the goal of providing “roughly real-time artificial intelligence predictions for applications like new Bing features.” The news was shared alongside a handful of Bing updates announced Monday.

For Microsoft, “this announcement is another step toward … making acceleration available for its cloud customers to run their own dedicated hardware-powered AI models,” reports VentureBeat.

Other updates to Bing include “support for defining less-used words when users hover their mouse pointers over them and offering multiple answers to how-to questions,” both made possible by Brainwave’s added computing power.

How does it work? “Microsoft is using field-programmable gate arrays (FPGAs) from Intel to power its AI computation. FPGAs are essentially blank canvases that allow developers to deploy a wide variety of different circuits by sending fresh software,” writes VentureBeat, also noting that this approach “provides an interesting combination of programmability and performance, since the resulting circuits are optimized for particular applications (like AI computation), but can be changed without building a new chip.”
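The “blank canvas” idea can be illustrated with a toy sketch. This is purely an analogy, not Microsoft’s or Intel’s actual toolchain: the `ToyFPGA` class and `load_bitstream` method are invented here, with plain Python functions standing in for hardware bitstreams, to show how one device can be reconfigured for different computations without fabricating a new chip.

```python
# Toy analogy for FPGA reprogrammability: one physical device,
# different "bitstreams" (here, plain functions) define its circuit.
class ToyFPGA:
    def __init__(self):
        self.circuit = None  # blank canvas until programmed

    def load_bitstream(self, circuit):
        # Reconfigure the device in software; no new chip required.
        self.circuit = circuit

    def run(self, *inputs):
        if self.circuit is None:
            raise RuntimeError("device not programmed")
        return self.circuit(*inputs)

device = ToyFPGA()

# "Deploy" a circuit specialized for one task...
device.load_bitstream(lambda a, b: a + b)
print(device.run(2, 3))  # 5

# ...then reprogram the same device for a different task.
device.load_bitstream(lambda a, b: a * b)
print(device.run(2, 3))  # 6
```

The trade-off the article describes falls out of this picture: like the lambdas above, a deployed FPGA circuit is specialized for one workload (and therefore fast), yet the device itself can be repurposed by loading new configuration rather than building new silicon.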

With Brainwave, Microsoft was able both to get results from its models faster and to build more complex AI systems.

“For example, Bing’s Turing Prototype 1 model became 10 times more complex — compared to a version built for a CPU — as a result of the added computation capacity that came from using Brainwave. And while the Brainwave version is more complicated, Microsoft can also get results back from that model over 10 times faster,” reports VentureBeat.

Microsoft’s next move is to make Brainwave accessible to those outside the company.

“The FPGAs already power parts of its Cognitive Services portfolio of intelligent APIs that allow people to embed intelligent capabilities into their applications without AI expertise. The company is also planning to make the Brainwave-powered text comprehension capabilities it discussed today available to enterprise customers through its Bing for Business service,” notes VentureBeat, adding that “further down the road, it’s likely we’ll see a Brainwave service available through Microsoft Azure so customers can deploy their own models on top of Microsoft’s FPGAs.”