AI Development Accelerates, Chips Speed Model Training

At VentureBeat’s Transform 2019 conference in San Francisco, Intel vice president and chief technology officer of AI products Amir Khosrowshahi and Intel IoT general manager Jonathan Ballon discussed the evolution of AI adoption. A Lopez Research survey found that 86 percent of companies believe AI will be strategic to their business, but only 36 percent report having made “meaningful progress.” Khosrowshahi noted that more companies than ever have access to the necessary data, tools and training.

VentureBeat reports that Khosrowshahi’s point of view is validated by a Gartner report in January showing that “AI implementation grew a whopping 270 percent in the past four years and 37 percent in the past year alone … up from 10 percent in 2015.” Some experts predict that the enterprise AI market will be valued at $6.14 billion by 2022.

“If you’re doing something that’s cloud-based, you’ve got access to vast computing resources, power, and cooling, and all of these things with which you can perform certain tasks,” said Khosrowshahi. “But what we’re finding is that almost half of all of the deployments and half of all the world’s data sits outside of the data center, and so customers are looking for the ability to access that data at the point of origination.”

This reflects growing interest in so-called edge AI, interest that has “to an extent outpaced hardware, much of which is practically incapable of accomplishing tasks better suited to a data center.” Google’s Tensor Processing Units and Intel’s upcoming Nervana Neural Network Processor (NNP-T 1000) are among the chips designed to enable speedier training of AI models.

“Processor cooling infrastructure, software frameworks, and so forth have really enabled [these AI models], and it’s kind of an enormous amount of compute,” said Khosrowshahi. “[It’s all about] scaling up processing compute and running all the stuff on specialized hardware infrastructure.” Ballon added, though, that despite these tools, “the developer experience isn’t particularly streamlined.”

“When you look at the workflow associated with actually deploying an AI model, the degree that the hardware architecture is abstracted from data scientists [and] application developer[s] [needs to] go a long way,” he said. “We’re not there yet.”
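
Ballon’s point about abstraction is, on a small scale, what today’s deep learning frameworks already attempt. The sketch below is purely illustrative (the model and data are placeholders, not anything discussed at the conference): in PyTorch, choosing between a CPU and a GPU is a single device setting, and the rest of the workflow is unchanged.

```python
import torch
import torch.nn as nn

# Pick whichever accelerator is available; nothing else in the script changes.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder model standing in for a real AI workload.
model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).to(device)

# Placeholder input batch, moved to the same device as the model.
batch = torch.randn(32, 128, device=device)

with torch.no_grad():
    predictions = model(batch)

print(predictions.shape)  # torch.Size([32, 10])
```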

Still, both executives “believe that hardware innovations have the potential to further democratize powerful AI.” Khosrowshahi is particularly enthused about transistors that rely on multiferroics and topological materials, so-called MESO (magnetoelectric spin-orbit) devices, which “promise to be 10 to 100 times more energy-efficient than current microprocessors, which are largely based on CMOS.”

Optical chips, which require less energy, are also interesting, in part because they are “less susceptible to changes in ambient temperature, electromagnetic fields, and other noise.”

“There are novel materials that we can exploit for the future of … data center computing, and I think this is actually the future,” said Khosrowshahi.
