July 27, 2017
Qualcomm, whose chips are in 40 percent of all smartphones, has revealed its strategy for streamlining AI tasks by developing a software development kit (SDK) dubbed the Neural Processing Engine. The SDK will help developers revamp their apps for AI tasks on Qualcomm’s Snapdragon 600 and Snapdragon 800 processors. The company first announced the SDK a year ago. Qualcomm’s tactics differ from those of ARM and Microsoft, which are designing new chips, and from Facebook and Google, which hope to reduce the computing power needed to run AI apps.
The Verge reports that, since last September, Qualcomm has been “working with a few partners on developing the SDK, and today it’s opening it up to be used by all.”
“Any developer big or small that has already invested in deep learning — meaning they have access to data and trained AI models — they are the target audience,” said Qualcomm head of AI and machine learning Gary Brotman. “It’s simple to use. We abstract everything under the hood so you don’t have to get your hands dirty.”
Qualcomm reports that the SDK is not simply intended to “optimize AI on mobile devices, but also in cars, drones, VR headsets, and smart home products.”
The SDK works with popular “frameworks for developing AI systems, including Caffe, Caffe2, and Google’s TensorFlow,” and can be used in a variety of ways, depending on the developer’s focus: “to optimize for battery life or processing speed, for example, they can draw on compute resources from different parts of the chip — e.g., the CPU, GPU, or DSP.”
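The pattern described above — letting a developer state a preference among the chip’s compute targets, with the runtime falling back to whatever the device supports — can be sketched as follows. This is a hypothetical illustration in Python, not the actual Neural Processing Engine API; the function and names here are invented for clarity.

```python
# Hypothetical sketch (NOT the real SDK API): illustrates the selection
# pattern the SDK exposes -- the developer ranks compute targets (e.g. DSP
# for battery efficiency, then GPU for speed, then CPU as the universal
# fallback) and the first target the device actually supports is used.

PREFERRED_RUNTIMES = ["DSP", "GPU", "CPU"]  # CPU is assumed always present


def select_runtime(available, preference=PREFERRED_RUNTIMES):
    """Return the first preferred compute target the device supports."""
    for target in preference:
        if target in available:
            return target
    raise RuntimeError("no supported compute target on this device")


# e.g. a device whose DSP cannot run the model falls back to the GPU:
print(select_runtime({"CPU", "GPU"}))  # -> GPU
```

A developer optimizing for battery life might rank the DSP first, as above, while one optimizing for throughput might put the GPU at the head of the list.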
Moving forward, Qualcomm will likely have to do more, as, Brotman noted, “a tidal wave of AI workloads … [create] more demand for compute.”
“When we’re baking something into silicon, that’s a very deliberate bet for us, and it doesn’t come easy,” he added. “Longer term, though, is there going to be a need for dedicated neural computing? I think that’s going to be the case, and the question is just, when do we place that bet.”