Microsoft Reaches Out to Developers at its Build Conference

Microsoft made a series of announcements during this week’s Build developer conference in Seattle, Washington. Among the highlights: a pair of mixed reality enterprise apps for the HoloLens; a partnership with DJI to bring Microsoft’s AI and machine learning tech to commercial drones; a preview launch of Project Brainwave, a deep learning acceleration platform; prototype hardware designed for the meeting room of the future; and Project Kinect for Azure, which packages Microsoft’s next-generation depth camera with additional sensors for developers to experiment with.

During the Day 1 keynote, Microsoft introduced two new enterprise apps — Remote Assist and Layout — which will be available May 22. According to TechCrunch, Remote Assist “is the company’s realization of a use case that has long been a core promise of AR in the workplace: hands-free telepresence that lets the other person see what you’re seeing. This sort of screen sharing of the real world can allow a worker in a manufacturing facility to ping a specialist and get quick annotations and advice about tackling a particular problem.”

(Image: Microsoft Remote Assist and Layout)

Layout “basically allows HoloLens users to drop 3D objects into rooms based on the geometry of the space. This is essentially similar to what a lot of the ARKit and ARCore apps for your phone have done with furniture retailers for sizing stuff in the context of your own home. With Layout’s enterprise skew, there’s more of a focus on designing spaces with CAD models.”

For more details on both apps, including videos, visit the Windows Blog.

Microsoft announced a partnership with Chinese drone manufacturer DJI to integrate Microsoft’s machine learning tech with commercial drones. DJI plans to offer an SDK for Windows to encourage “developers to build native apps to control DJI drones,” notes TechCrunch.
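The SDK itself is a C#/UWP library and had only just been announced, so the snippet below is purely a hypothetical, Python-flavored sketch of the kind of waypoint-control loop a native drone app might run; DroneClient, Waypoint and their methods are placeholder names for illustration, not the real DJI Windows SDK API.

```python
# Hypothetical sketch only: the announced DJI SDK for Windows is a C#/UWP
# library and its real API differs. The names below are placeholders that
# illustrate the control loop a native drone app might implement.
from dataclasses import dataclass


@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float


class DroneClient:
    """Placeholder standing in for an SDK-backed drone connection."""

    def connect(self) -> None: ...
    def take_off(self) -> None: ...
    def fly_to(self, wp: Waypoint) -> None: ...
    def capture_frame(self) -> bytes: return b""
    def land(self) -> None: ...


def run_survey(drone: DroneClient, route: list) -> list:
    """Fly a simple waypoint route and collect camera frames for later analysis."""
    drone.connect()
    drone.take_off()
    frames = []
    for wp in route:
        drone.fly_to(wp)
        frames.append(drone.capture_frame())
    drone.land()
    return frames
```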

(Image: DJI Mavic Pro)

DJI also revealed “that Azure is now its preferred cloud computing partner and that it will use the platform to analyze video data, for example. The two companies also plan to offer new commercial drone solutions using Azure IoT Edge and related AI technologies for verticals like agriculture, construction and public safety.”
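As a rough illustration of what an Azure IoT Edge analytics module looks like in practice, here is a minimal Python sketch using the azure-iot-device package to push detection results upstream; the detect_objects helper and the “detections” output route are assumptions for illustration, not details of the DJI integration.

```python
# Minimal sketch of an Azure IoT Edge module that forwards video-analysis
# results to IoT Hub. Assumes the azure-iot-device package; detect_objects()
# and the "detections" output route are illustrative placeholders.
import json
import time

from azure.iot.device import IoTHubModuleClient, Message


def detect_objects(frame_path: str) -> list:
    """Placeholder for an on-device vision model run against the latest frame."""
    return [{"label": "person", "confidence": 0.92}]


def main() -> None:
    # Reads connection details from the IoT Edge runtime environment.
    client = IoTHubModuleClient.create_from_edge_environment()
    client.connect()
    try:
        while True:
            results = detect_objects("latest_frame.jpg")
            msg = Message(json.dumps({"detections": results, "ts": time.time()}))
            # The deployment manifest routes this output to IoT Hub or to
            # other modules running on the device.
            client.send_message_to_output(msg, "detections")
            time.sleep(1.0)
    finally:
        client.shutdown()


if __name__ == "__main__":
    main()
```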

For more info on the Microsoft-DJI partnership, check out the press release.

Project Brainwave is Microsoft’s new “platform for running deep learning models in its Azure cloud and on the edge in real time,” explains TechCrunch.

While competitors are turning to custom chips, Microsoft is relying on field-programmable gate arrays (FPGAs, pictured below). “Microsoft argues that FPGAs give it more flexibility than designing custom chips and that the performance it achieves on standard Intel Stratix FPGAs is at least comparable to that of custom chips.”

(Image: Microsoft FPGA for Project Brainwave)

Project Brainwave features three layers: “a high-performance distributed architecture; a hardware deep neural networking engine that has been synthesized onto the FPGAs; and a compiler and runtime for deploying the pre-trained models.”
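That compile-and-deploy layer ultimately puts a model behind a low-latency endpoint. Microsoft’s actual client interface isn’t described here, so the snippet below is only a generic sketch of querying such a deployed model over HTTP; the URL, key and payload format are placeholders, not the real Project Brainwave interface.

```python
# Generic sketch of querying a deployed, hardware-accelerated model endpoint
# in real time. The URL, key, and payload shape are placeholders, not the
# actual Project Brainwave interface.
import requests


def score_image(image_path: str,
                scoring_url: str = "https://example-endpoint/score",
                api_key: str = "<api-key>") -> dict:
    """Send one image to a deployed model endpoint and return its predictions."""
    with open(image_path, "rb") as f:
        payload = f.read()
    resp = requests.post(
        scoring_url,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/octet-stream",
        },
        timeout=5,  # low single-request latency is the point of FPGA serving
    )
    resp.raise_for_status()
    return resp.json()
```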

The Microsoft Blog has a great overview: Real-Time AI: Microsoft Announces Preview of Project Brainwave.

At Build, Microsoft also demonstrated what it imagines as the meeting room of the future, which “starts with a 360-degree camera and microphone array that can detect anyone in a meeting room, greet them, and even transcribe exactly what they say in a meeting regardless of language,” details The Verge. “Microsoft has been working on translation features for Skype for years, and the meeting room of the future includes this technology.”

Microsoft is using AI tools to “act on what meeting participants say. If someone says ‘I’ll follow up with you next week,’ then they’ll get a notification in Microsoft Teams, Microsoft’s Slack competitor, to actually act on that promise.” Microsoft’s upcoming Surface Hub displays could be a good fit for the company’s vision of a future meeting room.
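Microsoft hasn’t said how that follow-up detection works under the hood; as a toy illustration of the idea, the sketch below scans a meeting transcript for simple commitment phrases and turns them into reminder payloads, using a hand-written regex rather than any Microsoft service.

```python
# Toy illustration of turning spoken commitments into follow-up reminders.
# Not Microsoft's implementation: just a regex over a transcript.
import re
from datetime import date, timedelta

COMMITMENT = re.compile(
    r"\bI(?:'ll| will)\s+(follow up|send|share|review)\b.*?\b(today|tomorrow|next week)\b",
    re.IGNORECASE,
)

DUE_IN_DAYS = {"today": 0, "tomorrow": 1, "next week": 7}


def extract_reminders(transcript: list) -> list:
    """transcript is a list of (speaker, utterance) pairs."""
    reminders = []
    for speaker, utterance in transcript:
        match = COMMITMENT.search(utterance)
        if match:
            action, when = match.group(1), match.group(2).lower()
            reminders.append({
                "assignee": speaker,
                "action": action,
                "due": str(date.today() + timedelta(days=DUE_IN_DAYS[when])),
                "source": utterance,
            })
    return reminders


print(extract_reminders([("Ana", "I'll follow up with you next week.")]))
```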

With Project Kinect for Azure, Microsoft is offering developers a package of sensors with its next-gen depth camera, a move that it hopes will lead to new products involving ambient intelligence.

(Image: Microsoft Project Kinect for Azure)

“Project Kinect for Azure includes its Time of Flight sensor, and a slew of other sensors, in a small, power-efficient form factor,” reports VentureBeat. “The first part of the name refers to fully articulated hand tracking and high fidelity spatial mapping, while the second part references Azure AI services such as Machine Learning, Cognitive Services, and IoT Edge.”
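To make the Time of Flight and spatial-mapping terminology a little more concrete, the NumPy sketch below deprojects a depth frame into a 3D point cloud using a standard pinhole-camera model; the resolution and intrinsics are made-up example values, not Project Kinect for Azure specifications.

```python
# Sketch: converting a Time-of-Flight depth frame into a 3D point cloud with
# a standard pinhole-camera model. The resolution and intrinsics are made-up
# example values, not Project Kinect for Azure specifications.
import numpy as np


def depth_to_point_cloud(depth_m: np.ndarray,
                         fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """Return an (H*W, 3) array of XYZ points in meters."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)


# Example with a synthetic 480x640 frame and illustrative intrinsics.
depth = np.full((480, 640), 1.5, dtype=np.float32)  # everything 1.5 m away
points = depth_to_point_cloud(depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
print(points.shape)  # (307200, 3)
```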
