Nvidia Integrates Computer Graphics and Artificial Intelligence

At this week’s SIGGRAPH 2017 conference in Los Angeles, Nvidia showed off a variety of technologies connecting graphics and artificial intelligence, delivering 10 research papers relevant to the company’s developers, including an AI-powered method for creating realistic facial animations. The company also showed off its Isaac robots, which vet AI algorithms inside its Project Holodeck virtual environment. There, the robots can learn in a shared virtual space, minimizing the risk of causing problems in the real world.

VentureBeat reports that, among Nvidia’s 550,000 developers, “about half are in games, while the rest are in high-performance computing, robotics and AI.”
“If you look at our history in graphics, we took that into high-performance computing and took that into a dominant position in deep learning and AI,” said Nvidia VP of developer marketing Greg Estes. “Now we are closing that loop and bringing AI into graphics. Our strategy is to lead with research and break new ground, then we take that lead in research … into software development kits for developers.”

To further support VR content creation, interactive rendering and video editing, Nvidia is improving workflows by enabling “external Titan X or Quadro graphics cards through an external graphics processing unit (eGPU) chassis,” and releasing a new performance driver for Titan X hardware that improves performance in Autodesk Maya and Adobe Premiere Pro.

The company will introduce its OptiX 5.0 SDK on the Nvidia DGX AI workstation, providing “the rendering capability of 150 standard central processing unit (CPU) servers” at a fraction of the cost of a CPU-based system, along with new AI and animation features.

In research, Nvidia showed how it is using AI technology to animate realistic human faces and simulate how light interacts with surfaces. It is also releasing its VRWorks 360 Video SDK, which offers live 360-degree stereo stitching.

Nvidia vice president Zvi Greenstein reports that the live-production and live-event industries will benefit, as will “production studios, camera makers and app developers” that can “integrate 360 degree, stereo stitching SDK into their existing workflow for live and post production.” Z CAM’s professional 360-degree VR camera will be the first to fully integrate the VRWorks SDK.
