Samsung First to Design Commercial Semiconductor with AI

Samsung is using Synopsys’ DSO.ai tool to design some of its next-generation Exynos mobile processors for 5G and AI, which will be used in its own smartphones and in other devices. Synopsys chair and co-chief executive Aart de Geus said this is the first example of a “real commercial processor design with AI.” Google, IBM and Nvidia are among the other companies that have discussed designing chips with AI. Synopsys, which works with dozens of companies, also has years of expertise in advanced chip design with which to train an AI algorithm.

Nvidia Cuts Video-Conferencing Bandwidth by Factor of Ten

Last month Nvidia launched Maxine, a software development kit containing technology the company claims will cut the bandwidth requirements of video-conferencing software by a factor of ten. A neural network on the sender’s side creates a compressed representation of a person’s face which, once sent across the network, is reconstructed by a second neural network on the receiver’s side. The software can also make helpful corrections to the image, such as rotating a face so it looks straight ahead or replacing it with a digital avatar. Nvidia is now waiting for software developers to build the technology into products.
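To make the compress-then-reconstruct idea concrete, here is a minimal sketch of a sender-side encoder and receiver-side decoder. It is an illustrative autoencoder written in PyTorch, not Nvidia’s actual Maxine pipeline; the toy frame size, layer widths and code size are assumptions made for the example.

```python
# Minimal sketch of the encode-send-decode idea behind neural video compression.
# Illustrative only: this is NOT Nvidia's Maxine implementation, and the frame
# and layer sizes below are made-up values for the example.
import torch
import torch.nn as nn

FRAME_PIXELS = 64 * 64   # toy grayscale frame, flattened (assumption)
CODE_SIZE = 128          # the compact representation sent over the network (assumption)

# Sender-side network: compresses a frame into a small latent code.
encoder = nn.Sequential(
    nn.Linear(FRAME_PIXELS, 1024),
    nn.ReLU(),
    nn.Linear(1024, CODE_SIZE),
)

# Receiver-side network: reconstructs an approximation of the frame from the code.
decoder = nn.Sequential(
    nn.Linear(CODE_SIZE, 1024),
    nn.ReLU(),
    nn.Linear(1024, FRAME_PIXELS),
    nn.Sigmoid(),
)

frame = torch.rand(1, FRAME_PIXELS)   # stand-in for a captured video frame
code = encoder(frame)                 # only this small tensor crosses the network
reconstruction = decoder(code)        # receiver rebuilds the frame locally

print(f"values per frame: {FRAME_PIXELS} original vs. {CODE_SIZE} transmitted")
```

In a real system the two networks would be trained together so the reconstruction matches the original frame; the bandwidth saving comes from transmitting only the small code instead of the full image.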

Nvidia Debuts GeForce RTX Chip Series With Lower Latency

Nvidia debuted its 28-billion transistor Ampere-based 30 Series graphics chips for PC gamers, timed to arrive ahead of Microsoft and Sony’s next-generation consoles due by the holidays. The 30 Series GeForce RTX cards (available September 17) comprise the RTX 3070 ($500), 3080 ($700) and 3090 ($1,500), with second-generation RTX (real-time ray tracing) graphics. According to chief executive Jensen Huang, there are “hundreds of RTX games” in development, joining “Minecraft,” “Control” and “Wolfenstein: Youngblood,” which already feature RTX.

Nvidia Debuts Next-Gen Gaming with Ray-Tracing, AI at CES

At Nvidia’s CES 2019 press conference, founder/chief executive Jensen Huang was enthusiastic about gaming. “Usually I also focus on AI and self-driving cars,” he said. “We have a lot of announcements about that. But today it’s all about gaming.” One big announcement was the company’s new GeForce RTX 2060, which is based on the Turing architecture and supports both real-time ray tracing and artificial intelligence. The RTX 2060, priced at $349, will be available January 15 “from every major OEM, system builder and graphics card partner.”

Nvidia Ray-Tracing Technology a Quantum Leap in Rendering

At SIGGRAPH 2018, Nvidia debuted its new Turing architecture, which brings real-time ray tracing to professional and consumer graphics cards. Considered the Holy Grail by many industry pros, ray tracing is a rendering technique that models rays of light as they intersect with objects in a scene, making it ideal for photorealistic lighting and VFX. Until now, doing this in real time has been impractical because it requires an immense amount of computing power; Nvidia’s professional Turing card, which makes it possible, costs $10,000.
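For readers unfamiliar with the technique, below is a toy sketch of the core ray-tracing step: testing where a ray of light hits an object (here, a sphere). The function name and the scene values are invented for the example; a production renderer such as Nvidia’s RTX pipeline performs this kind of test for millions of rays per frame in dedicated hardware.

```python
# Toy illustration of a single ray-object intersection test, the basic
# operation behind ray tracing. Scene values are made up for the example.
import math

def ray_sphere_intersection(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None if it misses."""
    # Vector from the ray origin to the sphere center.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return None                      # the ray misses the sphere entirely
    t = (-b - math.sqrt(discriminant)) / (2 * a)
    return t if t > 0 else None          # hit must be in front of the camera

# A ray fired straight down the z-axis toward a sphere 5 units away.
hit = ray_sphere_intersection(origin=(0, 0, 0), direction=(0, 0, 1),
                              center=(0, 0, 5), radius=1.0)
print(f"ray hits the sphere {hit} units from the camera")  # -> 4.0 units
```

Tracing where each ray lands, and then spawning further rays for reflections and shadows, is what produces the physically accurate lighting described above, and also why the technique is so computationally expensive.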

Nvidia Quadro RTX Chips Offer AI and Real-Time Ray Tracing

Nvidia unveiled its new Turing architecture during a keynote at SIGGRAPH 2018, along with three new Quadro RTX workstation graphics cards aimed at professionals. Nvidia calls Turing its “greatest leap since the invention of the CUDA GPU in 2006.” The RTX cards are the first to use the company’s real-time ray tracing rendering, which produces more realistic imagery. Also at SIGGRAPH, Porsche showed off car designs created with Epic Games’ Unreal Engine and Nvidia’s RTX cards.