Facebook Using Self-Supervised Models to Build AI Systems

Facebook debuted Learning from Videos, a project designed to learn from the audio, images and text of publicly available Facebook videos in order to improve its core AI systems. By culling data from hundreds of languages and countries, said Facebook, the project will also help to enable “entirely new experiences.” Learning from Videos, which began in 2020, has also helped to improve recommendations in Instagram Reels. Facebook, Google and others are focusing on self-supervised techniques, rather than labeled datasets, to improve AI.
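
For readers unfamiliar with the term, self-supervised learning draws its training signal from the data itself rather than from human-applied labels. Below is a minimal, hypothetical NumPy sketch of the general idea (not Facebook’s actual system; all names and dimensions are invented for illustration): a model learns to predict masked portions of its input, so no annotations are required.

```python
# Minimal sketch of the self-supervised idea: predict hidden (masked) parts of the
# input from the visible parts. The "labels" are just the data itself -- no human
# annotation is involved. Illustrative only, not Facebook's actual method.
import numpy as np

rng = np.random.default_rng(0)

# Toy "unlabeled" dataset: 1,000 samples of correlated 8-dimensional signals.
basis = rng.normal(size=(3, 8))
data = rng.normal(size=(1000, 3)) @ basis          # low-rank structure to learn

visible_idx = np.arange(0, 4)                      # features the model sees
masked_idx = np.arange(4, 8)                       # features it must reconstruct

W = np.zeros((4, 4))                               # linear predictor (weights)
lr = 0.01

for _ in range(500):                               # plain gradient descent
    x_vis = data[:, visible_idx]
    x_masked = data[:, masked_idx]                 # target comes from the data itself
    pred = x_vis @ W
    grad = x_vis.T @ (pred - x_masked) / len(data)
    W -= lr * grad

mse = np.mean((data[:, visible_idx] @ W - data[:, masked_idx]) ** 2)
print(f"reconstruction error on masked features: {mse:.4f}")
```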

Nvidia Cuts Video-Conferencing Bandwidth by Factor of Ten

Last month Nvidia launched Maxine, a software development kit containing technology the company claims will cut the bandwidth requirements of video-conferencing software by a factor of ten. A neural network creates a compressed version of a person’s face which, when sent across the network, is decompressed by a second neural network. The software can also make helpful corrections to the image, such as rotating a face so it appears to look directly at the camera, or replacing it with a digital avatar. Nvidia is now waiting for software developers to productize the technology.
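
The bandwidth savings come from what actually crosses the network. The sketch below is a hypothetical, autoencoder-shaped illustration in Python (not Nvidia’s Maxine code; the function names and dimensions are invented): the sender transmits a small latent vector rather than pixels, and the receiver’s decoder expands it back into a frame. In practice the networks are learned and the comparison is against a conventional codec, so real-world gains are more modest than this raw-pixel ratio suggests.

```python
# Hypothetical illustration of the bandwidth idea behind neural video compression.
# This is NOT Nvidia's Maxine code, just an autoencoder-shaped sketch: the sender
# ships a small latent vector instead of pixels, and the receiver reconstructs.
import numpy as np

rng = np.random.default_rng(42)

FRAME_SHAPE = (64, 64, 3)   # a small crop stands in for a full video frame
LATENT_DIM = 32             # compact per-frame representation sent over the wire

# Stand-ins for trained networks; in a real system these would be learned models.
encoder_weights = rng.normal(size=(np.prod(FRAME_SHAPE), LATENT_DIM)) * 1e-3
decoder_weights = rng.normal(size=(LATENT_DIM, np.prod(FRAME_SHAPE))) * 1e-3

def encode(frame: np.ndarray) -> np.ndarray:
    """Sender side: compress a frame into a small latent vector."""
    return frame.reshape(-1) @ encoder_weights

def decode(latent: np.ndarray) -> np.ndarray:
    """Receiver side: reconstruct a full frame from the latent vector."""
    return (latent @ decoder_weights).reshape(FRAME_SHAPE)

frame = rng.random(FRAME_SHAPE).astype(np.float32)
latent = encode(frame).astype(np.float32)
reconstructed = decode(latent)              # what the far end displays

raw_bytes = frame.nbytes                    # cost of sending raw pixels
sent_bytes = latent.nbytes                  # cost of sending only the latent vector
print(f"raw frame: {raw_bytes:,} bytes; latent sent: {sent_bytes:,} bytes; "
      f"ratio: {raw_bytes / sent_bytes:.0f}x")
```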

Nvidia Debuts New Version of A100 GPU for Supercomputers

At the beginning of its SC20 supercomputing conference, Nvidia unveiled an 80GB version of the A100 GPU (graphics processing unit), based on its Ampere graphics architecture and aimed at AI and graphics for supercomputing. The chip is intended to enable faster real-time data analysis for business and government applications. The new version doubles the memory of its predecessor, which debuted six months ago. Nvidia executive Paresh Kharya noted that 90 percent of the world’s data was created in the last two years.

AMD Acquires Xilinx: Opens Door for 5G, Data Center Chips

Advanced Micro Devices (AMD) agreed to pay $35 billion in stock to acquire Xilinx, a deal that will enable it to diversify into chips for 5G wireless communications and automotive electronics. The company, which is posting some of the strongest sales in its 51-year history, has traditionally been Intel’s rival in computer chips. With Xilinx, AMD could also provide components for data centers and compete with Nvidia in that space. The all-stock deal is still topped by Nvidia’s plan to purchase UK chipmaker Arm for $40 billion.

With Arm Purchase, Nvidia May Dominate AI Edge Computing

Moore’s Law, Intel co-founder Gordon Moore’s prediction that the number of transistors on a chip doubles about every two years, has been the foundation of the semiconductor industry. But as the industry nears the physical limits of electronic circuitry, Moore’s Law is giving way to another: the silicon chips powering AI more than double in performance every two years, thanks to improvements in both hardware and software. Just as Moore’s Law underpinned decades of improvements in computers, this new law will power the Internet of Things. With its $40 billion acquisition of Arm Holdings, Nvidia could be positioned for a new type of evolution.
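
As a back-of-the-envelope illustration of what a doubling law compounds to, here is a short, generic Python sketch (the function name, starting value and horizons are invented for illustration, not figures from the article):

```python
# Back-of-the-envelope arithmetic for a "doubles every two years" law.
def projected_capability(start: float, years: float, doubling_period: float = 2.0) -> float:
    """Capability after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

# Something that doubles every two years grows roughly 32x in a decade.
for years in (2, 4, 10):
    print(f"after {years:>2} years: {projected_capability(1.0, years):.0f}x the starting point")
```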

Nvidia Acquisition of SoftBank’s Arm Brings Rewards, Risks

Nvidia agreed to pay $40 billion ($21.5 billion in stock and $12 billion in cash) for SoftBank’s Arm division, a chip designer based in the United Kingdom. Nvidia will pay $2 billion on signing, and SoftBank will also receive $5 billion in cash or stock should Arm’s performance meet specific standards. Arm employees will receive $1.5 billion in Nvidia stock. This will be the biggest semiconductor industry deal since SoftBank paid more than $31.4 billion to purchase Arm in 2016. The deal will also increase competition between Nvidia and Intel.

Microsoft Confirms the Debut of $299 Xbox Series S Console

Following a series of leaks, Microsoft has now confirmed the existence of another new game console, the Xbox Series S, which it dubs the smallest Xbox ever (60 percent smaller than the Series X). The company said the “slim, white console” will debut “soon,” provide “next-gen performance” and be priced at $299. Reports suggest that the Series S and Series X will both be available for purchase on November 10, and hint that many Series S features are similar to those of the Series X, but without a disc drive and with a less powerful CPU and GPU. Microsoft stated that the Xbox Series X “will be four times more powerful than its predecessor, the Xbox One X.”

Nvidia Purchase of Arm Signals Inflection Point in Computing

If Nvidia acquires Arm Ltd. in the next few weeks, as many experts predict it will, the company may be in a position to dominate the next computing ecosystem. Jefferies semiconductor analyst Mark Lipacis notes that the computer industry goes through a “strategic inflection point” every 15 years, with research showing that the dominant players in each era account for 80 percent of the profits. Each new ecosystem is the result of a “multi-pronged” strategy by the companies that come out on top.

Nvidia Debuts GeForce RTX Chip Series With Lower Latency

Nvidia debuted its 28-billion-transistor, Ampere-based 30 Series graphics chips for PC gamers, arriving just as Microsoft and Sony prepare to launch their next-generation consoles by the holidays. The 30 Series GeForce RTX chips (available September 17) consist of the RTX 3070 ($500), 3080 ($700) and 3090 ($1,500), featuring second-generation RTX (real-time ray tracing) graphics. According to chief executive Jensen Huang, there are “hundreds of RTX games” in development, joining “Minecraft,” “Control” and “Wolfenstein: Youngblood,” which already feature RTX.

Sony Reveals Details on PlayStation 5 Consoles, New Games

Sony debuted two versions of its PlayStation 5 game console as well as new games, in advance of the holiday season. The PS5 Digital Edition, the second version, omits the Blu-ray Disc drive, and its download-only feature could eventually impact Amazon, GameStop, Walmart and other retailers. The Digital Edition also sports a sleeker design and, potentially, a lower price. New games include the latest “Spider-Man” and “Gran Turismo” titles and an enhanced version of Take-Two Interactive Software’s “Grand Theft Auto V.”

Nvidia A100: Powerful New Chipset Created for Advancing AI

Nvidia unveiled its A100 artificial intelligence chip, which houses 54 billion transistors and can deliver 5 petaflops of performance, about 20 times more than the company’s previous Volta chip. Chief executive Jensen Huang, who revealed it during his Nvidia GTC keynote address, dubbed it “the ultimate instrument for advancing AI.” The original March 24 introduction was postponed due to the COVID-19 pandemic. Nvidia also unveiled the DGX A100 system, the third generation of Nvidia’s AI DGX platform, which uses the new chips. The DGX A100 is now shipping.

Intel to Unveil Experimental Neuromorphic Computing System

Intel will debut Pohoiki Springs, an experimental research system for neuromorphic computing, which simulates the way human brains work in order to compute more quickly and with less energy. It will first be made available, via the cloud, to the Intel Neuromorphic Research Community, which includes about a dozen companies (such as Accenture and Airbus), academic researchers and government labs. Intel and Cornell University jointly published a paper on the Loihi chip’s ability to learn and recognize 10 hazardous materials by smell.

HPA Tech Retreat: The Latest Workflows for Virtual Production

The HPA Tech Retreat kicked off with an ambitious daylong demo that highlighted innovations in content creation, management and distribution technology and workflows. Supersession chair Joachim Zell, VP of technology at EFILM, walked the audience through numerous elements of an HDR production: filming, editing and finishing two scenes that provided the final chapters of a short film. The process, much of which involved workflows in the cloud, featured multiple cameras, on-set management and collaboration platforms, editorial, dailies and digital intermediate color grading systems, as well as online mastering and distribution platforms.

AMD vs. Intel: The Computing Wars Ramp Up in Las Vegas

CES is not a computing show, but this year’s edition felt silicon-centric thanks to major announcements from Intel and AMD. Intel revealed more details about its next CPU, Tiger Lake, which boasts improved graphics and AI performance. The company also offered a glimpse of its first discrete GPU. But the show arguably belonged to AMD, which continued its year-long renaissance with a keynote unveiling mobile CPUs, a new midrange GPU, and the world’s fastest workstation processor.

Apple Inks Deal With Imagination For Ray-Tracing Chip Tech

Apple inked a multi-year licensing agreement with U.K. company Imagination Technologies, giving it “wider range” access to the company’s IP, including a new ray-tracing technology. Observers believe the move signals that Apple plans to add ray tracing to its chips “in the foreseeable future.” Ray tracing is a graphics technique that renders imagery with real-world lighting, reflections and shadows, producing far more photorealistic results. Nvidia first brought ray tracing to PC GPUs in August 2018.
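
To give a sense of what ray tracing actually computes, here is a minimal, self-contained Python sketch (purely illustrative, with invented helper names, and unrelated to Apple’s or Imagination’s implementations): a ray is cast from the camera through each pixel, tested for intersection against a sphere, and shaded by how directly the surface faces a light.

```python
# Minimal ray tracer sketch: one sphere, one light, Lambertian shading.
# Illustrative of the technique only; unrelated to Apple's or Imagination's designs.
import numpy as np

WIDTH, HEIGHT = 40, 20                       # tiny "image" rendered as ASCII
sphere_center = np.array([0.0, 0.0, 3.0])    # sphere sitting in front of the camera
sphere_radius = 1.0
light_dir = np.array([1.0, 1.0, -1.0])
light_dir /= np.linalg.norm(light_dir)       # unit direction toward the light

def hit_sphere(origin, direction):
    """Return distance along the ray to the sphere, or None if it misses."""
    oc = origin - sphere_center
    b = 2.0 * np.dot(oc, direction)
    c = np.dot(oc, oc) - sphere_radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    t = (-b - np.sqrt(disc)) / 2.0
    return t if t > 0 else None

rows = []
for j in range(HEIGHT):
    row = ""
    for i in range(WIDTH):
        # Map the pixel to a point on the image plane and cast a ray through it.
        x = (i - WIDTH / 2) / (WIDTH / 2)
        y = (HEIGHT / 2 - j) / (HEIGHT / 2)
        direction = np.array([x, y, 1.0])
        direction /= np.linalg.norm(direction)
        t = hit_sphere(np.zeros(3), direction)
        if t is None:
            row += " "                       # ray missed: background
        else:
            # Shade by how directly the surface normal faces the light (Lambert).
            normal = (t * direction - sphere_center) / sphere_radius
            brightness = max(np.dot(normal, light_dir), 0.0)
            row += ".:-=+*#%@"[int(brightness * 8.999)]
    rows.append(row)

print("\n".join(rows))
```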
