Meta In-House Chip Designs Include Processing for AI, Video

Meta Platforms has shared additional details on its next generation of AI infrastructure. The company has designed two custom silicon chips: one for training and running AI models, which will eventually power metaverse functions such as virtual reality and augmented reality, and another tailored to optimize video processing. Meta publicly discussed its internal chip development last week ahead of a Thursday virtual event on AI infrastructure. The company also showcased an AI-optimized data center design and discussed phase two of the deployment of its 16,000-GPU supercomputer for AI research.

Facebook to Develop Live Video Filtering Chips for Faster AI

Facebook has used Intel CPUs for many of its artificial intelligence services, but the company is changing course to meet the pressing need to better filter live video content. At the Viva Technology industry conference in Paris, Facebook chief AI scientist Yann LeCun said the company plans to make its own chips for filtering video content, because more conventional methods consume too much energy and compute power. Last month, Bloomberg reported that the company is building its own semiconductors.

Google Offers Its AI Chips to All Comers via Cloud Computing

Google, which created tensor processing units (TPUs) for its artificial intelligence systems some years ago, will now make those chips available to other companies via its cloud computing service. Google is currently focusing on computer vision technology, which allows computers to recognize objects; Lyft used the chips for its driverless car project. Amazon is also building its own AI chips for use with its Alexa-powered Echo devices, aiming to shave seconds off their response time and potentially increase sales.

Microsoft Speeds Up AI with New Programmable FPGA Chips

In 2012, Microsoft chief executive Steve Ballmer and computer chip researcher Doug Burger believed they had found the future of computing: chips that can be programmed for specific tasks, known as field programmable gate arrays (FPGAs). Project Catapult, as the effort was called, was intended to shift the underlying technology of all Microsoft servers in that direction. FPGAs now form the basis of Bing. Soon, the specialized chips will handle artificial intelligence tasks at tremendous speed: 23 milliseconds versus four seconds.

Google Develops its Own Chip to Speed Up Machine Learning

Google has just built its own chip as part of its effort to speed up artificial intelligence development. The company revealed that this is just the first of many chips it plans to design and build. At the same time, an increasing number of businesses are migrating to the cloud, lessening their need to buy their own servers and the chips that power them. That has led some to believe that Google, and other Internet titans that follow its lead, will reshape the future of the chip industry, particularly for such stalwarts as Intel and Nvidia.