Meta Tests New AI Accelerator Chip Designed with Broadcom

Meta Platforms has reportedly begun “a small deployment” of its first in-house chip designed for AI training. The accelerator is engineered around the open-standard RISC-V architecture, and TSMC produced the working samples now being tested. The goal is to create purpose-built chips that are more efficient than Nvidia’s general-purpose GPUs, capturing the cost savings that would come with wide deployment while reducing reliance on outside chip suppliers in a tight market. If the tests go well, Meta plans to scale up production for expanded use by 2026. Details of the new chip’s specifications remain unknown at this time.

“The push to develop in-house chips is part of a long-term plan at Meta to bring down its mammoth infrastructure costs as the company places expensive bets on AI tools to drive growth,” Reuters writes, calling the move “a key milestone as it moves to design more of its own custom silicon.”

Meta has forecast this year’s AI capital expenses will total about $65 billion — more than half its projected $114 billion to $119 billion in total 2025 expenses, according to Reuters.

The new bespoke chip is the latest in the Meta Training and Inference Accelerator (MTIA) series developed with Broadcom, the first of which appeared in May 2023. A Gen 2 version used in the social media giant’s recommendation engines appeared in April 2024.

“Meta was one of the first companies to build its RISC-V-based chips for AI inference several years ago to cut costs and reduce reliance on Nvidia,” writes Tom’s Hardware, noting that “the company went one step further” in designing its new accelerator for AI training.

Reuters reports that “the goal for the training chip is to start with recommendation systems and later use it for generative AI products like chatbot Meta AI.”

Meta became “one of Nvidia’s biggest customers after placing orders for billions of dollars’ worth of GPUs in 2022,” writes Engadget. That purchase “was a pivot for Meta after it bailed on a previous in-house inference silicon that failed a small-scale test deployment — much like the one it’s doing now for the training chip.”

Speaking at the Morgan Stanley Tech, Media & Telecom conference last week, Meta Chief Product Officer Chris Cox “described Meta’s chip development efforts as ‘kind of a walk, crawl, run situation’ so far,” according to Reuters.

Meta isn’t the only company stepping out of its comfort zone to develop custom AI chips. Amazon, Apple, Google and Microsoft are also reaching for that brass ring.
