AI Firm Cerebras Systems Raises $250 Million in New Funding

Cerebras Systems has raised $250 million in a Series F funding round, bringing the Sunnyvale, California-based firm’s valuation to more than $4 billion, according to the company. Cerebras makes what it describes as the world’s fastest AI chip, the Wafer Scale Engine 2 (WSE-2). Investment from Alpha Wave Ventures and the Abu Dhabi Growth Fund will allow Cerebras to bring its CS-2 AI accelerator compute system, built around the WSE-2 chip, to new customers globally, in what co-founder and CEO Andrew Feldman describes as the “democratization of AI.”

Cerebras Chip Tech to Advance Neural Networks, AI Models

Deep learning requires complex neural networks running on computers wired together into clusters at data centers, and the resulting cross-chip communication consumes a lot of energy and slows the process down. Cerebras takes a different approach. Instead of printing dozens of chips onto a large silicon wafer and then cutting them apart and wiring them back together, it builds the largest computer chip in the world, roughly the size of a dinner plate, from a single wafer. Texas Instruments tried this approach in the 1960s but ran into problems.

Cerebras Introduces AI Processor with 2.6 Trillion Transistors

Cerebras Systems introduced its Wafer Scale Engine 2 (WSE-2) processor, which packs a record-breaking 2.6 trillion transistors and 850,000 AI-optimized cores into what the company describes as “the largest chip ever built.” Established by SeaMicro founder Andrew Feldman, Cerebras makes a massive chip out of a single wafer, rather than following the typical process of slicing the wafer into hundreds of separate chips. This is the company’s second chip built from an entire wafer; its pieces, dubbed cores, interconnect so that the transistors work together as one.