Cerebras Is Moving into Mainstream with New AI Data Centers

Cerebras Systems was founded 10 years ago on the belief that there would be a shortage of processors powerful enough to drive enterprise AI computing at scale. Its solution, the Cerebras Wafer-Scale Engine, is integrated into Cerebras’ CS-3 systems, which will power six new data centers launching this year. The company says the expansion will make it “the world’s number one provider of high-speed inference and the largest domestic high speed inference cloud.” Cerebras notes the new facilities will collectively serve over 40 million Llama 70B tokens per second to clients that now include Hugging Face and financial intelligence firm AlphaSense.

Chipmakers Intel, Nvidia Now Compete with Their Customers

Companies such as Intel and Nvidia have long dominated the design and manufacture of semiconductor chips, but they now face competition from their own customers. Amazon, Google and Microsoft, all of which have seen strong growth in cloud computing, are developing their own chips in pursuit of better performance and lower costs. Amazon, for example, debuted a chip intended to speed up AI algorithms. In response, traditional chip manufacturers are creating specialized processors to retain their long-time customers.