Google Ironwood TPU is Made for Inference and ‘Thinking’ AI

Google has debuted a new accelerator chip, Ironwood, a tensor processing unit (TPU) designed specifically for inference, the stage at which a trained AI model generates predictions and responses. Ironwood will power Google Cloud's AI Hypercomputer, which runs the company's Gemini models and is gearing up for the next generation of artificial intelligence workloads. Google's TPUs play a role similar to Nvidia's accelerator GPUs, but they are purpose-built for AI, speeding the neural network math (chiefly large matrix operations) behind training and inference. Google says that, deployed at scale, Ironwood is more than 24 times more powerful than the world's fastest supercomputer.
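
As a rough illustration of the kind of workload a TPU accelerates, the sketch below uses the JAX library to run a jitted matrix multiply, the core operation of neural network inference. Nothing here is Ironwood-specific, and the layer names and sizes are arbitrary; on a Cloud TPU host the same code would be dispatched to TPU cores, while elsewhere it falls back to CPU or GPU.

```python
# Minimal sketch of the matrix math a TPU accelerates (not Ironwood-specific).
import jax
import jax.numpy as jnp

@jax.jit
def dense_layer(x, w, b):
    # One fully connected layer: matrix multiply plus bias, then ReLU.
    return jax.nn.relu(x @ w + b)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (8, 512))    # a small batch of activations
w = jax.random.normal(key, (512, 256))  # layer weights
b = jnp.zeros(256)

print(jax.devices())                 # lists TPU cores when run on a TPU host
print(dense_layer(x, w, b).shape)    # (8, 256)
```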

Google Details Network Challenges, Seeks Academic Feedback

In an unprecedented move, Google has revealed details of how it developed and improved its software-defined networking (SDN). In a paper presented at the ACM SIGCOMM 2015 conference in London, Google described the steps taken over a ten-year period, moving from third-party vendor switches in 2004 to building its own hardware a year later and shuttling data among servers in its own data centers. The company is describing its network in part to share its experiences and to seek feedback from the academic community.
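
For readers unfamiliar with the SDN model, the sketch below is a generic illustration, not Google's actual software: a central controller keeps a map of the switch topology, computes a path, and installs match-action forwarding rules on each switch along the way. The switch names, prefixes, and rule format are invented for illustration.

```python
# Generic SDN sketch: a central controller computes paths over its topology map
# and pushes match-action forwarding rules to switches, instead of each switch
# running its own distributed routing protocol.
from collections import deque

# Controller's view of the fabric: switch -> neighboring switches.
topology = {
    "s1": ["s2", "s3"],
    "s2": ["s1", "s4"],
    "s3": ["s1", "s4"],
    "s4": ["s2", "s3"],
}

def shortest_path(src, dst):
    """Breadth-first search over the controller's topology map."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in topology[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

def install_rules(dst_prefix, path):
    """Build a match-action rule for every switch on the path."""
    rules = {}
    for hop, next_hop in zip(path, path[1:]):
        rules[hop] = {"match": {"dst": dst_prefix}, "action": f"forward to {next_hop}"}
    return rules

print(install_rules("10.0.4.0/24", shortest_path("s1", "s4")))
```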