Google Restructures AI Research Units into Google DeepMind

In a move it sees as a force multiplier, Alphabet is consolidating DeepMind and the Brain team from Google Research into a single unit called Google DeepMind. The merger unites Google Brain with DeepMind, the UK-based artificial intelligence research lab Google acquired in 2014. The teams’ collective accomplishments include AlphaGo, Transformers, WaveNet and AlphaFold, as well as software frameworks such as TensorFlow and JAX for expressing, training and deploying large-scale ML models. “Combining all this talent into one focused team, backed by the computational resources of Google, will significantly accelerate our progress in AI,” the company announced.
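To give a sense of the kind of model expression and training these frameworks support, here is a minimal, illustrative JAX sketch; it is not Google DeepMind code, and the model, data and hyperparameters are invented for the example.

```python
import jax
import jax.numpy as jnp

def predict(params, x):
    # A tiny linear model: y = x @ w + b (purely illustrative)
    w, b = params
    return x @ w + b

def loss(params, x, y):
    # Mean squared error between predictions and targets
    return jnp.mean((predict(params, x) - y) ** 2)

@jax.jit  # compile the update step with XLA
def update(params, x, y, lr=0.1):
    # One gradient-descent step; jax.grad differentiates the loss w.r.t. params
    grads = jax.grad(loss)(params, x, y)
    return [p - lr * g for p, g in zip(params, grads)]

# Synthetic data for the sketch
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (32, 3))
y = x @ jnp.array([1.0, -2.0, 0.5]) + 0.3

params = [jnp.zeros(3), jnp.zeros(())]
for _ in range(100):
    params = update(params, x, y)
print("final loss:", loss(params, x, y))
```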

Nvidia Introduces New Architecture to Power AI Data Centers

Nvidia CEO Jensen Huang announced a host of new AI technology geared toward data centers at this week’s GTC 2022 conference. Available in Q3, the H100 Tensor Core GPU is built on the company’s new Hopper GPU architecture, which Huang described as the next “engine of the world’s AI infrastructures.” Hopper debuts in the Nvidia DGX H100 systems designed for enterprise. With data centers, “companies are manufacturing intelligence and operating giant AI factories,” Huang said, speaking from a real-time virtual environment in the firm’s Omniverse 3D simulation platform.