By Paula Parisi, June 4, 2024
Nvidia President and CEO Jensen Huang said the company will upgrade its AI accelerators annually, with the Blackwell Ultra processor coming in 2025 and Rubin, a next-generation platform still in development, planned for 2026. Rubin will use HBM4, a new type of high-bandwidth memory that addresses a bottleneck that has constrained the production of AI accelerators. Huang shared the news from Taiwan, where he delivered a keynote at the Computex trade show. Nvidia Inference Microservices (NIM) were another focus; Huang said they allow AI applications to be deployed in minutes instead of weeks. Read more
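For context, a NIM is distributed as a prebuilt container that exposes a standard inference API once it is running. The snippet below is a minimal sketch of querying such a locally running microservice through an OpenAI-compatible endpoint; the base URL, API key handling and model identifier are assumptions for illustration, not specifics confirmed in this article.

```python
# Minimal sketch: querying a locally running inference microservice that
# exposes an OpenAI-compatible API. The base_url, api_key handling, and
# model name below are illustrative assumptions, not Nvidia specifics.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local endpoint for the container
    api_key="not-needed-locally",         # placeholder; a local container may not check it
)

response = client.chat.completions.create(
    model="example/llama3-8b-instruct",   # hypothetical model identifier
    messages=[{"role": "user", "content": "Summarize today's Computex news in one sentence."}],
    max_tokens=100,
)
print(response.choices[0].message.content)
```

The appeal of this pattern is that the same client code works whether the microservice runs on a laptop or in a data center, which is what enables the minutes-instead-of-weeks deployment claim.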
By Paula Parisi, June 4, 2024
At Computex Taipei this week, AMD revealed the Ryzen AI 300 Series, the third generation of its AI-enabled mobile processors for next-generation laptops. It joins Intel’s upcoming Lunar Lake and Qualcomm’s Snapdragon X platform among the chips vying for a place in the exploding market for artificial intelligence processing, an area dominated by Nvidia. However, with AI PCs and laptops just hitting the market, that field is somewhat in play. The Ryzen AI 300 Series is among the chips that will power laptops equipped with Microsoft’s Copilot+ AI features. At Computex, AMD also unveiled its Ryzen 9000 Series processors for desktop PCs. Read more
By Paula Parisi, June 4, 2024
A year after its announcement, Fable is launching Showrunner, a platform that lets anyone make TV-style animated content by writing prompts that generative AI turns into shows. The San Francisco company, run by CEO Edward Saatchi with recruits from Oculus, Pixar and various AI startups, is launching 10 shows that let users make their own episodes “from their couch,” waiting only minutes to see the finished result. According to Saatchi, a 15-word prompt is enough to generate a 10- to 20-minute episode. Saatchi hopes Fable’s shows can garner an audience by self-publishing on Amazon Prime. Read more
By Paula Parisi, June 3, 2024
Big Tech players have joined forces to develop a new industry standard to advance high-speed, low-latency communication among AI accelerators in data centers by coordinating component development. AMD, Broadcom, Cisco, Google, Hewlett Packard Enterprise (HPE), Intel, Meta Platforms and Microsoft are backing the Ultra Accelerator Link (UALink) promoter group, which plans to define and establish an open industry standard that enables AI accelerators to communicate more effectively. UALink aims to create a pathway for system OEMs, IT professionals and system integrators to connect and scale their AI-connected data centers. Read more
By Paula Parisi, June 3, 2024
French startup Mistral AI has released its first large language model for coding. Codestral gives developers looking for a code-native AI tool an alternative to Meta’s Code Llama, Microsoft’s GitHub Copilot and Amazon Q. Fluent in 80 programming languages, including Python, C++ and JavaScript, Codestral can complete code, write tests, and fill in partial code “using a fill-in-the-middle mechanism,” while reducing “the risk of errors and bugs,” according to the company. The new LLM is described as open, but its license prohibits commercial use of both Codestral and its outputs. Read more
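To illustrate the fill-in-the-middle idea the company describes, the sketch below sends a code prefix and suffix to a completion endpoint and asks the model to generate the code that belongs between them. The endpoint path, JSON field names and model string follow Mistral’s published API but should be treated as assumptions here, and the response is printed raw rather than parsed.

```python
# Minimal fill-in-the-middle sketch: the model receives a prefix ("prompt")
# and a suffix, and returns the code that belongs between them.
# Endpoint path, field names and model string are assumptions based on
# Mistral's public API documentation, not confirmed in this article.
import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/fim/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "codestral-latest",
        "prompt": "def is_palindrome(s: str) -> bool:\n    ",  # code before the gap
        "suffix": "\n\nprint(is_palindrome('racecar'))",       # code after the gap
        "max_tokens": 64,
    },
    timeout=30,
)
resp.raise_for_status()
# The generated middle section appears among the response's choices.
print(resp.json())
```

Because the model sees both sides of the gap, fill-in-the-middle completion is better suited to editing existing files than plain left-to-right generation, which is why it features in editor integrations.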