DeepSeek-V3.1 Offered with Improvements in Speed, Context

This week, DeepSeek-V3.1 dropped on Hugging Face. Media outlets immediately noted benchmark scores that rival proprietary systems from OpenAI and Anthropic, in a model released under a permissive license that allows wide access. The 685-billion-parameter Mixture-of-Experts (MoE) model activates only 37 billion parameters per token and is designed for efficiency. It builds on techniques DeepSeek pioneered, such as multi-head latent attention (MLA) and multi-token prediction (MTP), to optimize inference, enabling high performance both on enterprise servers loaded with H100 GPUs and on consumer hardware such as a Mac Studio or a comparably powered PC.
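To illustrate why an MoE model with 685 billion total parameters can run so efficiently, here is a minimal, hypothetical sketch of top-k expert routing in PyTorch. The layer sizes, expert count, and k value are invented for illustration and are far smaller than DeepSeek-V3.1's actual configuration; this is a generic routing pattern, not DeepSeek's implementation.

```python
# Minimal sketch of top-k Mixture-of-Experts routing (illustrative only).
# All dimensions here are hypothetical and tiny compared to DeepSeek-V3.1.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=64, d_hidden=128, n_experts=8, k=2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(d_model, n_experts)  # router: score per expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.gate(x)                       # (tokens, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)  # keep only k experts/token
        weights = F.softmax(weights, dim=-1)        # normalize the k scores
        out = torch.zeros_like(x)
        # Only the selected experts run for each token, so the "active"
        # parameter count per token is a small fraction of the total --
        # the 685B-total / 37B-active efficiency the article describes.
        for e, expert in enumerate(self.experts):
            mask = (idx == e)                       # which tokens chose expert e
            if mask.any():
                rows = mask.any(dim=-1)
                w = (weights * mask).sum(dim=-1, keepdim=True)[rows]
                out[rows] += w * expert(x[rows])
        return out

moe = TopKMoE()
tokens = torch.randn(10, 64)
print(moe(tokens).shape)  # torch.Size([10, 64])
```

Each token's gate scores pick only k of the experts, so compute scales with the number of activated parameters rather than the total parameter count, which is what makes large MoE models feasible on comparatively modest hardware.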

Chinese AI Startup DeepSeek Disrupting the U.S. Tech Sector

Hangzhou-based AI firm DeepSeek is roiling the U.S. tech sector and upending financial markets. The startup has become competitive with Silicon Valley's deep learning firms despite U.S. export controls that prevent Chinese technology companies from buying the most advanced AI chips. DeepSeek has broken into the global top 10 in model performance, and as of this week it had the top-ranked free AI assistant on the Apple App Store. Its new R1 model has drawn attention for using less computing power than competing systems while performing comparably, despite having been developed on older Nvidia chips.