IBM Introduces Granite LLMs for Enterprise Code Developers

IBM has released a family of its Granite AI models to the open-source community. The series of decoder-only Granite code models is purpose-built to write computer code for enterprise developers, with training across 116 programming languages. The models range in size from 3 to 34 billion parameters, in base and instruction-tuned variants. They cover a range of uses, from modernizing legacy code in newer languages to optimizing programs for on-device memory constraints, such as those encountered on mobile devices. In addition to generating code, the models can repair and explain it.
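Since the models are open sourced, a developer could in principle call one through the Hugging Face transformers library. The sketch below shows the general pattern; the model identifier and prompt style are assumptions for illustration, so check the actual model cards from the release before use.

```python
# Hedged sketch: prompting a Granite code model via Hugging Face transformers.
# MODEL_ID is an assumed identifier for illustration; consult the real model
# card. Loading the weights downloads several gigabytes.
MODEL_ID = "ibm-granite/granite-34b-code-instruct"  # assumed, not verified

def generate_code(prompt: str, max_new_tokens: int = 128) -> str:
    """Complete a code prompt with the model (heavy: loads full weights)."""
    # Third-party import kept inside the function so the module loads
    # without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Example call, commented out to avoid the large download:
# print(generate_code("Explain what this Python line does: x = x or []"))
```

The same pattern covers the repair and explanation use cases mentioned above, since the instruction-tuned variants accept natural-language requests about code.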

Opera Browser Is Experimenting with Local Support for LLMs

Opera has become the first major browser to build in local support for large language models (LLMs). At this point the feature is experimental and available only in the Opera One Developer browser as part of the AI Feature Drops program. The update offers about 150 local LLMs from more than 50 families, including Meta’s LLaMA, Google’s Gemma, Mixtral and Vicuna. Opera had previously offered only its own Aria AI, a competitor to Microsoft Copilot and OpenAI’s ChatGPT. The local LLMs are being offered for testing as a complimentary addition to Opera’s online Aria service.

Stability AI Is Offering Paid Membership for Commercial Users

As pressure mounts for AI companies to move beyond the wow factor and make money, Stability AI has formalized three subscription tiers as it seeks to expand commercial use of its open-source, multimodal core models. The Stability AI Membership offerings include a free tier for personal and research (i.e., non-commercial) use, a professional tier that costs $20 a month, and a custom-priced enterprise tier for larger organizations. The company says that with the three tiers it is “striking a balance between fostering competitiveness and maintaining openness in AI technologies.”

Big Tech Firms Propel Hugging Face to $4.5 Billion Valuation

Hugging Face has raised $235 million in an investment round that includes contributions from Amazon, IBM, Google, Nvidia, Salesforce, AMD, Intel and Qualcomm. The New York-based startup creates and distributes open-source tools for artificial intelligence development, carving out an AI-centric niche akin to the role the Microsoft-owned GitHub plays for software development generally. The cash infusion, earmarked for talent recruitment, gives Hugging Face a lofty $4.5 billion valuation that experts say signals momentum for open source in what has so far been a highly competitive AI sector.

The New York Times Looks to Protect IP Content in Era of AI

Newsrooms can potentially benefit greatly from AI language models, but at this early stage they have begun laying down boundaries to ensure that, rather than having their data co-opted by third parties to build artificial intelligence, they survive long enough to create models of their own or license proprietary IP. As industries await regulation from the federal government, The New York Times has proactively updated its terms of service to prohibit scraping of its content for machine learning. The move follows a Google policy refresh that expressly states the company uses search data to train AI.
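Beyond terms of service, the conventional mechanism publishers use to signal crawler rules is robots.txt, which well-behaved bots are expected to honor. A minimal sketch of how such rules are evaluated, using Python’s standard library; the rules below are illustrative, not the Times’ actual file, and the bot names are examples:

```python
# Sketch: evaluating publisher crawler rules with the stdlib robots.txt
# parser. The rules and user-agent names are illustrative examples only.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: GPTBot
Disallow: /

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# An AI-training crawler matching the first record is blocked site-wide,
# while other agents fall through to the wildcard record.
print(rp.can_fetch("GPTBot", "https://example.com/article"))      # → False
print(rp.can_fetch("NewsReader", "https://example.com/article"))  # → True
```

Enforcement still depends on the crawler choosing to comply, which is why publishers are pairing such signals with contractual terms like the ones described above.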

Aptos Teams with Microsoft Azure OpenAI on Web3 Solutions

Blockchain startup Aptos Labs will use the Microsoft Azure OpenAI Service to “explore innovative solutions” in blockchain and Web3 for technologies involving artificial intelligence, tokenization and payments. As part of the deal, which Aptos describes as a “partnership,” the company is launching Aptos Assistant, which will accept natural language prompts, making Web3 applications like smart contracts and decentralized apps more “user-friendly and secure” for “everyday Internet users and organizations” as well as developers. Aptos operates a Layer 1 blockchain, technology designed to process transactions at scale.

Meta Sees Double-Digit Growth for the First Time Since 2021

Meta Platforms had a successful Q2, reporting double-digit revenue growth for the first time since Q4 2021. The performance was attributed to a rebound in the digital advertising sector. The good news comes with a caveat: the company says it plans to increase spending on virtual reality and artificial intelligence next year. The parent of Facebook and Instagram reported revenue of just under $32 billion for the period ending June 30, an 11 percent gain over the same quarter of 2022. Advertising contributed a whopping $31.5 billion, up nearly 12 percent year-over-year.

Meta Unveils Llama 2 LLM with Microsoft as Preferred Partner

This week, Meta Platforms released Llama 2, the next generation of its open-source large language model that is free for research and commercial use. Llama 2’s pretrained and fine-tuned language models are available in sizes ranging from 7 to 70 billion parameters. Meta also named Microsoft Azure its “preferred partner for Llama 2,” offering it through the Azure AI model catalog for use with cloud-native tools that leverage content filtering and safety features. Meta says Llama 2 is “also optimized to run locally on Windows,” providing developers a seamless workflow across enterprise and consumer platforms.

Inflection Shares Test Results for Its First AI Language Model

AI startup Inflection has unveiled a new foundation large language model (LLM) to power its Pi chatbot. Inflection-1 approximates OpenAI’s GPT-3.5 in size and functionality, putting it on a par with the model behind ChatGPT in terms of training. Inflection claims its LLM outperforms that competing system on some benchmarks, as well as Meta Platforms’ LLaMA, DeepMind’s Chinchilla and Google’s PaLM-540B. Pi is short for Personal Intelligence, and Inflection built its LLM with the goal of creating an emotive AI whose conversation provides a reasonable facsimile of empathy and human-like sensibility.

Senators Question Meta Platforms About Recent LLaMA Leak

Meta Platforms CEO Mark Zuckerberg received a letter this week from Senators Richard Blumenthal and Josh Hawley of the Subcommittee on Privacy, Technology & the Law taking the executive to task for an online leak of the company’s LLaMA artificial intelligence system. The 65-billion-parameter language model, which is still under development, was made available to researchers on request through Meta’s GitHub portal in February. It wound up on 4chan and BitTorrent, “making it available to anyone, anywhere in the world, without monitoring or oversight,” the senators wrote.

Meta In-House Chip Designs Include Processing for AI, Video

Meta Platforms has shared additional details on its next generation of AI infrastructure. The company has designed two custom silicon chips: one for training and running AI models, which will eventually power metaverse functions like virtual reality and augmented reality, and another tailored to optimize video processing. Meta publicly discussed its internal chip development last week, ahead of a Thursday virtual event on AI infrastructure. The company also showcased an AI-optimized data center design and discussed phase two of the deployment of its 16,000-GPU supercomputer for AI research.

Mixed Reactions to ‘Pause’ on AI Models Larger than GPT-4

Respected members of the advanced tech community are going on record opposing the faction calling for a “pause” in large-model artificial intelligence development. Meta Platforms chief AI scientist Yann LeCun and DeepLearning.AI founder and CEO Andrew Ng, formerly at Alphabet where he helped launch Google Brain, were joined this past week by Bill Gates and former Google CEO Eric Schmidt in opposing the proposed six-month halt to development of AI models more advanced than OpenAI’s GPT-4, which is rumored to have on the order of a trillion parameters, several times the 175 billion of GPT-3.

Researchers Developing Open-Source Challenger to ChatGPT

Today’s leading AI chatbots require tremendous computing resources to train and run, but that isn’t stopping startups from trying to get into the game, some with open-source alternatives. Clearly disadvantaged compared to deep-pocketed market leaders like OpenAI, Meta, DeepMind and Anthropic, a band of independent researchers has coalesced under the name Together. Their aim: to become the first open-source challenger to the likes of ChatGPT. The industry seems undecided on whether open-source AI is a good thing; many worry about a universally available AI toolkit and what troublemakers might do with it.

Google’s PaLM API, MakerSuite Coming to Select Developers

Google is readying an API and other enterprise tools for its Pathways Language Model (PaLM), a large language model similar to GPT, to encourage developers to create chatbots and other apps on the platform. PaLM is one of Google’s most advanced systems, with the capability to generate text, images, code, video and audio from natural language prompts. Much like OpenAI’s GPT series and the LLaMA family from Meta Platforms, it is suitable for a wide variety of general tasks. To facilitate PaLM’s use for specific tasks, Google is launching MakerSuite along with the PaLM API.
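For developers, an LLM API of this kind typically boils down to a JSON request carrying a prompt and a few sampling controls. The sketch below builds such a request body; the field names (`prompt`, `temperature`, `candidateCount`) are assumptions modeled on common REST conventions for text-generation services, not Google’s confirmed PaLM schema.

```python
# Hedged sketch of a text-generation request body for a hosted LLM API.
# Field names are illustrative assumptions, not a confirmed schema.
import json

def build_generate_text_request(prompt: str, temperature: float = 0.7,
                                candidate_count: int = 1) -> str:
    """Serialize a generation request as a JSON body for an HTTP POST."""
    payload = {
        "prompt": {"text": prompt},
        "temperature": temperature,     # sampling randomness, 0.0-1.0
        "candidateCount": candidate_count,  # number of completions to return
    }
    return json.dumps(payload)

body = build_generate_text_request("Draft a friendly support-bot greeting.")
print(body)
```

Tools like MakerSuite sit on top of exactly this kind of call, letting developers prototype prompts before committing to code.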

Meta Says Its LLaMA AI for Researchers Does More with Less

Meta Platforms has unveiled a new generative artificial intelligence language system called LLaMA, which doesn’t chat but is designed as a research tool the company hopes will help in “democratizing access in this important, fast-changing field.” LLaMA (Large Language Model Meta AI) ranges in size from 7 billion to 65 billion parameters. Touted as a “smaller, more performant model,” LLaMA enables members of the research community who do not “have access to large amounts of infrastructure to study these models,” Meta explains. Training smaller foundation models requires less computing power and fewer resources for testing and validation.
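A back-of-the-envelope calculation shows why the smaller sizes matter. Assuming 16-bit (2 bytes per parameter) weights, and ignoring activations, KV cache and optimizer state, just holding the weights in memory scales linearly with parameter count; the intermediate 13B and 33B sizes below are the other published LLaMA variants:

```python
# Sketch: approximate memory to hold model weights in fp16 (2 bytes/param).
# Ignores activations, KV cache, and optimizer state, so real requirements
# are higher; training needs several times more than this.
BYTES_PER_PARAM_FP16 = 2

def weight_memory_gb(params_billions: float) -> float:
    """Gigabytes needed just to store the weights at fp16 precision."""
    return params_billions * 1e9 * BYTES_PER_PARAM_FP16 / 1e9

for size in (7, 13, 33, 65):  # LLaMA variants, in billions of parameters
    print(f"LLaMA-{size}B: ~{weight_memory_gb(size):.0f} GB of fp16 weights")
```

At roughly 14 GB for the 7B model versus 130 GB for the 65B one, the smallest variant fits on a single research GPU while the largest needs a multi-GPU server, which is the access gap Meta says it is trying to narrow.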