Stability AI Debuts Open Source StableLM Foundation Model

Stability AI has released StableLM, an open source language model that will compete with OpenAI’s GPT-4 to power apps like ChatGPT. The Alpha version of StableLM is available in 3 billion and 7 billion parameter versions, and the company promises 15 billion to 65 billion parameter models to come. “With the launch of the StableLM suite of models, Stability AI is continuing to make foundational AI technology accessible to all,” the London-based company said. With appropriate training, the StableLM models can generate text and code to power various downstream applications.

Stability AI is making the StableLM base models available for commercial or research purposes under the terms of the Creative Commons BY-SA 4.0 license. The models were trained on 1.5 trillion tokens of content from a new experimental dataset that is three times larger than the open source dataset known as The Pile. Stability AI says in a blog post that it will be releasing full details on the dataset “in due course.”

At 3-7 billion parameters (compared to GPT-3’s 175 billion and a reported 1 trillion for GPT-4), Stability AI says its new model delivers “surprisingly high performance in conversational and coding tasks, despite its small size.”

The company is also releasing a set of research models that are instruction fine-tuned. Initially, these fine-tuned models will use a combination of five recent open-source datasets for conversational agents: Alpaca, GPT4All, Dolly, ShareGPT, and HH. These fine-tuned models are intended for research use only and are released under a noncommercial Creative Commons BY-NC-SA 4.0 license.

“Language models will form the backbone of our digital economy, and we want everyone to have a voice in their design,” the company says, emphasizing a commitment to “AI technology that is transparent, accessible and supportive.”

StableLM is available on GitHub. Users can also “test the 7 billion-parameter StableLM base model on Hugging Face and the fine-tuned model on Replicate,” writes Ars Technica, noting that “a dialog-tuned version of StableLM with a similar conversation format as ChatGPT” is also available on Hugging Face.
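
For readers who want to try the base model locally, a minimal sketch of loading a StableLM Alpha checkpoint with the Hugging Face transformers library might look like the following. The repository id and generation settings here are assumptions for illustration, not taken from Stability AI’s own documentation; check the Hugging Face hub for the exact model names.

```python
# Minimal sketch: loading a StableLM Alpha checkpoint via transformers.
# The repo id below is an assumption based on Stability AI's naming.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-base-alpha-7b"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce GPU memory use
    device_map="auto",          # place layers on available devices
)

prompt = "Write a short poem about open source language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```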

The past year has seen “Meta, Nvidia and independent groups like the Hugging Face-backed BigScience project” releasing open source models, reports TechCrunch, also pointing out that Stability AI is “burning through cash” and “under pressure to monetize.”

With AI tailored to everything from “art and animation to biomed and generative audio,” TechCrunch says company CEO Emad Mostaque “has hinted at plans to IPO” after a raise of over $100 million in October, when the company was valued at $1 billion.
