StabilityAI launches StableLM open-source alternatives to ChatGPT

StabilityAI announced the launch of StableLM, a suite of open-source large language models.

The large language model sector continues to swell as StabilityAI, maker of the popular image-generation tool Stable Diffusion, has launched a suite of open-source language model tools.

Dubbed StableLM, the publicly available alpha versions of the suite currently contain models with three billion and seven billion parameters; 15-, 30- and 65-billion-parameter models are noted as “in progress,” and a 175-billion-parameter model is planned for future development.

By comparison, GPT-4’s parameter count is estimated at one trillion, roughly six times that of its predecessor, GPT-3.

Parameter count may not be a reliable measure of LLM efficacy, however, as StabilityAI noted in the blog post announcing StableLM’s launch:

“StableLM is trained on a new experimental dataset built on The Pile, but three times larger with 1.5 trillion tokens of content […] The richness of this dataset gives StableLM surprisingly high performance in conversational and coding tasks, despite its small size of 3 to 7 billion parameters.”

It’s unclear at this time exactly how robust the StableLM models are. The StabilityAI team noted on the organization’s GitHub page that more information about the models’ capabilities, including model specifications and training settings, would be forthcoming.

Related: Microsoft is developing its own AI chip to power ChatGPT

Provided the models perform well enough in testing, the arrival of a powerful open-source alternative to OpenAI’s ChatGPT could prove interesting for the cryptocurrency trading world.

As Cointelegraph reported, people are building advanced trading bots on top of the GPT API and new variants that incorporate third-party tool access, such as BabyAGI and AutoGPT.

The addition of open-source models into the mix could be a boon for tech-savvy traders who don’t want to pay OpenAI’s access premiums.
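To illustrate, running a released checkpoint locally sidesteps per-token API fees entirely. The sketch below is a minimal, hypothetical example (not anything from StabilityAI’s materials): it assumes the Hugging Face `transformers` library and the published `stabilityai/stablelm-tuned-alpha-7b` checkpoint, whose model card specifies a prompt format built from `<|SYSTEM|>`, `<|USER|>` and `<|ASSISTANT|>` special tokens. The `ask_stablelm` helper name is our own.

```python
# Sketch: querying an open-source StableLM checkpoint locally instead of a paid API.
# Illustrative only -- not a trading system, and the helper names are hypothetical.

def build_prompt(system: str, user: str) -> str:
    """Assemble a prompt using StableLM-Tuned-Alpha's special tokens."""
    return f"<|SYSTEM|>{system}<|USER|>{user}<|ASSISTANT|>"

def ask_stablelm(question: str) -> str:
    """Generate a reply from the 7B tuned alpha model (downloads weights on first run)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    name = "stabilityai/stablelm-tuned-alpha-7b"
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name)

    prompt = build_prompt("You are a concise market-data assistant.", question)
    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, max_new_tokens=64)
    # Decode only the newly generated tokens, skipping the special-token scaffolding.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Because the weights are openly licensed, a bot built this way pays only its own compute costs rather than per-request fees.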

Those interested can test a live interface for the 7-billion-parameter StableLM model hosted on Hugging Face. However, at the time of publication, our attempts to do so found the website overwhelmed or at capacity.
