Cerebras launch open-source GPT models that don’t need ‘thousands of GPUs’ to run


It seems the buzz around AI, particularly generative language models, is here to stay. Cerebras, a California-based start-up, has just launched an open-source language model similar to ChatGPT.

The Silicon Valley start-up has released seven different AI models, all of which are open-source, with parameter counts ranging from 111 million to 13 billion.

Cerebras says the aim of the release is to “train large-scale models” and “open-source the results with permissive licenses”, a move intended to develop and build the open-source AI community.

The spread of parameter counts across the Cerebras models supports a variety of uses, though the most exciting factor is that they don’t require the “hundreds or thousands of GPUs” that typical GPT models demand.

This is particularly relevant given the recent news that “thousands of Nvidia GPUs built ChatGPT.” Graphics cards have long been associated with less-than-innocent purposes – crypto mining, for example, offers little benefit to society or the environment. This open-source workaround lowers the barrier to entry for AI users and developers alike.

According to Reuters, these smaller models can instead be “deployed on phones or smart speakers while the bigger ones run on PCs or servers.”
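A back-of-the-envelope calculation illustrates why the smaller models can fit on phones while the larger ones need PCs or servers. The 111-million and 13-billion parameter counts are from the article; the bytes-per-parameter figures below are standard assumptions for 16-bit and 4-bit-quantised weights, not numbers from Cerebras:

```python
# Rough weight-storage estimate for the smallest and largest Cerebras-GPT
# models. Parameter counts are from the article; bytes-per-parameter
# values are generic assumptions (fp16 = 2 bytes, 4-bit = 0.5 bytes).

def model_size_gb(params: float, bytes_per_param: float) -> float:
    """Approximate size of the model weights in gigabytes."""
    return params * bytes_per_param / 1e9

sizes = {"Cerebras-GPT-111M": 111e6, "Cerebras-GPT-13B": 13e9}

for name, params in sizes.items():
    fp16 = model_size_gb(params, 2.0)   # 16-bit floats
    int4 = model_size_gb(params, 0.5)   # 4-bit quantisation
    print(f"{name}: ~{fp16:.2f} GB at fp16, ~{int4:.2f} GB at 4-bit")
```

On these assumptions the 111M model needs well under a gigabyte of storage even unquantised, comfortably within phone territory, while the 13B model needs roughly 26 GB at fp16 – the kind of footprint that points at a PC or server.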

Cerebras’ smaller models encourage a welcome shift away from Nvidia’s market dominance, which has recently seen the company’s stock soar.

About the Author

Amaar Chowdhury

Amaar is a gaming journalist with an interest in covering the industry's corporations. Aside from that, he has a keen interest in retro games that few people care about anymore.
