To cut to the chase – ‘GPT’ stands for generative pre-trained transformer. But that doesn’t really mean much to the ordinary punter – so what does it mean?
ChatGPT is OpenAI’s now-viral chatbot. It generates conversational responses based on the prompts you enter, drawing on the extensive training the underlying GPT model has undergone.
It’s important to note that ChatGPT is different to the GPT-4 model; they are not interchangeable. ChatGPT is a chatbot interface, while GPT-4 is the language model that powers it, along with many other AI products, including Microsoft’s soon-to-release Copilot.
So if ChatGPT is the chatbot powered by the GPT-4 language model, what does generative pre-trained transformer mean?
What does generative pre-trained transformer mean?
GPT, or generative pre-trained transformer, is a name reserved for OpenAI’s family of large language models (LLMs). LLMs power every chatbot you’ve seen so far, including Google Bard, but only OpenAI has developed GPT.
The ‘pre-trained’ part refers to the extensive training a GPT model goes through before release. During this process, it is fed a huge amount of text, sourced from literature, the internet and much more.
During training, it learns to predict what comes next in a string of words. Thanks to this, it can then generate a lot of text even when given only a small input.
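To make that idea concrete, here is a hypothetical toy sketch, nothing like the real GPT training process, which shows ‘predict the next word’ in its simplest possible form: count which word tends to follow each word in a tiny example corpus, then generate text by repeatedly picking the most common continuation.

```python
from collections import Counter, defaultdict

# Toy "training data" - real models are trained on vastly more text.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# "Training": count how often each word follows each other word.
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict_next(word):
    # Return the word most often seen after `word` in the corpus.
    return counts[word].most_common(1)[0][0]

def generate(start, length=4):
    # "Generation": start from one word and keep predicting the next.
    words = [start]
    for _ in range(length):
        words.append(predict_next(words[-1]))
    return " ".join(words)

print(generate("the"))
```

A real GPT model works with probabilities over tens of thousands of tokens rather than simple counts, but the generate-by-predicting-the-next-word loop is the same basic shape.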
The word ‘Transformer’ actually refers to an AI architecture developed by Google. The Transformer reduced the number of possibilities an AI considers before making a choice.
Rather than going through exhaustive steps to compare something like a word’s position within a sentence, the Transformer would only perform a small and constant number of steps, each chosen empirically.
So, for example, it might learn to look for the word ‘river’ in any sentence containing the word ‘bank’, to help it determine whether ‘bank’ refers to a river bank or a financial institution.
OpenAI made use of this technology in its GPT models.
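The ‘bank’ example above can be sketched in a few lines of code. This is a deliberately crude, hypothetical illustration of the disambiguation idea, not how a real Transformer works (which learns these associations itself rather than using hand-written word lists): check which cue words appear elsewhere in the sentence to decide what ‘bank’ means.

```python
# Hand-picked cue words - a real Transformer learns associations like
# these from data instead of being given fixed lists.
RIVER_CUES = {"river", "water", "fish", "shore"}
MONEY_CUES = {"money", "loan", "deposit", "account"}

def disambiguate_bank(sentence):
    # Look at the other words in the sentence to guess which sense
    # of "bank" is meant.
    words = set(sentence.lower().replace(".", "").split())
    if words & RIVER_CUES:
        return "river bank"
    if words & MONEY_CUES:
        return "financial bank"
    return "unknown"

print(disambiguate_bank("We sat on the bank of the river."))
print(disambiguate_bank("She opened an account at the bank."))
```

The real mechanism, called attention, scores every word in the sentence against every other word, so the model can weigh up context like this across the whole input at once.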
Does Google Bard use GPT?
No. Despite Google inventing the Transformer technology used in GPT models, Bard runs on Google’s own large language model, called LaMDA.
Is GPT-3 the same as GPT-4?
No. Both are built on similar technology and trained in similar ways, but GPT-4 is the latest iteration and is far more powerful than GPT-3.