GPT-3, recently launched by OpenAI, is a milestone in the history of artificial intelligence. It is the most powerful natural language model ever created: it can produce text on virtually any topic, write simple code, and even compose music.
GPT-3 has 175 billion parameters! Its predecessor, GPT-2, which was already impressive, had 1.5 billion.
This level of hardware and software is unattainable for ordinary businesses. Training this network alone reportedly cost OpenAI around 5 million dollars!
I don’t know if it serves as a reference, but the biggest network I have ever trained on my best computer, with a GPU, had about 15 layers and roughly 100 thousand parameters, and even that was an enormous job: weeks of experimenting with hyperparameters and architecture variants.
GPT-3 still makes small mistakes, such as unintelligible phrases or sentences out of context; in other words, it is still far behind the human brain.
The brain has about 100 billion neurons, each with on the order of 1,000 synaptic connections, which gives some 100 trillion "parameters". It still rules easily and uses very little energy (it runs on rice and beans).
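The arithmetic behind these comparisons can be sketched in a few lines of Python. Note that the synapses-per-neuron figure is a rough, commonly cited estimate, not a precise measurement:

```python
# Back-of-the-envelope comparison of the parameter counts discussed above.

gpt2_params = 1.5e9   # GPT-2: 1.5 billion parameters
gpt3_params = 175e9   # GPT-3: 175 billion parameters

neurons = 100e9              # human brain: ~100 billion neurons
synapses_per_neuron = 1000   # rough estimate, order of magnitude only
brain_params = neurons * synapses_per_neuron  # ~100 trillion "parameters"

print(f"GPT-3 vs GPT-2: ~{gpt3_params / gpt2_params:.0f}x more parameters")
print(f"Brain vs GPT-3: ~{brain_params / gpt3_params:.0f}x more 'parameters'")
```

So GPT-3 is roughly two orders of magnitude larger than GPT-2, and the brain is still nearly three orders of magnitude beyond GPT-3.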
However, GPT-3 shows that AI is growing by rampant orders of magnitude, and you can already see what is coming!