GPT-3 is not merely an autocomplete program like the one in Google’s search bar!
Developed by OpenAI, the research lab Elon Musk co-founded, GPT-3 is an autoregressive language model that uses deep learning to produce human-like text. It is currently the largest artificial intelligence language model, and it is mired in debate over whether it represents a genuine step toward AGI (Artificial General Intelligence) or is merely a very capable text predictor.
GPT-3 (“generative pre-trained transformer”) is the third in a series of autocomplete tools designed by OpenAI. The program has been trained on a huge corpus of text, stored as billions of weighted connections between the nodes in its neural network. It finds patterns in this data without any guidance and then uses them to complete text prompts. If you input the word “fire” into GPT-3, the program knows, based on the weights in its network, that words like “alarm” and “water” are much more likely to follow than “soil” or “forests.”
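The idea of “weighted connections” driving autocomplete can be sketched in a few lines. The sketch below is a toy stand-in, not GPT-3’s actual mechanism: the word counts are invented for illustration, and real models score continuations with a neural network rather than a lookup table.

```python
import random

# Hypothetical counts of how often each word followed "fire" in some
# corpus (numbers invented for illustration, not real GPT-3 weights).
next_word_weights = {"alarm": 50, "water": 30, "soil": 1, "forests": 2}

def most_likely_next(weights):
    """Pick the continuation with the highest weight."""
    return max(weights, key=weights.get)

def sample_next(weights):
    """Sample a continuation in proportion to its weight."""
    words = list(weights)
    return random.choices(words, weights=[weights[w] for w in words])[0]

print(most_likely_next(next_word_weights))  # → "alarm"
```

Sampling (rather than always taking the top word) is why the same prompt can yield different completions from run to run.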
Training Data behind AI Tool GPT-3
GPT-3 is trained with 175 billion parameters, more than 100 times as many as its predecessor and ten times more than comparable programs, letting it complete a mind-boggling array of autocomplete tasks with astonishing sharpness.
The dataset GPT-3 was trained on includes:
• The English Wikipedia, spanning some 6 million articles, which makes up only 0.6 percent of its training data.
• Digitized books and various web links, including news articles, recipes, poetry, coding manuals, fanfiction, religious prophecy, and whatever else you can imagine!
• Good and bad text of every kind uploaded to the internet, including potentially harmful conspiracy theories, racist screeds, pseudoscientific textbooks, and the manifestos of mass shooters.
10 Use Cases of AI Tool GPT-3
It’s hardly comprehensive, but here’s a small sample of things people have created with GPT-3:
• A chatbot that talks to historical figures
Because GPT-3 has been trained on so many digitized books, it has assimilated a fair amount of knowledge relevant to specific thinkers. Leverage GPT-3 to make a chatbot talk like the philosopher Bertrand Russell, and ask him to explain his views. Fictional characters are as accessible to GPT-3 as historical ones. Check out the exciting dialogue between Alan Turing and Claude Shannon, interrupted by Harry Potter!
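Chatbots like these typically work by framing the conversation as a text prompt for GPT-3 to continue. The helper below is a hypothetical sketch of that framing; the template wording is an assumption, not an official OpenAI recipe.

```python
def persona_prompt(figure, question):
    # Hypothetical prompt template: present the exchange as a transcript
    # so GPT-3's autocomplete naturally continues in the figure's voice.
    return (
        f"The following is a conversation with {figure}.\n"
        f"Human: {question}\n"
        f"{figure}:"
    )

prompt = persona_prompt("Bertrand Russell", "What is your view on logic?")
print(prompt)
```

GPT-3’s completion of the text after the final `Bertrand Russell:` line becomes the chatbot’s reply.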
• Make your own quizzes
A genuine boon to education, GPT-3 can help teachers as well as students. It will generate practice quizzes on any topic and explain the answers in detail, letting students learn anything from anyone, be it robotics from Elon Musk, physics from Newton, relativity from Einstein, or literature from Shakespeare.
• A question-based search engine
Trained on the English Wikipedia among other sources, GPT-3 can work like Google but for questions and answers: type a question and GPT-3 directs you to the relevant Wikipedia URL for the answer.
• Answer medical queries
A medical student from the UK used GPT-3 to answer health care questions. The program not only gave the right answer but correctly explained the underlying biological mechanism.
• Style transfer for text
GPT-3 can take text written in one style and rewrite it in another. In an example on Twitter, a user entered text in “plain language” and asked GPT-3 to change it to “legal language.” This transformed “my landlord didn’t maintain the property” into “The Defendants have permitted the real property to fall into disrepair and have failed to comply with state and local health and safety codes and regulations.”
• Compose its own music
Guitar tabs are shared on the web using ASCII text files, which comprise part of GPT-3’s training dataset. Naturally, that means GPT-3 can generate music itself after being given a few chords to start.
• Write creative fiction
This is a wide-ranging area within GPT-3’s skillset but an incredibly impressive one. The best collection of the program’s literary samples comes from independent researcher and writer Gwern Branwen who has collected a trove of GPT-3’s writing. It ranges from a type of one-sentence pun known as a Tom Swifty to poetry in the style of Allen Ginsberg, T.S. Eliot, and Emily Dickinson to Navy SEAL copypasta.
• Autocomplete images, not just text
The basic GPT architecture can be retrained on pixels instead of words, allowing it to perform the same autocomplete tasks with visual data as it does with text input.
• Solving language and syntax puzzles
You can show GPT-3 certain linguistic patterns (like “truck driver becomes driver of truck” and “chocolate cake becomes cake made of chocolate”) and it will correctly complete any new prompts you show it. However, the technology is still nascent, and much development is still to come. As computer science professor Yoav Goldberg, who has been sharing many of these examples on Twitter, puts it, “such abilities are new and super exciting for AI, but they don’t mean GPT-3 has mastered language.”
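For contrast, the first pattern above is simple enough to hand-code in a couple of lines; GPT-3’s trick is inferring rules like this from the examples alone, without anyone writing the rule down.

```python
def rephrase_compound(phrase):
    """Hand-coded version of the 'truck driver' -> 'driver of truck' pattern.

    Assumes a two-word compound noun; GPT-3 infers this rule from examples
    instead of having it spelled out.
    """
    modifier, head = phrase.split()
    return f"{head} of {modifier}"

print(rephrase_compound("truck driver"))  # → "driver of truck"
```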
• Code generation based on text descriptions
Describe a design element or page layout of your choice in simple words and GPT-3 spits out the relevant code. Users have had GPT-3 generate code for a machine learning model just by describing the dataset and required output. In another example, a layout generator takes a description of any layout you want and has GPT-3 produce the corresponding JSX code.
A world of unlimited possibilities has just begun!