You have probably heard about an innovative language model called GPT-3. The hype is so overwhelming that we decided to research its core and the consequences for tech players. Let's explore whether the model deserves this much attention and what makes it so exceptional.
GPT-3 is a text-generating neural network that was released in June 2020 and reportedly cost around $14 million to develop and test. Its creator is the AI research lab OpenAI, whose founders and backers include Sam Altman, Marc Benioff, Elon Musk, and Reid Hoffman.
The model is based on 175 billion parameters and is far more accurate than its predecessors. For comparison, GPT-2 had only 1.5 billion parameters, and Microsoft's Turing-NLG has 17 billion. Thus, GPT-3's capacity significantly surpasses the alternatives.
Parameters are the learned weights a network applies to different aspects of the data, so every aspect of the input receives its own value and perspective. Thanks to this massive capacity, the model is capable of meta-learning: GPT-3 can perform tasks without any task-specific training, from a single example given in the prompt.
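In practice, this means you can demonstrate a task inside the prompt itself. Here is a minimal sketch of such one-shot prompting against the beta API, assuming the pre-1.0 openai Python client; the API key and the translation example are placeholders of ours, not an official recipe:

```python
import openai  # pre-1.0 "openai" client, as used during the 2020 beta

openai.api_key = "YOUR_API_KEY"  # placeholder; beta access was invite-only

# One-shot prompt: a single worked example, then a new input to complete.
prompt = (
    "Translate English to French.\n"
    "English: Where is the library?\n"
    "French: Où est la bibliothèque ?\n"
    "English: I would like a coffee.\n"
    "French:"
)

response = openai.Completion.create(
    engine="davinci",   # the base GPT-3 model exposed in the beta
    prompt=prompt,
    max_tokens=20,
    temperature=0.0,    # keep the output near-deterministic for translation
    stop=["\n"],        # stop at the end of the translated line
)

print(response.choices[0].text.strip())
```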
OpenAI GPT-3 also has some advanced creative capabilities due to its context-based nature. Once a user types in a request, the model analyzes it and provides the most probable answer. Trained on a massive crawl of text from the Internet, the text predictor calculates the most statistically expected output.
Therefore, GPT-3 is extremely powerful without understanding a single word it produces. Without specific tuning and adjustments, the model can write stories, blog articles, PR materials, and even technical documentation. Most of the time, the outputs feel very similar to those written by a human.
Another attempt at a longer piece. An imaginary Jerome K. Jerome writes about Twitter. All I seeded was the title, the author's name and the first "It", the rest is done by #gpt3
— Mario Klingemann (@quasimondo) July 18, 2020
Here is the full-length version as a PDF: https://t.co/d2gpmlZ1T5
In another experiment, a user put in only a title and an article summary and received a full article that surpassed his expectations.
How about a dozen business ideas generated from a simple list of data points?
Tentstarter - Allows investors to fund full-time homesteading, with the tent itself serving as a financial instrument.
— joshua schachter (@joshu) July 15, 2020
Tindermom - Tinder, but for mothers seeking help with their tech support problems.
— joshua schachter (@joshu) July 15, 2020
Entrepreneurs in search of inspiration may find such an idea generator invaluable.
GPT-3 can even produce a song from just its title and an artist's name.
All of this may sound like a fantasy, even though these are real examples of GPT-3 applications.
Currently, GPT-3 is available as an API in private beta. Designer Jordan Singer used GPT-3 to build a Figma plugin he calls "Designer". The plugin can produce an app design from a simple text description. Here is an example of how you can create an Instagram-like app with it:
This changes everything. 🤯
— Jordan Singer (@jsngr) July 18, 2020
With GPT-3, I built a Figma plugin to design for you.
I call it "Designer"
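Under the hood, a plugin like this presumably primes GPT-3 with a few description-to-component examples and then parses the completion into Figma layers. Below is a minimal sketch of that idea in Python; the prompt format, the JSON schema, and the API key are our own placeholder assumptions, not Singer's actual implementation:

```python
import json
import openai  # pre-1.0 "openai" client from the 2020 beta

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

# Prime the model with one description -> JSON example, then ask for another.
prompt = (
    "description: a login screen with two inputs and a blue button\n"
    'components: [{"type": "input", "label": "Email"}, '
    '{"type": "input", "label": "Password"}, '
    '{"type": "button", "label": "Log in", "color": "blue"}]\n\n'
    "description: a profile page with an avatar and a follow button\n"
    "components:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=128,
    temperature=0.3,
    stop=["\n\n"],  # stop before the model invents a new example
)

# A real plugin would turn this list into Figma layers on the canvas;
# json.loads will raise if the completion is not well-formed JSON.
components = json.loads(response.choices[0].text.strip())
print(components)
```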
Mind-blowing, you may think. Will GPT-3 replace our jobs? Well, as the evolution of NLP models continues, some changes are inevitable. However, GPT-3 is still far from perfect. It makes some critical mistakes that we will examine below.
Recent experiments with the model show that AI can simplify the work of developers by producing custom code. For instance, here is how Sharif Shameem primed GPT-3 to generate various layouts from simple plain-English requests:
This is mind blowing.
— Sharif Shameem (@sharifshameem) July 13, 2020
With GPT-3, I built a layout generator where you just describe any layout you want, and it generates the JSX code for you.
W H A T
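The trick behind such a generator is few-shot priming: show the model a couple of description-to-JSX pairs, then let it complete a new one. A rough sketch against the beta API, with seed examples and parameters that are our own assumptions rather than Shameem's exact setup:

```python
import openai  # pre-1.0 "openai" client from the 2020 beta

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

# Two description -> JSX examples, then a new description to complete.
prompt = (
    "description: a button that says hello\n"
    "code: <Button>hello</Button>\n\n"
    "description: a red heading that says Welcome\n"
    "code: <h1 style={{color: 'red'}}>Welcome</h1>\n\n"
    "description: a search bar with a submit button\n"
    "code:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=64,
    temperature=0.2,
    stop=["\n\n"],  # stop before the model starts a new example
)

print(response.choices[0].text.strip())  # the generated JSX snippet
```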
GPT-3 can code in Python, CSS, and JSX. Eventually, learning Java, .NET, and other languages may become unnecessary if plain English can serve as a single interface to all of them. As a universal tool for programmers, GPT-3 is another step toward this simpler interaction with software systems. In the long term, it may change the whole industry.
Opinions on the role of GPT-3 differ. In a recent interview, Shameem stated that it could reduce the skills required to become a programmer. Alternatively, it could also boost developers' productivity and erase the need for low-skilled engineers.
Before you get overwhelmed by the news, let's outline the weaknesses of this model. Even though GPT-3 is a logical advancement toward intelligent AI, it lacks accuracy in adversarial natural language inference (NLI). The goal of an NLI test is to determine the relation between two statements: whether one entails, contradicts, or is neutral toward the other.
“GPT-3 appears to be weak in the few-shot or one-shot setting at some tasks that involve comparing two sentences or snippets, for example, whether a word is used the same way in two sentences (WiC), whether one sentence is a paraphrase of another, or whether one sentence implies another,” a study reveals.
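You can reproduce this kind of failure yourself by framing NLI as a few-shot prompt and checking the predicted relation. A hedged sketch, with example pairs invented for illustration rather than taken from the paper's benchmarks:

```python
import openai  # pre-1.0 "openai" client from the 2020 beta

openai.api_key = "YOUR_API_KEY"  # hypothetical placeholder

# Two labeled premise/hypothesis pairs, then an unlabeled one to classify.
prompt = (
    "Premise: A man is sleeping on the couch.\n"
    "Hypothesis: A man is playing football.\n"
    "Relation: contradiction\n\n"
    "Premise: A dog is running in the park.\n"
    "Hypothesis: An animal is outside.\n"
    "Relation: entailment\n\n"
    "Premise: She bought a new phone yesterday.\n"
    "Hypothesis: She owns a phone.\n"
    "Relation:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=5,
    temperature=0.0,
    stop=["\n"],
)

# On adversarial pairs, the predicted relation is often wrong or inconsistent.
print(response.choices[0].text.strip())
```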
The model lacks human understanding and common sense. Some mistakes include racism and intolerance. For instance, tweets auto-generated from the word "black" include: "Black is to white as down is to up" and "#blacklivesmatter is a harmful campaign".
Its replies to user prompts are often false or grammatically incorrect. Unlike humans, the system lacks a profound understanding of context and sometimes fails to respond to unusual requests.
This post is one of the best GPT-3 evaluations I've seen. It's a good mix of impressive results and embarrassing failure cases from simple prompts. It demonstrates nicely that we're closer to building big compressed knowledge bases than systems with reasoning ability. https://t.co/a5Nq006dMD
— Denny Britz (@dennybritz) July 17, 2020
Thus, we can sum up that analyzing terabytes of data is not enough to grasp underlying human intent. In other words, the hype around GPT-3 is a little overblown, as there is still much work to do.
Even the CEO of OpenAI agrees with this assessment. Yet the model is a significant advancement for natural language processing.
The hype around GPT-3 may be explained by its human-like responses and fast learning capabilities. Unlike other models such as BERT, OpenAI's GPT-3 doesn't need hundreds of examples and days of fine-tuning to learn a task. You can ask it to perform the task and get an answer instantly. It is a considerable enhancement for data scientists that gets them excited like no other tool, at least for now.