GPT-3 has been one of the highlights of the AI space in 2021. Until last week, access to it was restricted and people and companies had to join a waitlist. Not anymore: on November 18th, OpenAI announced that it is fully available to everyone. One of the main concerns about GPT-3 has been safety: how the technology could be used in applications like question answering, and how to filter certain content. OpenAI has made progress on safeguards, which made General Availability to the public possible. But let's step back and tell the full story.
What is GPT-3?
Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model that uses deep learning to produce text. It was built by OpenAI, an organization founded in San Francisco in 2015 by Elon Musk among others. GPT-3 is the third generation of the model in the GPT-n series, and it was trained with 175 billion parameters, bigger than any language model pre-trained before it. That scale gives GPT-3 the ability to perform tasks with surprisingly high-quality results. This kind of R&D and deep learning training is not realistically possible for many companies due to its complexity and cost. It all started with a paper OpenAI published in 2018 titled Improving Language Understanding by Generative Pre-Training.
To put it simply, GPT-3 generates text the way a human would. That simple capability can be applied to many different use cases across all industries.
Use Cases
When OpenAI announced the beta of the API, it quickly became widely popular. The fact that access was restricted also created additional hype on social media, as more and more people couldn't wait to try it.
People with access started sharing amazing examples of GPT-3 creating articles, social media content, poetry, stories, text summaries, and dialogs, writing complex SQL queries, and more.
Then startups started to appear, building entire fast-growing businesses on top of this API. A great example is Copy.ai, whose founders built a tool to help copywriters and grew it to 150k users, $1M ARR, and a $2.9M funding round in just six months. We’ve also seen some exits already, like Headlime, which was quickly acquired by Jarvis.ai after a short but very successful ride.
Maybe one of the most mind-blowing applications built on GPT-3 is Codex, a system that translates natural language into code. It essentially provides a way to produce data science code without writing any code yourself: describe what you want, and Codex will write it for you!
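As an illustration, here is a minimal sketch of what calling Codex could look like with the openai Python package available at the time of writing. The engine name davinci-codex and the prompt are assumptions for illustration only; Codex access was still in beta and the details may differ for your account.

```python
import openai

openai.api_key = "YOUR_API_KEY"  # replace with the key from your OpenAI account

# Describe the code you want in plain English; Codex completes it.
prompt = (
    "# Python 3\n"
    "# Load 'sales.csv' with pandas and print the mean of the 'revenue' column\n"
)

response = openai.Completion.create(
    engine="davinci-codex",  # assumed Codex engine name; requires beta access
    prompt=prompt,
    max_tokens=100,
    temperature=0,           # deterministic output tends to work well for code
)

print(response["choices"][0]["text"])
```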
How to use GPT-3?
One of the most exciting features of GPT-3 is its readiness. You don’t need to train it; it is already pre-trained. You don’t need to label data or feed it thousands of records for it to perform well: with just a few examples in the prompt, GPT-3 already provides a good solution.
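To make the few-shot idea concrete, below is a minimal sketch assuming the openai Python package and a standard completion engine such as davinci; the prompt, labels, and parameters are illustrative, not a definitive recipe.

```python
import openai

openai.api_key = "YOUR_API_KEY"

# A few labeled examples in the prompt are often enough to steer GPT-3.
prompt = (
    "Classify the sentiment of each review as Positive or Negative.\n\n"
    "Review: The product arrived on time and works perfectly.\n"
    "Sentiment: Positive\n\n"
    "Review: Terrible support, I want my money back.\n"
    "Sentiment: Negative\n\n"
    "Review: I absolutely love the new design!\n"
    "Sentiment:"
)

response = openai.Completion.create(
    engine="davinci",   # assumed engine name; use whichever model you have access to
    prompt=prompt,
    max_tokens=1,
    temperature=0,
    stop=["\n"],
)

print(response["choices"][0]["text"].strip())  # expected output: "Positive"
```

Note that nothing here was fine-tuned: the two labeled reviews in the prompt are the only "training data" the model sees.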
We will be publishing a full tutorial on GPT-3, but for now the best way to get started with the API is to create a free account: simply go to Openai.com and sign up.
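As a quick sanity check after signing up, the sketch below assumes you have installed the openai Python package and exported your API key as the OPENAI_API_KEY environment variable; it simply lists the engines your account can reach.

```python
# pip install openai
import os
import openai

# Read the key generated in your OpenAI account dashboard.
openai.api_key = os.environ["OPENAI_API_KEY"]

# List the engines available to your account to confirm the key works.
engines = openai.Engine.list()
for engine in engines["data"]:
    print(engine["id"])
```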