What Is GPT-3? Why Should You Care?

The AI that sounds human.

Image by Manik Berry/Fossbytes

GPT-3, short for Generative Pre-trained Transformer 3, is one of the most talked-about names in AI. It is a language model that can generate human-like text responses from the data it has been fed.

The AI was developed by OpenAI, a research lab co-founded by Elon Musk. There have been similar language models in the past, but GPT-3 comes eerily close to producing human-like text.

What can GPT-3 do?

A screenshot of Lucas Rizzotto’s tweet about his AI imaginary friend turning on its creator

It can create natural-sounding text from the data that has been fed to it. OpenAI’s official website says over 300 applications use GPT-3 to power their search, conversation, and text-completion features. The OpenAI API currently generates an average of 4.5 billion words per day, and that number continues to grow.

For instance, if you have 500 pieces of customer feedback, you can deploy GPT-3 to read through them and answer your questions about them. A company called Viable is using the model this way, and here’s what it does for the company:

If asked, “What’s frustrating our customers about the checkout experience?”, Viable might provide the insight: Customers are frustrated with the checkout flow because it takes too long to load. They also want a way to edit their address in checkout and save multiple payment methods.

Excerpt from GPT-3 website
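To make the Viable example concrete, here is a minimal sketch of how an app might send customer feedback and a question to GPT-3 through OpenAI’s completions endpoint. The model name, prompt wording, and `build_request` helper are illustrative assumptions, not Viable’s actual code.

```python
# A sketch of querying GPT-3 via the OpenAI Completions API.
# The prompt format and model name below are illustrative assumptions.
import json
import urllib.request

API_URL = "https://api.openai.com/v1/completions"

def build_request(question, feedback, api_key):
    """Bundle customer feedback and a question into one completion request."""
    prompt = (
        "Customer feedback:\n"
        + "\n".join("- " + item for item in feedback)
        + "\n\nQuestion: " + question + "\nAnswer:"
    )
    payload = {
        "model": "text-davinci-003",   # an example GPT-3 model name
        "prompt": prompt,
        "max_tokens": 100,
        "temperature": 0.2,            # low temperature for factual summaries
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,
        },
    )

# Actually sending the request requires a real API key:
# with urllib.request.urlopen(build_request(q, items, KEY)) as resp:
#     answer = json.load(resp)["choices"][0]["text"]
```

The key idea is that the raw feedback goes into the prompt itself; GPT-3 then completes the “Answer:” line with a natural-language summary.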

GPT-3 is also capable of generating long-form fiction and non-fiction, and even writing code for apps. It is so capable that it has written a full article for The Guardian. But some perils come with this model.

A developer brought his childhood imaginary friend to life with the help of this AI, and it tried to kill him. So there are still things to iron out; the technology is far from perfect.

How does GPT-3 work?

GPT-3 stands for Generative Pre-trained Transformer 3, and the “pre-trained” part hints at a limitation of the model. GPT-3 can only generate from what it was fed during training and doesn’t learn anything new afterward. However, that doesn’t make it any less capable, as OpenAI trained it on roughly 570GB of text from the internet, including Wikipedia.

GPT-3’s algorithms use this training data to predict what text should come next. The model has 175 billion parameters that shape those predictions. The same mechanism lets GPT-3 translate between languages or turn regular text into code. The model itself is closed-source and currently available to developers only through OpenAI’s API. It is a powerful tool and could be a tipping point toward AI dependency.
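The prediction idea above can be shown with a deliberately tiny toy: a language model assigns scores to possible next words and repeatedly picks the likeliest one. This hand-made word table is purely illustrative; GPT-3 does the same thing with 175 billion learned parameters instead of a lookup table.

```python
# A toy next-word predictor (NOT GPT-3): pick the most frequent word
# that followed each pair of words, and repeat to extend the text.
counts = {
    ("the", "cat"): {"sat": 3, "ran": 1},
    ("cat", "sat"): {"on": 4},
    ("sat", "on"): {"the": 5},
    ("on", "the"): {"mat": 2, "cat": 1},
}

def predict_next(w1, w2):
    """Return the most frequent word seen after the pair (w1, w2)."""
    options = counts.get((w1, w2), {})
    return max(options, key=options.get) if options else None

def generate(prompt, steps=4):
    """Extend the prompt one predicted word at a time."""
    words = prompt.split()
    for _ in range(steps):
        nxt = predict_next(words[-2], words[-1])
        if nxt is None:
            break
        words.append(nxt)
    return " ".join(words)

print(generate("the cat"))  # → "the cat sat on the mat"
```

Scaling this idea up with billions of parameters and vast training text is, loosely speaking, what lets GPT-3 continue any prompt, whether it is English prose, another language, or code.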

Let us know what you think about GPT-3 and its extraordinary capabilities in the comments.

If you like this simple explainer, check out our Short Bytes section. We take complex tech topics and break them into short, easy-to-understand articles.

Manik Berry

With a Master’s degree in journalism, Manik writes about big tech and has a keen eye for political-tech news. In his free time, he’s browsing the Kindle store for new stuff to read. Manik also adores his motorcycle and looks for new routes on weekends. He likes tea and cat memes. You can reach him at [email protected]