GPT-3: New Applications Developed for OpenAI’s NLP Model

OpenAI’s natural language processing (NLP) model GPT-3 offers 175 billion parameters, compared with its predecessor GPT-2’s mere 1.5 billion. That immense size enables GPT-3 to generate human-like text from only a few examples of a task. Now that many users have gained access to the API, some interesting use cases and applications have emerged. But the ecosystem is still nascent, and it remains to be seen how it matures, or whether it is superseded by another NLP model.

VentureBeat reports that the two approaches for pre-training an NLP model are generalized and ungeneralized. The latter approach “has specific pretraining objectives that are aligned with a known use case,” essentially going “deep in a smaller, more focused data set.” Google’s PEGASUS model, which was pre-trained specifically for text summarization, is an example of this.
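For readers who want a feel for the task-specific approach, here is a minimal sketch using the Hugging Face transformers library with Google’s publicly released “google/pegasus-xsum” checkpoint. The library choice, checkpoint name, and sample text are illustrative assumptions, not details from the article:

```python
# Minimal sketch: task-specific summarization with Google's PEGASUS
# via the Hugging Face transformers library. Assumes the publicly
# released "google/pegasus-xsum" checkpoint.
from transformers import pipeline

summarizer = pipeline("summarization", model="google/pegasus-xsum")

article = (
    "OpenAI's GPT-3 model offers 175 billion parameters, compared with "
    "its predecessor GPT-2's 1.5 billion, enabling it to generate "
    "human-like text from only a few examples of a task."
)

# Because the model was pre-trained for summarization, no task
# examples need to be supplied in the input.
print(summarizer(article, max_length=40, min_length=10)[0]["summary_text"])
```

Note that there is no prompt engineering here: the model does exactly one thing, which is the trade-off the article describes.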

The benefit is that “the ungeneralized approach … can dramatically increase accuracy for specific tasks,” although it is “also significantly less flexible than a generalized model and still requires a lot of training examples before it can begin achieving accuracy.”

A generalized approach makes full use of GPT-3’s 175 billion parameters (also known as weighted connections between words), which allows it to “execute basically any NLP task with just a handful of examples, though its accuracy is not always ideal.” OpenAI admitted that GPT-3 has “notable weaknesses in text synthesis” but “has decided that going bigger is better when it comes to accuracy problems.”
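As an illustration of that generalized, few-shot style, the sketch below packs a handful of labeled examples into a prompt and sends it to the GPT-3 API using the 2021-era openai Python client’s Completion endpoint. The task, examples, and parameter values are assumptions for illustration, not anything from the article:

```python
# Minimal sketch of few-shot prompting against the GPT-3 API,
# using the 2021-era openai Python client (Completion endpoint,
# "davinci" engine). Task and parameters are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# A "handful of examples" of the task goes directly in the prompt;
# no fine-tuning or task-specific training is involved.
prompt = (
    "Classify the sentiment of each review as Positive or Negative.\n\n"
    "Review: The battery lasts all day.\nSentiment: Positive\n\n"
    "Review: It broke within a week.\nSentiment: Negative\n\n"
    "Review: Setup was quick and painless.\nSentiment:"
)

response = openai.Completion.create(
    engine="davinci",
    prompt=prompt,
    max_tokens=5,
    temperature=0,
    stop="\n",
)
print(response.choices[0].text.strip())  # expected: "Positive"
```

Swapping in a different task is just a matter of rewriting the prompt, which is the flexibility the generalized approach buys, at the cost of accuracy that “is not always ideal.”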

Google researchers recently showcased the Switch Transformer, an NLP model with 1.6 trillion parameters. The Switch Transformer and GPT-3 are the two largest generalized models; Microsoft’s Turing-NLG, by comparison, numbers a mere 17 billion parameters. But training models at this scale is expensive: “OpenAI spent almost $12 million to train GPT-3.”

VB notes that the question remains “whether GPT-3 will be the bedrock upon which an NLP application ecosystem will rest or if newer, stronger NLP models will knock it off its throne.” Startups drawn to GPT-3’s flexibility are exploring how to use it “to power the next generation of NLP applications.”

Cherry Ventures’ Alex Schmitt compiled a list of interesting GPT-3 products, many of which are “broadly consumer-facing such as the ‘Love Letter Generator,’ but there are also more technical applications such as the ‘HTML Generator.’”

The report adds that “a couple of the most promising early use cases are in healthcare, finance, and video meetings.” Given the huge amounts of data being generated, GPT-3 applications could serve a pressing need “for enterprises in healthcare, financial services, and insurance” to streamline research and summarize key findings.
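One way such an enterprise summarization workflow might look, sketched with the same GPT-3 Completion endpoint; the chunking scheme and the “tl;dr:” prompt pattern here are assumptions, not a documented method from the article:

```python
# Sketch: summarizing a long research document with GPT-3 by splitting
# it into chunks that fit the model's context window, then summarizing
# each chunk. Chunk size and prompt wording are illustrative.
import openai

def summarize_chunk(text):
    response = openai.Completion.create(
        engine="davinci",
        prompt=text + "\n\ntl;dr:",  # common pattern to elicit a summary
        max_tokens=80,
        temperature=0.3,
    )
    return response.choices[0].text.strip()

def summarize_document(document, chunk_chars=6000):
    # Naive fixed-width chunking; a production system would split on
    # section or paragraph boundaries instead.
    chunks = [document[i:i + chunk_chars]
              for i in range(0, len(document), chunk_chars)]
    return "\n".join(summarize_chunk(c) for c in chunks)
```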

“Applications built using GPT-3’s API are just starting to scratch the surface of possible use cases, and enterprises should be excited at the relative ease with which it’s becoming possible to create highly articulated NLP models,” it concludes.

Related:
OpenAI and Stanford Researchers Call for Urgent Action to Address Harms of Large Language Models Like GPT-3, VentureBeat, 2/9/21
