Beta Testers Give Thumbs Up to New OpenAI Text Generator

OpenAI’s Generative Pre-trained Transformer (GPT), a general-purpose language algorithm that uses machine learning to answer questions, translate text and write it predictively, is currently in its third version. GPT-3, first described in a research paper published in May, is now in a private beta with a select group of developers; the goal is to eventually launch it as a commercial, cloud-based subscription service. Its predecessor, GPT-2, released last year, was able to create convincing text in several styles.

MIT Technology Review reports that, according to beta testers, GPT-3 is an impressive improvement. As developer and artist Arram Sabeti said, “playing with GPT-3 feels like seeing the future.” Sabeti has used GPT-3 to write short stories, songs, press releases and technical manuals. Artist Mario Klingemann gave GPT-3 only a title, an author’s name and the first word, and it wrote a story in the style of humorist Jerome K. Jerome. GPT-3 even wrote a “reasonably informative article” about itself.

Others, like web developer Sharif Shameem, learned that GPT-3 can generate anything from guitar tabs to computer code. “The recent, almost accidental, discovery that GPT-3 can sort of write code does generate a slight shiver,” said 3D graphics pioneer John Carmack, who is consulting chief technology officer at Oculus VR.

GPT-3 is, however, “still prone to spewing hateful sexist and racist language,” as GPT-2 was until that model was tweaked. MIT Technology Review noted that GPT-3’s positive results are due to “excellent engineering, not genuine smarts … [and] even its successes have a lack of depth to them.” Its strength appears to be “synthesizing text it has found elsewhere on the Internet, making it a kind of vast, eclectic scrapbook created from millions and millions of snippets of text.”

OpenAI co-founder Sam Altman agreed that “the GPT-3 hype is way too much.” “It’s impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes,” he said. “AI is going to change the world, but GPT-3 is just a very early glimpse.”

SiliconANGLE reports that GPT-3 “works by analyzing a sequence of words, text or other data, then expanding on these examples to produce entirely original output in the form of an article or an image.” GPT-2 got a controversial reception, it adds, due to “its ability to create extremely realistic and coherent ‘fake news’ articles based on something as simple as an opening sentence.” For that reason, OpenAI did not make the algorithm publicly available.
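For a concrete sense of the prompt-then-continue mechanism SiliconANGLE describes, the short sketch below uses the openly available GPT-2 model via Hugging Face’s transformers library. It is an illustration only, not OpenAI’s own code: GPT-3 itself is reachable only through the private beta, so GPT-2 stands in here. The model repeatedly predicts the next token from the text so far, which is how a simple opening sentence can grow into a longer passage.

    # Illustrative sketch: GPT-3 is in a private beta, so this example uses
    # the publicly released GPT-2 model from Hugging Face's transformers.
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # An opening sentence serves as the prompt.
    prompt = "OpenAI's new language model is impressive because"
    input_ids = tokenizer.encode(prompt, return_tensors="pt")

    # The model extends the sequence one predicted token at a time,
    # sampling from the most likely continuations (nucleus sampling).
    output_ids = model.generate(
        input_ids,
        max_length=60,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    print(tokenizer.decode(output_ids[0], skip_special_tokens=True))

Using GPT-3 instead would amount to sending the same kind of prompt to OpenAI’s hosted service once the commercial API launches.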

Since GPT-3 has been in the hands of a select group of people, generated texts have been circulating. Founders Fund principal Delian Asparouhov fed GPT-3 half of two company documents and, in both cases, it generated “not just coherent, additional paragraphs of text, but also could follow the prior formatting in such a way as to make it almost indistinguishable from the original, human written text.”

With its 175 billion learning parameters, GPT-3 can “perform pretty much any task it’s assigned,” making it “an order of magnitude larger than the second-most powerful language model, Microsoft’s Turing-NLG algorithm, which has just 17 billion parameters.” SiliconANGLE reports that the “paid version” of GPT-3 “should be released in about two months.”

Related:
Did a Person Write This Headline, or a Machine?, Wired, 7/22/20
OpenAI’s GPT-3 May Be the Biggest Thing Since Bitcoin, Manuel Araoz, 7/18/20
GPT-3 Is Amazing — and Overhyped, Forbes, 7/18/20
