Can AI Write an Article or Complete an Image? Yes, Says OpenAI's GPT-3

Artificial intelligence is pushing into uncharted territory long ruled by the human mind: cognitive creativity.

The artificial intelligence (AI) research company OpenAI, backed by names like Peter Thiel, Elon Musk, Reid Hoffman, Marc Benioff, and Sam Altman, has released GPT-3, its third-generation language prediction model. GPT-3 is the largest language model created to date and is capable, in many cases, of generating text that is indistinguishable from human writing. OpenAI first described the technology in a research paper back in May.

But last week it began drip-feeding the software to selected people who requested access to a private beta. For now, OpenAI wants outside developers to help it explore what GPT-3 can do, but it plans to turn the tool into a commercial product later this year, offering businesses a paid-for subscription to the AI via the cloud.

The Power of Language Models

GPT-3 is the most powerful language model yet. Its predecessor, GPT-2, released last year, was already able to spit out convincing streams of text in a range of styles when prompted with an opening sentence. But GPT-3 is a big leap forward: the model has 175 billion parameters (the values that a neural network tries to optimize during training), compared with GPT-2's already vast 1.5 billion.
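To make "parameters" concrete: they are the learned weights of the network. A toy sketch below counts the parameters of a single fully connected layer; it is illustrative only and does not reflect GPT-3's actual transformer architecture.

```python
def dense_layer_params(n_in, n_out):
    """A fully connected layer learns one weight per input-output
    pair, plus one bias value per output unit."""
    return n_in * n_out + n_out

# A toy layer mapping 512 features to 512 features:
print(dense_layer_params(512, 512))  # 262656 parameters
```

GPT-3 stacks many attention and feed-forward blocks of this general kind until the total reaches roughly 175 billion learned values.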

GPT-3 has been shown to produce short stories, songs, press releases, and technical manuals. It can not only create stories but also write in the style of specific authors: given just a title, an author's name, and an opening word, it generates a passable imitation. GPT-3 can also produce other kinds of text, such as guitar tabs and computer code.
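The prompt-and-complete workflow described above can be sketched as an HTTP request to the beta's completions endpoint. The endpoint path, engine name, and parameter names here reflect the beta-era API and should be treated as assumptions rather than a definitive reference; `YOUR_KEY` is a placeholder for a real beta access key.

```python
import json
import urllib.request

# Beta-era completions endpoint for the "davinci" engine (assumption).
API_URL = "https://api.openai.com/v1/engines/davinci/completions"

def build_request(prompt, api_key, max_tokens=64, temperature=0.7):
    """Build an HTTP request asking the model to continue `prompt`."""
    payload = {
        "prompt": prompt,            # the opening text the model continues
        "max_tokens": max_tokens,    # cap on the length of the continuation
        "temperature": temperature,  # higher values give more varied text
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

req = build_request("Once upon a midnight dreary,", api_key="YOUR_KEY")
# urllib.request.urlopen(req) would return JSON containing a "choices"
# list, each entry holding the generated continuation under "text".
```

The actual network call is left out: it requires a valid key from the private beta, and the response format may differ from this sketch.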

Output inspired by Human Intelligence

It's also no surprise that many have been quick to start talking about intelligence. But GPT-3's human-like output and striking versatility are the results of excellent engineering, not genuine smarts. For one thing, the AI still makes ridiculous howlers that reveal a total lack of common sense. But even its successes have a lack of depth to them, reading more like cut-and-paste jobs than original compositions.

However, there are still concerns about the biased, sexist, or racist language GPT-3 can produce. Similar problems were already observed in the GPT-2 model, so this is not a brand-new issue.

GPT-3 is not intelligent and makes many mistakes a human would not, but the engineering is outstanding. The technology is extremely good at synthesizing text it has absorbed from the internet, stitching together millions of fragments into plausible prose.

For this reason, people like Sam Altman, co-founder of OpenAI with Elon Musk, were quick to temper some of the attention the technology was receiving. "The GPT-3 hype is way too much. It's impressive (thanks for the nice compliments!) but it still has serious weaknesses and sometimes makes very silly mistakes. AI is going to change the world, but GPT-3 is just a very early glimpse. We have a lot to still figure out," he tweeted on July 19.

GPT-3 is a massive step forward in artificial intelligence and language prediction technology, but there is still a long way to go before its ultimate success.

Analytics Insight
www.analyticsinsight.net