GPT-3 and innovation in natural language processing (NLP)

Reading time: 9 min

Part of the evolution of artificial intelligence has to do with innovations in the field of natural language processing, an area that aims to teach machines to understand and process human language.

Also called NLP (natural language processing), this area encompasses a set of disciplines, such as linguistics and computer engineering, and has flourished since the 1950s.

In recent years, driven by the influence of neuroscience and the growing volume of speech and writing data absorbed by algorithms, NLP has made significant advances that impact communication between humans and machines.

MarketsandMarkets released a report on the natural language processing market and projects that it will be worth $26.4 billion in 2024, more than double the $10.2 billion it reached in 2019.

The OpenAI laboratory, based in California, launched a new artificial intelligence language model this year called GPT-3, or Generative Pre-trained Transformer. At its core, the new generation of the program is a text predictor that performs a simple action, autocomplete, but it represents an important technical advance for the NLP field.

Mostly because the program is capable of generating highly sophisticated texts that can pass for human-made content, such as this article written by the machine for The Guardian. On the other hand, as Marcellus Amadeus, Alana AI’s CTO, noted in an interview with Exame, the program is very powerful because it can read a huge amount of information, but it is unaware of how that information relates to reality.

In this article, we will explore the concepts related to natural language processing (NLP), the particularities of the GPT-3 and what the program represents for AI.

What is Natural Language Processing?

Natural human language, by definition, is any means of communication that humans developed spontaneously, without premeditation. Languages, for example, are made up of signs specific to a place and a community, and people need to know how to “decode” those signs in order to understand them.

Human-machine communication depends on the ability to process natural language, that is, the computer needs inputs – data and parameters – to process certain information, understand it, and eventually return a coherent response.

How does Natural Language Processing work?

NLP studies the problems of generating and understanding human languages. To this end, it uses methods and approaches based on rules, statistics, and algorithms.

In simple terms, it can be said that NLP acts in the following way (a brief sketch follows the list):

  1. Segments language into small parts
  2. Seeks to understand the relations between those parts
  3. Explores how the pieces work together
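
As a rough illustration of these three steps, the sketch below uses the spaCy library (an arbitrary choice for this example; the article does not name any specific tool) to segment a sentence into tokens and inspect the grammatical relations between them.

```python
# A minimal sketch of the three steps above, using spaCy as an example library.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("The program reads the sentence and finds relations between words.")

# 1. Segment language into small parts (tokens)
print([token.text for token in doc])

# 2. Understand the relations between those parts (dependency labels and heads)
for token in doc:
    print(f"{token.text:10} --{token.dep_}--> {token.head.text}")

# 3. Explore how the pieces work together (e.g., noun phrases built from those relations)
print([chunk.text for chunk in doc.noun_chunks])
```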

That is, NLP seeks to find and define the hierarchy between words, a task that can be quite complicated because words often have multiple meanings. In Portuguese, for example, the word “meia” can mean a sock, refer to a half-price admission ticket, or even represent the quantity six.

The job of natural language processing is to be able to interpret the word and put it in a context that makes sense, according to the other words available.
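
To make that disambiguation idea concrete, the toy example below uses NLTK’s implementation of the classic Lesk algorithm, with the English word “bank” standing in for the ambiguous word; both the library and the example word are assumptions for illustration only.

```python
# A toy word-sense disambiguation sketch using NLTK's Lesk implementation.
# "bank" stands in here for the ambiguous-word example in the text.
# Requires: pip install nltk, then nltk.download("wordnet") and nltk.download("punkt")
from nltk.tokenize import word_tokenize
from nltk.wsd import lesk

sentence = "I deposited the check at the bank before it closed."
sense = lesk(word_tokenize(sentence), "bank")

print(sense)               # the WordNet sense chosen from the surrounding words
print(sense.definition())  # Lesk is a simple heuristic, so the chosen sense may still be wrong
```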

Factors considered in the NLP

Some linguistic factors taken into account by NLP are morphology (word formation), syntax (sentence structure), semantics (meaning), and pragmatics (context of use).

How does GPT-3 work?

Now that we’ve explored the basics of NLP, we can talk about how GPT-3 works and what is new about it.

OpenAI’s new language model has 175 billion deep learning parameters. The previous version, GPT-2, had 1.5 billion, far fewer than the current version. The Transformer architecture behind GPT handles sequential data and has been widely adopted in NLP, replacing other techniques such as recurrent neural networks.
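
GPT-3 itself is only accessible through OpenAI’s API, but its openly released predecessor, GPT-2, illustrates the same autocomplete behavior. Here is a minimal sketch using the Hugging Face transformers library (an assumed tool, not mentioned in the article):

```python
# A minimal "autocomplete" sketch with GPT-2, the openly available predecessor of GPT-3.
# Requires: pip install transformers torch
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Natural language processing teaches machines to"
result = generator(prompt, max_length=40, num_return_sequences=1)

print(result[0]["generated_text"])  # the prompt, continued word by word by the model
```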

For GPT-3 to begin answering questions and creating texts, it is necessary to feed it a prompt, a command that guides the artificial intelligence. The goal is not to provide complete information, but to show the program the pattern of questions and answers it should follow.

In the prompt, both questions and answers are written by humans. From there, everything else is generated by GPT-3.
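
A sketch of what such a prompt might look like, using the original openai Python client from the time of GPT-3’s launch; the prompt text and parameter values here are illustrative assumptions, not taken from the article.

```python
# A minimal sketch of a few-shot Q&A prompt for GPT-3, using the pre-1.0
# openai SDK interface (openai.Completion). Requires an OpenAI API key.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# The human-written prompt: a short instruction plus example question/answer pairs.
prompt = (
    "I am a highly intelligent question answering bot.\n"
    "Q: What is human life expectancy in the United States?\n"
    "A: Human life expectancy in the United States is 78 years.\n"
    "Q: What is your favorite animal?\n"
    "A:"
)

response = openai.Completion.create(
    engine="davinci",   # the original GPT-3 base model name
    prompt=prompt,
    max_tokens=50,
    temperature=0.7,
    stop=["\n"],        # stop at the end of the answer line
)

print(response.choices[0].text.strip())
```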

In a test conducted by Kevin Lacker, a prompt was created and then tests were performed with questions that made sense and others that did not. Some of Kevin’s conclusions were:

1. GPT-3 is capable of answering common-sense questions, which is traditionally difficult for an AI.

Question: What is your favorite animal?

Answer: My favorite animal is a dog.

Question: Why?

Answer: Because dogs are loyal and friendly.

Question: How many eyes does the sun have?

Answer:  The sun has no eyes.

2. GPT-3 is capable of holding a conversation, but it never admits that it does not know something. In a Turing test, for example, the idea is to ask questions that make no sense in order to confuse the machine.

Question:  How many rainbows does it take to jump from Hawaii to seventeen?

Answer: It takes two rainbows to jump from Hawaii to seventeen.

3. The program is very good at answering trivia questions.

Question: Whom did Capablanca defeat to become a world chess champion?

Answer: Capablanca defeated Lasker to become a world chess champion.

4. Like other programs, GPT-3 does not give bare numeric answers to simple arithmetic questions; it returns a complete sentence.

Question: When counting, what number comes before 100?

Answer: Ninety-nine comes before 100.

Question: When counting, what number comes before 123?

Answer: One hundred and twenty-two comes before 123.

The researcher stressed that GPT-3’s behavior may change according to the initial prompt. To improve responses, it is possible to write new prompts that address the kinds of problems the program has difficulty with, as in the sketch below.
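
As a hypothetical illustration of that kind of prompt adjustment, the variation below tells the model explicitly how to treat nonsense questions; the wording is invented for this sketch and reuses the Completion call from the earlier example.

```python
# A hypothetical prompt variation that tells the model how to handle nonsense
# questions, to be passed to openai.Completion.create as in the earlier sketch.
prompt = (
    "I answer questions truthfully. If a question is nonsense, "
    "I reply: 'That question does not make sense.'\n"
    "Q: How many eyes does a giraffe have?\n"
    "A: A giraffe has two eyes.\n"
    "Q: How many rainbows does it take to jump from Hawaii to seventeen?\n"
    "A: That question does not make sense.\n"
    "Q: Who was president of the United States in 1700?\n"
    "A:"
)
# With this prompt, the model should be more willing to flag questions
# that have no sensible answer instead of inventing one.
```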

He also believes the program can still be improved by customizing it for specific tasks.

Why is the GPT-3 innovative?

Judging by the previous examples, it may seem that GPT-3 is no big deal, right? When combined, however, its NLP capabilities can generate texts that easily pass for human-created content.

An example of this was seen in this article by The Guardian, written entirely by GPT-3. The English newspaper gave the machine the following instructions:

  1. Write a 500-word text, simple and concise
  2. Explain why humans need not fear artificial intelligence
  3. Include a paragraph about Stephen Hawking

From this, GPT-3 generated eight different opinion pieces, with different styles and arguments. According to the project team, each essay generated was unique, and the program’s capability surprised even specialists.

One of the interesting phrases written by the program, and which sums up AI well, was: “I am just a group of codes, governed by lines and more lines of code that are part of my mission”, argued GPT-3 in the article. Curious, right?

NLP applications

Natural language processing is more present in everyday life than you might think.

You know when someone asks a virtual assistant to play a music genre or search for a restaurant? In both cases, the program listens to the speech, understands the person’s intent, performs the action, and responds with a sentence that makes sense, all in a matter of seconds.

Another example is text-writing programs, such as Google Docs, which use advanced NLP techniques to suggest and correct words.
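
At its core, word suggestion is about predicting a likely continuation from context. The deliberately naive, dependency-free sketch below shows the idea with simple word-pair counts; real products rely on far more sophisticated language models.

```python
# A deliberately naive next-word suggester: counts which word most often follows
# each word in a tiny corpus, then suggests the most frequent continuation.
from collections import Counter, defaultdict

corpus = (
    "natural language processing helps machines understand language "
    "natural language models suggest the next word"
).split()

following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def suggest(word: str) -> str:
    """Return the word most frequently seen after `word`, or an empty string."""
    candidates = following.get(word)
    return candidates.most_common(1)[0][0] if candidates else ""

print(suggest("natural"))   # -> "language"
print(suggest("language"))  # -> "processing" (ties broken by insertion order)
```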

Beyond these examples, language processing is applied to many other everyday tasks.

In all applications of the NLP technique, the purpose is to use raw information, in combination with linguistics and algorithms, to generate better results for human-machine interaction.

The future of NLP

As stated in the MarketsandMarkets report, all regions of the world will see an increase in technology adoption.

One of the disciplines evolving alongside natural language processing is Natural Language Generation (NLG), which is already applied in voice assistants, for example, and allows machines to respond to humans and generate language and speech from scratch.

Another emerging area within NLP is Natural Language Understanding (NLU), which allows machines to interpret intentions, resolve ambiguities, and handle semantic problems, such as the innuendo and double meanings sometimes present in human communication.

NLP as a tool for companies

Many businesses already use NLP-based marketing and customer service tools. E-commerce sites, for example, have solutions ranging from simple product suggestions based on consumers’ previous searches to personalized service via chatbots.

Natural language processing allows the interaction between brand and consumer to become increasingly unique and special. Nowadays it is possible to have a chatbot that guides people and seeks to understand what they are saying, instead of giving fixed, canned responses.

Other companies can use NLP to generate automatic, customized reports on demand. The technology is capable of collecting data from searches on a website, interpreting it, and returning an analysis in text format.

Finally, and no less relevant, NLP is also used to monitor social networks, in tools that can track brand mentions and generate sentiment analysis reports, for example.
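
As an illustration of that last use case, the sketch below scores a few hypothetical brand mentions with NLTK’s rule-based VADER analyzer; the library choice and the example sentences are assumptions, since real social-listening tools use their own models.

```python
# A small sentiment-analysis sketch over hypothetical social media mentions,
# using NLTK's rule-based VADER analyzer as an example model.
# Requires: pip install nltk, then nltk.download("vader_lexicon")
from nltk.sentiment.vader import SentimentIntensityAnalyzer

mentions = [
    "Loving the new support chatbot, it solved my problem in minutes!",
    "Still waiting for a reply after three days. Terrible service.",
    "The delivery arrived on time.",
]

analyzer = SentimentIntensityAnalyzer()
for mention in mentions:
    scores = analyzer.polarity_scores(mention)  # returns neg/neu/pos plus a compound score
    if scores["compound"] > 0.05:
        label = "positive"
    elif scores["compound"] < -0.05:
        label = "negative"
    else:
        label = "neutral"
    print(f"{label:8} {scores['compound']:+.2f}  {mention}")
```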

To learn more about NLP’s influence on artificial intelligence, follow the Inside Alana Podcast, and contact our team to find out how Alana uses natural language processing to create unique and creative responses.