It’s tempting to think that GPT-3 will solve all NLP problems, but it does not

Author: Ajit Jaokar

In my previous blog, “What is driving the innovation in NLP and GPT3”, I talked about how GPT-3 evolved from the basic transformer architecture.

Based on that blog, a start-up approached me saying that they had an idea which they felt could only be implemented with GPT-3.

They were eagerly waiting for their GPT-3 API access to be approved (isn’t everybody, he he!)

Apart from waiting for GPT-3, there was another critical flaw in their argument.

Their idea was not generative, i.e. it did not need GPT-3 in the first place (or, for that matter, any similar architecture).

It’s tempting to think that GPT-3 will solve all NLP problems, but it does not.

Let me explain what I mean by this.

Below is the basic flow of NLP services, along with a listing of NLP applications.

NLP services include:

  • Text Summarization
  • Text Generation
  • Chatbots
  • Machine Translation
  • Text to Speech
  • Text Classification
  • Sentence Similarity

 

Image source – Dr Amita Kapoor

While many of these are generative, not all of them are.

GPT-3 and similar transformer-based applications primarily address the generative elements of NLP.
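To make the distinction concrete, here is a minimal sketch of a generative task in Python using the Hugging Face transformers library (an assumption on my part; and since GPT-3 itself is only available through OpenAI’s gated API, the openly available GPT-2 stands in for it here):

    # A minimal sketch of a generative NLP task, assuming the Hugging Face
    # 'transformers' library is installed. GPT-2 stands in for GPT-3, since
    # GPT-3 is only reachable through OpenAI's API.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2")

    # The model continues the prompt, i.e. it generates new text.
    result = generator("The transformer architecture has", max_length=30)
    print(result[0]["generated_text"])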

That still leaves a large number of NLP applications which are not generative (for example, text classification or extractive text summarization).
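By contrast, here is a sketch of a non-generative task, text classification, using the same (assumed) library. The model only assigns a label to existing text; nothing is generated, which is why GPT-3 adds little here:

    # A minimal sketch of a non-generative NLP task: text classification.
    # Again assuming Hugging Face 'transformers'; the sentiment-analysis
    # pipeline uses an encoder model that assigns a label to the input.
    from transformers import pipeline

    classifier = pipeline("sentiment-analysis")

    result = classifier("GPT-3 is impressive, but it is not a silver bullet.")
    print(result)  # e.g. [{'label': 'NEGATIVE', 'score': ...}] -- output varies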

 

You can also look at the same situation from the perspective of word embeddings. Word embeddings represent words as vectors, such that words with similar meanings end up with similar vectors.
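As a toy illustration of this idea, word vectors are typically compared with cosine similarity. The three-dimensional vectors below are invented purely for the example; real embeddings have hundreds of dimensions:

    import numpy as np

    # Hypothetical 3-dimensional embeddings, made up for illustration only;
    # real word2vec/GloVe vectors typically have 100-300 dimensions.
    embeddings = {
        "king":  np.array([0.90, 0.80, 0.10]),
        "queen": np.array([0.85, 0.82, 0.15]),
        "apple": np.array([0.10, 0.20, 0.90]),
    }

    def cosine_similarity(a, b):
        # 1.0 means the vectors point the same way; near 0 means unrelated
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low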

Historically, word2vec and GloVe have worked well for word embeddings, but these are shallow approaches: each word gets a single, static vector regardless of context. Transformers address this by producing contextual embeddings, and they offer a functionality similar to transfer learning for CNNs: if you start from a pre-trained model, not all layers need to be trained.
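A rough sketch of that transfer-learning analogy, again assuming the Hugging Face transformers library (bert-base-uncased is used here purely as a common example of a pre-trained checkpoint): the pre-trained layers are frozen and reused as a feature extractor, much like reusing a pre-trained CNN backbone:

    import torch
    from transformers import AutoModel, AutoTokenizer

    # Load a pre-trained transformer encoder.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # Freeze all pre-trained layers -- the transfer-learning step: we reuse
    # the learned representations instead of training from scratch.
    for param in model.parameters():
        param.requires_grad = False

    inputs = tokenizer("The bank raised interest rates.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Contextual embeddings: one vector per token, dependent on the sentence,
    # unlike the single static vector per word in word2vec/GloVe.
    print(outputs.last_hidden_state.shape)  # (1, num_tokens, 768)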

 

To conclude

GPT-3 is very interesting and will continue to be so.

However, there will always be a subset of NLP applications that GPT-3 and similar generative, transformer-based approaches do not cover, simply because those applications are not generative.

 
