GPT-3 has made years of NLP research irrelevant; nobody is using the pre-2020 approaches anymore. That made many researchers' and ML engineers' hard-earned skills obsolete. We have to start over from scratch, and this time really is different. We have clearly passed the knee of the curve for NLP. I remember around 2018 talking with my teammates about CV and NLP, saying that all the cool papers were in CV and NLP was five years behind. Look at it now.
The field of Natural Language Processing (NLP) has seen significant advancements in recent years. Previously, the standard approach was supervised learning: a large labeled dataset with a few classes, one model per task. Those techniques have been displaced by unsupervised pretraining methods, which demand so much compute and data that they are difficult to reproduce outside large labs. In addition, the focus of NLP research has shifted from designing new network architectures for specific tasks to improving the efficiency and scalability of existing models. As a result, the field has become more about engineering and less about inventing new architectures, which leads some researchers to view it as less exciting than it used to be.
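For contrast, here is a minimal sketch of that old recipe: a task-specific classifier trained end to end on its own labeled data. The texts, labels, and scikit-learn pipeline are illustrative placeholders, not taken from any particular paper.

```python
# Pre-2020 recipe: one supervised model per task, trained from scratch
# on a labeled dataset with a handful of classes.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled data; in practice, thousands of examples.
texts = ["great movie, loved it", "terrible plot, awful acting"]
labels = ["positive", "negative"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)  # every new task means collecting labels and retraining
print(model.predict(["what a wonderful film"]))
```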
LSTMs are out, large supervised datasets with a few classes per task are out, architectural hacking is out. Now we've got prompting.
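And a hedged sketch of what that prompting workflow looks like, using Hugging Face's transformers with GPT-2 standing in for a much larger model like GPT-3. The few-shot prompt and the sentiment labels are made up for illustration.

```python
# Post-2020 recipe: the same pretrained model handles the task with no
# gradient updates, steered only by examples written into the prompt.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "Review: great movie, loved it\nSentiment: positive\n"
    "Review: terrible plot, awful acting\nSentiment: negative\n"
    "Review: what a wonderful film\nSentiment:"
)
out = generator(prompt, max_new_tokens=2, do_sample=False)
print(out[0]["generated_text"])  # no training loop, just a prompt
```

Swapping tasks here means editing a string, not collecting a new labeled dataset, which is exactly why the old skill set feels obsolete.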