GPT-2 vs GPT-3: The OpenAI Showdown

The Generative Pre-trained Transformer (GPT) is an innovation in the Natural Language Processing (NLP) space developed by OpenAI. These models are among the most advanced of their kind and can even be dangerous in the wrong hands. GPT is an unsupervised generative model, meaning it is trained on unlabelled data: given an input such as a sentence, it tries to generate an appropriate response.
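
To make the idea concrete, here is a minimal sketch of generating text with the publicly released GPT-2 model. The Hugging Face transformers library and the generation settings are illustrative assumptions on our part, not something the article prescribes:

```python
# A minimal sketch: text generation with the public GPT-2 model via the
# Hugging Face `transformers` library. The library choice and parameters
# are illustrative assumptions; the article does not prescribe a toolkit.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "The Generative Pre-trained Transformer is"
outputs = generator(prompt, max_new_tokens=40, do_sample=True,
                    num_return_sequences=1)

# The model continues the prompt token by token, sampling from the
# distribution it learned from unlabelled web text.
print(outputs[0]["generated_text"])
```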


Getting started with 5 essential Natural Language Processing libraries

This article is an overview of how to get started with 5 popular Python NLP libraries, from linguistic data visualization to data preprocessing, multi-task functionality, state-of-the-art language modeling, and beyond.
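
As a taste of what such libraries handle, here is a short preprocessing sketch with spaCy. spaCy is one plausible pick; the teaser above does not say which five libraries the article covers:

```python
# A short sketch of basic NLP preprocessing with spaCy. spaCy is one
# plausible library choice; the teaser does not name the five covered.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("OpenAI released GPT-3 in 2020, and it has 175 billion parameters.")

# Tokenization, part-of-speech tags, and lemmas in a few lines.
for token in doc:
    print(token.text, token.pos_, token.lemma_)

# Named entities recognized in the same pass.
for ent in doc.ents:
    print(ent.text, ent.label_)
```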


Vision Transformers: Natural Language Processing (NLP) increases efficiency and model generality

Why do we hear so little about transformer models applied to computer vision tasks? What about attention in computer vision networks? Transformers are for Natural Language Processing (NLP), right?
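
For intuition, the core Vision Transformer move is to cut an image into patches, embed each patch as a token, and run ordinary self-attention over the resulting sequence. Below is a minimal sketch of that idea; PyTorch and every size in it are illustrative assumptions, not the article's implementation:

```python
# A minimal sketch of the Vision Transformer idea: treat image patches as
# tokens and run ordinary self-attention over them. PyTorch and all sizes
# here are illustrative assumptions.
import torch
import torch.nn as nn

batch, channels, height, width = 2, 3, 32, 32
patch = 8                                           # 8x8 pixel patches
num_patches = (height // patch) * (width // patch)  # 16 tokens per image
embed_dim = 64

images = torch.randn(batch, channels, height, width)

# Cut each image into non-overlapping patches and flatten each patch.
patches = images.unfold(2, patch, patch).unfold(3, patch, patch)
patches = patches.permute(0, 2, 3, 1, 4, 5).reshape(batch, num_patches, -1)

# Linearly project each flattened patch to a token embedding.
to_token = nn.Linear(channels * patch * patch, embed_dim)
tokens = to_token(patches)            # (batch, num_patches, embed_dim)

# Standard multi-head self-attention over the patch tokens.
attention = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
attended, weights = attention(tokens, tokens, tokens)
print(attended.shape)                 # torch.Size([2, 16, 64])
```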


Six times bigger than GPT-3: Inside Google’s TRILLION parameter switch transformer model

OpenAI’s GPT-3 is, arguably, the most famous deep learning model created in the last few years. One of the things that impresses most about GPT-3 is its size. In some sense, GPT-3 is nothing but GPT-2 with many more parameters: at 175 billion parameters, it is over a hundred times bigger than its largest predecessor, the 1.5-billion-parameter GPT-2.
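
What lets the Switch Transformer reach a trillion-plus parameters is sparse, top-1 expert routing: a learned router sends each token to exactly one expert feed-forward network, so total parameter count grows with the number of experts while per-token compute stays roughly flat. Here is a minimal sketch of that routing idea; PyTorch, the toy sizes, and the omission of the load-balancing loss are all simplifications of ours, not Google’s implementation:

```python
# A minimal sketch of Switch-style top-1 expert routing: each token is sent
# to a single expert feed-forward network chosen by a learned router.
# PyTorch, toy sizes, and the missing load-balancing loss are all
# illustrative simplifications.
import torch
import torch.nn as nn

tokens, d_model, num_experts = 8, 16, 4
x = torch.randn(tokens, d_model)

router = nn.Linear(d_model, num_experts)   # learned routing weights
experts = nn.ModuleList(nn.Linear(d_model, d_model)
                        for _ in range(num_experts))

# Pick exactly one expert per token (top-1 routing).
gate_probs = router(x).softmax(dim=-1)          # (tokens, num_experts)
gate_vals, expert_ids = gate_probs.max(dim=-1)  # winning prob and index

output = torch.zeros_like(x)
for e, expert in enumerate(experts):
    mask = expert_ids == e                      # tokens routed to expert e
    if mask.any():
        # Scale by the gate value so the routing stays differentiable.
        output[mask] = gate_vals[mask, None] * expert(x[mask])

print(output.shape)                             # torch.Size([8, 16])
```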
