Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. GPT-2 was pre-trained on a dataset of 8 million web pages.
The model was released in stages during 2019: OpenAI initially withheld the full version over misuse concerns, published progressively larger variants while gathering feedback from researchers and developers, and released the complete 1.5-billion-parameter model in November 2019.