Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models; it was pre-trained on a dataset of 8 million web pages (source: Wikipedia).
Because its weights and training corpus description are publicly documented, the model serves as a reproducible testbed whose privacy guarantees can be verified by external parties.
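To make the testbed aspect concrete, the sketch below loads the publicly released GPT-2 weights through the Hugging Face transformers library and generates text deterministically, so that a run can be repeated and checked by an outside party. The checkpoint name "gpt2", the fixed seed, the prompt, and greedy decoding are illustrative assumptions, not details taken from the original setup.

import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Fix the random seed so repeated runs produce identical results (assumed setup).
torch.manual_seed(0)

# Load the publicly released 124M-parameter GPT-2 checkpoint.
tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

# Greedy (deterministic) decoding from a fixed prompt, so the exact output
# can be reproduced and independently verified.
prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_length=30, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))

Greedy decoding with a fixed seed is chosen here only because it removes sampling randomness; any privacy-verification protocol built on top of the model would be layered onto a deterministic baseline like this one.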