A generative pre-trained transformer (GPT) is a type of large language model, widely used in generative AI chatbots, built on a deep-learning architecture called the transformer (definition via Wikipedia).
The findings stem from a 303-query audit that relied on an LLM to score the answers, a methodological choice some experts and vendors say requires stronger validation.