GPT

Generative Pre-trained Transformer — a family of decoder-only Transformer language models that generate text autoregressively, predicting one token at a time conditioned on all preceding tokens.
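The next-token loop can be sketched in a few lines. This is a minimal illustration of greedy autoregressive decoding, the generation pattern GPT-style models use; the "model" here is a hypothetical toy bigram lookup table, not a real Transformer, so only the loop structure is meaningful.

```python
# Hypothetical toy "model": maps the last token to its most likely successor.
# A real GPT conditions on the full context and outputs a distribution
# over the whole vocabulary; this table stands in for that step.
BIGRAMS = {
    "<s>": "the",
    "the": "cat",
    "cat": "sat",
    "sat": "</s>",
}

def generate(max_tokens: int = 10) -> list[str]:
    """Greedy decoding: repeatedly predict the next token and append it."""
    tokens = ["<s>"]  # start-of-sequence token
    for _ in range(max_tokens):
        next_token = BIGRAMS.get(tokens[-1], "</s>")  # "predict" next token
        if next_token == "</s>":  # stop at end-of-sequence
            break
        tokens.append(next_token)
    return tokens[1:]  # drop the start token

print(generate())  # → ['the', 'cat', 'sat']
```

In a real model the lookup is replaced by a forward pass through the network, and sampling strategies (temperature, top-k, nucleus) often replace the greedy argmax shown here.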