How to Fine-Tune GPT-2 for Text Generation

Using GPT-2 to generate quality song lyrics

François St-Amant
Towards Data Science
6 min read · May 8, 2021


Source: https://unsplash.com/photos/gUK3lA3K7Yo

Natural Language Generation (NLG) has made incredible strides in recent years. In 2019, OpenAI released GPT-2, a large pretrained language model (up to 1.5B parameters) capable of generating text of human-like quality.

Generative Pretrained Transformer 2 (GPT-2) is, like the name says, based on the transformer architecture.
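Before getting into fine-tuning, here is a minimal sketch of what generating text with the pretrained model looks like. It assumes the Hugging Face transformers library and the publicly released "gpt2" checkpoint; the prompt string is only an illustration, not taken from the article's dataset.

```python
# Minimal sketch: load the pretrained GPT-2 checkpoint and sample text from it.
# Assumes the Hugging Face `transformers` package is installed (pip install transformers torch).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical prompt, chosen only for illustration.
prompt = "The sun goes down and"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample a continuation; top-k / top-p values here are common defaults, not the article's settings.
output = model.generate(
    input_ids,
    max_length=50,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Out of the box, the model produces generic English text; the rest of the article covers fine-tuning it on a lyrics dataset so that its outputs sound like songs.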
