Some pretty impressive machine-learning generated poetry courtesy of GPT-2

GPT-2 is OpenAI's language-generation model (last seen around these parts as a means of detecting machine-generated text); it's powerful and cool, and Gwern Branwen fed it the Project Gutenberg poetry corpus to see what kind of poetry it would write. Pretty good poetry, as it turns out. Scott Alexander (previously) does a good job …