Some pretty impressive machine-learning-generated poetry courtesy of GPT-2


GPT-2 is OpenAI's language-generation model (last seen around these parts as a means of detecting machine-generated text); it's powerful and cool, and Gwern Branwen fed it the Project Gutenberg poetry corpus to see what kind of poetry it would write.
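For the curious, here's a rough sketch of what that kind of finetuning looks like today. This isn't Gwern's actual pipeline (his run used OpenAI's original TensorFlow GPT-2 code); it's a hedged approximation using Hugging Face's transformers and datasets libraries, and it assumes the poetry corpus has been flattened into a plain-text file called poetry.txt.

    # Sketch: finetune GPT-2-small on a plain-text poetry corpus.
    # "poetry.txt" is an assumed filename, not Gwern's actual data file.
    from transformers import (
        GPT2LMHeadModel,
        GPT2TokenizerFast,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )
    from datasets import load_dataset

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")  # GPT-2-small (124M)
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Load the raw corpus and tokenize it.
    raw = load_dataset("text", data_files={"train": "poetry.txt"})
    tokenized = raw.map(
        lambda batch: tokenizer(batch["text"]),
        batched=True,
        remove_columns=["text"],
    )

    # Concatenate everything and slice into fixed-length training blocks.
    block_size = 512

    def group(batch):
        ids = sum(batch["input_ids"], [])
        total = (len(ids) // block_size) * block_size
        chunks = [ids[i : i + block_size] for i in range(0, total, block_size)]
        return {
            "input_ids": chunks,
            "attention_mask": [[1] * block_size] * len(chunks),
        }

    lm_dataset = tokenized.map(
        group, batched=True, remove_columns=tokenized["train"].column_names
    )

    trainer = Trainer(
        model=model,
        args=TrainingArguments(
            output_dir="gpt2-poetry",
            num_train_epochs=1,
            per_device_train_batch_size=2,
        ),
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
        train_dataset=lm_dataset["train"],
    )
    trainer.train()
    trainer.save_model("gpt2-poetry")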


Pretty good poetry, as it turns out.


Scott Alexander (previously) does a good job of highlighting its greatest hits (and misses).

GPT-2 poetry manages some impressive accomplishments, like solid iambic pentameter ("Thou know'st how Menoetiades the swift/Was dragged, of Hector and the fierce compeers/And Phrygian warriors. So, we will dispatch/Your bodies, then, yourselves to burn the ships/In sacrifice; with torches and with bells"), though it struggles with extended or complex rhyming ("If sick of sense; Wholubil, or snug. ills, we know our own our first in sense the worse maintained between the worse, soon expired").

Sometimes, it gets stuck (there's a poem that's just "The Emperor Wu (the great Wu), majestical," repeated 11 times, then a bunch of repetitions of "The Emperor Wu (the great Wu), rapacious," salted with the odd "majestical"), but some of it is actually gorgeous:

There are several kinds of people in America;
There are several kinds of people, I mean their number.
There's a girl growing up in the house by the light,
There's a youth upon the road, or a girl somewhere in New York;
There's a prettier girl, and a man more congenial,

But none of the likes of the likes of the fellows are equal.
There's one who has never been married and married,
There's one who don't want to be treated with kindness;
A fair youth is never employed nor neglected;
There's one who has never yet come to a neighbor,
And one who resides in New York from the start;

But none of the likes of the likes of the fellows
Are equal to him, and wherever he goes,
The heart somehow breaks under the hand that is steering;
And so it is with me

As with a lot of machine-learning applications, I suspect that this one would be best served as the basis for a centaur, with the model churning out tons of poetry and a human acting as a discerner to throw away the garbage and pluck out the happy accidents.
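That centaur workflow is easy to sketch: have the model spit out a pile of candidate poems and leave the reading and culling to a person. The snippet below is a minimal, hypothetical version of that idea, assuming a finetuned model saved under the name "gpt2-poetry" (as in the sketch above); the prompt is just the opening line of the poem quoted here.

    # Sketch of the "centaur" step: generate many candidates, let a human curate.
    # "gpt2-poetry" is the assumed path of a finetuned model, not a published checkpoint.
    from transformers import pipeline

    generator = pipeline("text-generation", model="gpt2-poetry")

    candidates = generator(
        "There are several kinds of people in America;",  # prompt from the sample above
        max_new_tokens=120,
        do_sample=True,          # sample rather than decode greedily, for variety
        temperature=1.0,
        top_k=40,
        num_return_sequences=25,
    )

    # Dump everything to a file; the human does the discerning.
    with open("candidates.txt", "w") as f:
        for i, c in enumerate(candidates):
            f.write(f"--- candidate {i} ---\n{c['generated_text']}\n\n")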

Finetuning the GPT-2-small Transformer for English Poetry Generation [Gwern Branwen/Gwern.net]