Some pretty impressive machine-learning-generated poetry courtesy of GPT-2

GPT-2 is OpenAI's language-generation model (last seen around these parts as a means of detecting machine-generated text); it's powerful and cool, and Gwern Branwen fed it the Project Gutenberg poetry corpus to see what kind of poetry it would write.
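
Branwen's write-up covers the fine-tuning itself; for a rough sense of what sampling verse from a GPT-2-style model looks like, here's a minimal sketch using the Hugging Face transformers library. The prompt and sampling settings are placeholders, not Branwen's actual setup.

```python
# Minimal sketch: sampling poetry-style text from a pretrained GPT-2 model.
# Assumes the Hugging Face `transformers` library; prompt and sampling
# parameters are illustrative, not Gwern Branwen's configuration.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Shall I compare thee"  # placeholder seed line
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; temperature and top_k control how adventurous the verse gets.
outputs = model.generate(
    **inputs,
    max_length=80,
    do_sample=True,
    top_k=50,
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```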

A machine-learning system that guesses whether text was produced by machine-learning systems

GLTR is a joint project of the MIT-IBM Watson AI Lab and Harvard NLP that analyzes a text and predicts whether it was generated by a machine-learning model.
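
The intuition behind detectors like GLTR is that machine-generated text tends to draw from a language model's highest-probability words more often than human writing does. Here's a rough sketch of that per-token rank check, assuming GPT-2 via Hugging Face transformers; the top-50 threshold is an illustrative choice, not GLTR's exact configuration.

```python
# Rough sketch of a per-token rank check: score each token by its rank in a
# language model's next-token distribution. Human text tends to include more
# low-ranked (surprising) tokens than machine-generated text does.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def token_ranks(text):
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(ids).logits  # shape: (1, seq_len, vocab_size)
    ranks = []
    for pos in range(ids.shape[1] - 1):
        next_id = ids[0, pos + 1]
        # Rank of the actual next token among the model's predictions here.
        order = torch.argsort(logits[0, pos], descending=True)
        rank = (order == next_id).nonzero(as_tuple=True)[0].item()
        ranks.append(rank)
    return ranks

ranks = token_ranks("The quick brown fox jumps over the lazy dog.")
top50 = sum(r < 50 for r in ranks) / len(ranks)
print(f"{top50:.0%} of tokens were in the model's top-50 predictions")
```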