I played this AI-generated text adventure as the Mandalorian and here's what happened

Nick Walton designed AI Dungeon 2 to achieve an ambitious goal:

Imagine an infinitely generated world that you could explore endlessly, continually finding entirely new content and adventures. What if you could also choose any action you can think of instead of being limited by the imagination of the developers who created the game?

The screencap above is from an adventure Nick found posted on Hacker News.

I decided to try a few games as The Mandalorian. For my first attempt I used the recommended fantasy setting, and chose to play as a knight named Mando. Here's what unfolded:

You are Mando, a knight living in the kingdom of Larion. You have a steel longsword and a wooden shield. You are on a quest to defeat the evil dragon of Larion. You've heard he lives up at the north of the kingdom. You set on the path to defeat him and walk into a dark forest. As you enter the forest you see a large black dragon flying through the sky. He looks like it is ready to eat you

> compare dragon's likeness to my bounty puck

The black dragon looks very similar to your bounty puck. The only difference is that this one has a red stripe down its side. You decide to take the dragon for yourself.

> use grappling hook to climb onto dragon

You grab hold of the dragon's neck with your hand and pull yourself up onto his back. You then begin climbing up towards his head.


OpenAI releases larger GPT-2 model. Can it write fake news better than a human?

OpenAI has released a larger version of its GPT-2 generative language model.

We’re releasing the 774 million parameter GPT-2 language model after the release of our small 124M model in February ...

2. Humans can be convinced by synthetic text. Research from our research partners Sarah Kreps and Miles McCain at Cornell published in Foreign Affairs says people find GPT-2 synthetic text samples almost as convincing (72% in one cohort judged the articles to be credible) as real articles from the New York Times (83%). Additionally, research from AI2/UW has shown that news written by a system called “GROVER” can be more plausible than human-written propaganda. These research results make us generally more cautious about releasing language models.

Blockquoted below is something I just had it make (using Talk to Transformer, which has been updated with the new model).

I wrote the first (bolded) paragraph. GPT-2 wrote the rest.

Former Democratic presidential candidate and United States Senator Hillary Clinton was arrested today and charged on four counts of conspiracy, one count of fraud, and one count of lying to Federal investigators.

The details of the case are detailed below.

A Brief Overview of the Case

On June 2, 2014, Clinton (pictured) admitted to FBI agents that, on June 23, 2013, she, and others, had conspired with other political figures to take "official action" in response to a series of negative articles which she wrote in the Washington Times and other outlets.

The following is a summary of Clinton's admission:

Secretary Clinton used the Washington Post as her de facto personal email account and for the official State Department email account.
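Talk to Transformer is a hosted web demo, so I can't show its internals, but the prompt-and-continue workflow itself is easy to reproduce locally. Here's a minimal sketch, assuming the Hugging Face transformers library and its gpt2-large checkpoint (roughly the 774M-parameter model OpenAI just released); it's my own illustration, not how the site is built:

```python
# Minimal sketch: continue a human-written prompt with GPT-2. "gpt2-large" is
# the Hugging Face name for the ~774M-parameter checkpoint; Talk to Transformer
# is a hosted demo, so this only approximates what it does behind the scenes.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2-large")
model = GPT2LMHeadModel.from_pretrained("gpt2-large")

# The human-written seed paragraph; the model writes everything after it.
prompt = (
    "Former Democratic presidential candidate and United States Senator "
    "Hillary Clinton was arrested today and charged on four counts of "
    "conspiracy, one count of fraud, and one count of lying to Federal "
    "investigators."
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=300,                       # total length, prompt included
    do_sample=True,                       # sample rather than pick the top token
    top_k=40,                             # similar flavor to the public demos
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,  # silences a padding warning
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Every run samples a different continuation, which is part of why the output reads plausibly one time and veers off the rails the next.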


AI Dungeon offers infinite, baffling adventures created with GPT-2

Adventurer! Will you attack with easy nobleness, taking reputation points from you every time you hit back, ten times? Or will you tell the priest to finish what's been going on with your family, thrice combined this time with Gramm Corps' arcane branding? THE CHOICE IS YOURS in Nick Walton's AI Dungeon!

AI Dungeon is an AI-generated text adventure that uses deep learning to create each story. It uses OpenAI's new GPT-2 model, which has 117 million parameters, to generate every story block and possible action.

The first couple of sentences of AI Dungeon and the action verbs are handcrafted, but everything else is generated. For each choice the player makes, the initial prompt, the last story block, and the last action are fed into the neural network, which then outputs the next story block and the possible actions.
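To make that loop concrete, here is a rough sketch of a single turn in that style, written against the Hugging Face transformers text-generation pipeline with the small GPT-2 checkpoint; it illustrates the idea and is not AI Dungeon's actual code:

```python
# Rough sketch of the turn loop described above: concatenate the initial
# prompt, the last story block, and the player's last action, then let GPT-2
# write the next story block. Not AI Dungeon's actual implementation.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # 117M-class checkpoint

def next_story_block(initial_prompt, last_block, last_action):
    context = f"{initial_prompt}\n{last_block}\n> {last_action}\n"
    result = generator(
        context,
        max_new_tokens=60,
        do_sample=True,
        top_k=40,
        return_full_text=False,  # return only the newly generated text
    )
    return result[0]["generated_text"].strip()

initial_prompt = "You are Mando, a knight living in the kingdom of Larion."
last_block = "As you enter the forest you see a large black dragon flying through the sky."
print(next_story_block(initial_prompt, last_block, "use grappling hook to climb onto dragon"))
```

Because only the initial prompt plus the most recent block and action are fed back in, the model has a very short memory, which helps explain how baffling the adventures can get.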


Some pretty impressive machine-learning generated poetry courtesy of GPT-2

GPT-2 is OpenAI's language-generation model (last seen around these parts as a means of detecting machine-generated text); it's powerful and cool, and Gwern Branwen fed it the Project Gutenberg poetry corpus to see what kind of poetry it would write.
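For the curious, a fine-tune along those lines can be sketched with the Hugging Face transformers library and a plain-text corpus. Gwern's actual run used OpenAI's original TensorFlow code and a far more careful setup, so treat this as an illustration only, with poetry.txt standing in for the Gutenberg corpus:

```python
# Illustration only: fine-tune small GPT-2 on a plain-text poetry corpus so it
# imitates the style, then sample from it. Assumes transformers + PyTorch and a
# local poetry.txt; Gwern's real pipeline used OpenAI's TensorFlow code.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)

# Tokenize the whole corpus and cut it into fixed-length training blocks.
text = open("poetry.txt", encoding="utf-8").read()
ids = tokenizer(text, return_tensors="pt").input_ids[0]
block_size = 512
blocks = [ids[i:i + block_size] for i in range(0, len(ids) - block_size, block_size)]

model.train()
for block in blocks:
    batch = block.unsqueeze(0)              # shape (1, block_size)
    loss = model(batch, labels=batch).loss  # causal language-modeling loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()

# Sample some machine poetry from the fine-tuned model.
model.eval()
seed = tokenizer("Shall I compare", return_tensors="pt")
out = model.generate(**seed, max_length=80, do_sample=True, top_k=40,
                     pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```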