I've been thinking about GPT-2 off and on lately, wondering what near-term value I might put it to. Then a friend pointed me to AI Dungeon, an exploration from the BYU Perception, Control, and Cognition Lab, which uses GPT-2 to generate an open-world text adventure game. It's not perfect, but it is pretty impressive and fun.

Here's my first interaction with it:

Generating story...
You are Daniel, a wizard living in the kingdom of Larion. You have a staff and
 a spellbook. You finish your long journey and finally arrive at the ruin you've
 been looking for. You look around and see that it's not much different than
 when you left it. A few trees here and there, but nothing has changed since
 then.
As you enter the ruins, you hear some noise outside. It sounds like voices
> hide
You quickly turn to run away from the sound, but you don't get far before you're
 surrounded by several men with guns. They point their weapons at you as they
 demand to know where you came from.
> say "I am Daniel, wizard of Larion! And who, may I ask, are you?"
"Who are you?" You say, trying to keep calm.
"We are the guards assigned to protect this place. We have come here because
 we heard the voices coming from inside". One of them says.
> say "Certainly then we are friends, for I have sought these ruins for years, only to learn from them their wisdom."
The man looks at you with surprise. He is obviously surprised that you would
 seek out the ruins.
"Friends? What do you mean friend? These ruins were built by an ancient race
 long dead. How could you possibly be friends with us?" he asks
> say "I see no conflict. You guard the ruins, and I seek their wisdom as an end in itself. How else did you become their guardians but by valuing them on their own merits and for no mercenary purpose?"
He seems to take it all in, and nods his head.
"Very well, we will allow you to pass. But if you find anything strange or dangerous,
 please tell me immediately. I want to know what has happened here".
> Walk in
You enter the building, which is quite large. The walls are made of stone, and
 there are many doors, each with a different symbol on them. There are also many
 statues of wizards standing around, some of them holding books

Things go downhill from there, though unfortunately I didn't think to save the rest. Other players' recorded sessions are available at the main site.

I do notice that, like other GPT-2 examples, it often confuses pronoun referents (a common NLP problem in general), plurality, and macro-scale plot points, but I think harping on its imperfections is beside the point. The number of times it gets those things right is shocking, and enough for the game to be playable. The PCCL fine-tuned GPT-2 on a text-adventure corpus, and it seems likely that more fine-tuning and perhaps some post-processing could improve the game further.
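For a sense of what that kind of fine-tuning involves, here's a minimal sketch using the Hugging Face transformers library. To be clear, this is my own illustration, not the PCCL's actual training code, and the corpus file name is a placeholder for whatever text-adventure data you have on hand.

```python
# Minimal sketch of fine-tuning GPT-2 on a plain-text corpus with Hugging Face
# transformers. "adventures.txt" is a placeholder corpus file, not PCCL's data.
from transformers import (GPT2LMHeadModel, GPT2Tokenizer, TextDataset,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Chunk the corpus into fixed-length blocks of token ids for causal LM training.
dataset = TextDataset(tokenizer=tokenizer, file_path="adventures.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-adventure",
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    data_collator=collator,
    train_dataset=dataset,
)
trainer.train()
model.save_pretrained("gpt2-adventure")
```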

Now, I don't particularly care about text adventure games per se, but what really gets me about AI Dungeon is its demonstration of the enormous potential for improved language models to interact naturally with humans.

Chat bots are terrible. Really terrible. But apparently we can improve on that with Transformer-based models. Likewise, search has been pretty terrible at figuring out what a naive searcher wants, but Transformers are being applied to that problem too, in actual, popular search engines. In just the past two months, both Google and Bing have integrated BERT to improve natural-language query understanding. The original BERT paper was released in October 2018, so that's under a year from a breakthrough in attentional models to production code.
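To make the search point concrete, here's a toy illustration of using a BERT-family model for query understanding: rank candidate passages by the cosine similarity of their mean-pooled embeddings. This is emphatically not Google's or Bing's pipeline; the model name, query, and passages are just placeholders I picked for the sketch.

```python
# Toy query-vs-passage relevance scoring with mean-pooled BERT embeddings.
# Not a production search pipeline; everything here is illustrative.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(text):
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state   # (1, seq_len, hidden_dim)
    return hidden.mean(dim=1).squeeze(0)              # mean-pool over tokens

query = "can you pick up medicine for someone else at the pharmacy"
passages = ["Rules on picking up a prescription for a family member",
            "How to open your own pharmacy business"]
scores = [torch.cosine_similarity(embed(query), embed(p), dim=0).item()
          for p in passages]
print(sorted(zip(scores, passages), reverse=True))
```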

One other thing I like is that the game is hosted and run from inside Google Colab. Text adventures (and other text-based human interaction experiments) don't require a lot of UI, so that strikes me as a great idea that many others should emulate.
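The whole interaction loop really is that small. Here's a bare-bones sketch of the kind of loop a Colab-hosted text game needs: read a line, append it to the story, and let GPT-2 continue. The sampling settings are illustrative, and this is not AI Dungeon's actual generation code.

```python
# Minimal read-generate-print loop for a GPT-2 text adventure (illustrative only).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

story = "You are Daniel, a wizard living in the kingdom of Larion."
print(story)
while True:
    action = input("> ")
    story += "\n> " + action + "\n"
    ids = tokenizer.encode(story, return_tensors="pt")
    out = model.generate(ids, max_length=ids.shape[1] + 60, do_sample=True,
                         top_p=0.9, pad_token_id=tokenizer.eos_token_id)
    continuation = tokenizer.decode(out[0, ids.shape[1]:], skip_special_tokens=True)
    print(continuation)
    story += continuation
```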

You can peruse the AI Dungeon source code here.

Play a quick game yourself and post your results in the comments!