I was working on a series of blog posts last week, but I got distracted by Game Chef. Here’s the result — Big Chiefs, a game where you play the magical white guy who joins up with a “primitive” native tribe, learns their ways, and leads them to victory against the other, less magical white guys. There are a lot of movies with this basic plot, and the sad part is, a lot of them appear to be written out of a genuine interest in the subject matter; the problem is, they can’t seem to write about people of color without introducing a white guy to be the protagonist so that “the audience has someone to identify with.” So this is the game I wrote about that, where the people of color in your tribe are at best foils to provide you adversity, and at worst gambling chips. If anybody knows Kevin Costner and thinks he would want to give me a pull quote for this game, let me know.
So in Bioshock, a game I very much enjoyed, you are offered, early in the game and periodically throughout, a choice between two different approaches to a problem: one apparently moral (although your adviser calls it naive) and one apparently immoral (although your adviser deems it utilitarian). The moral approach, Option X, provides you with about half the gain of Option Y, the immoral approach, and so there appears to be a choice between doing the right thing and paying for it, or doing the wrong thing and profiting. It’s not just a game, it’s a simulation!
In reality, though, it turns out that Option X just delays part of the reward and replaces the rest with specific tonics, plasmids and other prechosen fringe benefits, such that your overall reward for consistently choosing Option X is, all told, at least as high as Option Y’s, and probably higher. Being moral isn’t a sacrifice in Bioshock; it’s a long-term investment, and the question isn’t how willing you are to do the right thing but how well you read the FAQ, or how good you are at deferring short-term gain for future returns. It’s a useful lesson for anybody thinking about the stock market, but as a narrative statement, it’s a failure.
This is the corollary of my previous post. Many games seem to embrace the idea that you should reward people for making good moral choices. I call this “Disney morality,” because it shows up so often in those movies: being virtuous leads to material success, being evil leads to material failure. If you take the contrapositive (not having material success implies that you are not virtuous), it becomes clear that it’s also the prosperity gospel.
There are two big problems with this approach. The first one, straightforwardly, is that it doesn’t match the experience we actually tend to have making moral choices. Nobody gives us a cupcake because we let somebody in on the highway. People aren’t dumb. When you give somebody a game to play, they form a mental model of the system it offers them, and if a piece of that system is obviously flimsy or ridiculous, they’re going to notice, and they’re going to disengage.
The second is that you’re replacing an intrinsic reward with an extrinsic reward. Here is a picture of my dog, Welly:
I hope you will agree that my dog is adorable, because it is objectively the case. What are the odds that you would kick my dog? If you are much like me, I suspect that they are low.
Now let’s suppose I say to you, “I will give you ten dollars if you make it an hour without kicking my dog.”
If you are much like me, your first reaction might be: “Why would you offer me ten dollars not to kick your dog? I wasn’t going to kick your dog in the first place.
“…what, exactly, is the problem with your dog?”
Originally you were avoiding kicking my dog because doing so would be the act of a heartless monster. Now you’re avoiding doing it because you won’t get $10. This is not a trade up from Welly’s perspective. Just like with Bioshock, we’ve taken a moral imperative and turned it into an economic transaction.
Let’s suppose that ten minutes into this hour you are distracted, or walking with pie, or something, and you accidentally kick Welly. Now you won’t get the $10 in any case, no matter how many times you kick him in the next 50 minutes. How motivated are you now to avoid kicking him?
This is called the overjustification effect: offering people an incentive to do something they already wanted to do makes them doubt their original reasons for doing it. If my dog were so unkickably adorable, why would I have to pay you not to kick him? There must be some reason, right? Maybe kicking him isn’t so terrible after all. When the money is removed from the equation, the incentive is gone, but the doubt remains.
The conclusion here is simple: if you want people to take moral choices seriously, you can’t do it by offering them bennies for being good people; they know that’s not how morality works. If anything, you need to incentivize the opposite. Offer them $10 to kick my dog. That way they can find out exactly how much they were willing to give up to do the right thing, or exactly how much it took to get them not to.
So after three days Dog Eat Dog has already reached its first goal! I’m very excited to find such a groundswell of support for this game. I’m already planning a book of scenarios as a stretch reward, and I hope fervently that I’ll get a chance to put it together. I’ve already got some great guest authors — Elizabeth and Shreyas Sampat, Dev Purkasthaya, Mark Truman — and I’m just waiting to hear back from a few more. Check it out! In the meantime, though, I thought I might add some actual content to this blog.
So why make Dog Eat Dog a game instead of a book, or a short story, or a movie?
The thing about a book, to my mind, is that it offers a discrete narrative, connected to the reader only insofar as the reader actively draws parallels to their own narrative. In McLuhan’s terms, it’s a hot medium, unwelcoming to audience participation: by the time it reaches you, it has already been defined.
A game only exists in play. It’s a social interaction, drawing its strength from the existing ideas of, and relationships between, the players; but it’s also formalized — Turner’s liminal rite — and offers the players a structure to direct their creative energy. A properly crafted game, I believe, will allow people to arrive at the same conclusions they might have gathered from a sufficiently careful perusal of your book — but because they came to those conclusions themselves, as a consequence of a narrative they themselves participated in, they will embrace them much more fervently.
It’s not sufficient just to have a structure, though: your midbrain isn’t interested in feeling, it’s interested in advantage. If you want to create an effective game, you need to start up people’s dopamine engines by offering them a system with mechanical rigor, one that is susceptible to the crude economic analysis of the substantia nigra. I’ve seen games that say things like, “The players should reveal the hidden fears and frustrations of their characters.” Despite arguments to the contrary, this is still a game, but it’s an improvisational game. That’s not a rule, it’s a prompt. Here’s Tina Fey on improv:
The Harvard guys keep the Improvisers from wallowing in schmaltz. (Steve Higgins used to joke that every Second City sketch ended with sentimental music and someone saying, “I love you, Dad.”) (Bossypants, p. 125)
Tina Fey is an accomplished improviser and writer, and she’s making a specific point here: improv is a lousy way to produce meaningful emotional involvement. If it were easy to produce compelling narrative through improvisation, we would all be professional improvisers; in reality, even most professional actors aren’t. If you want compelling narrative, you’re best off using the same things that produce compelling narratives in real life: economic incentives.
There’s a corollary to this, which I will expand on next post.