Essay: Games · Steven Johnson
The villagers are restless. I think of them often through the day, during the quiet moments, as I'm walking back from dropping my son off at school, or waiting in line for coffee. I am thinking about their needs: some of them are hungry, and require fresh supplies of grain; some have grown tired of sleeping without a roof over their heads.
I'm thinking, too, of their ambitions: to someday build a great temple in their midst, or a broad stone wall around the village centre to protect it from invading hordes. I am thinking of their demographics: right now, there aren't enough children to sustain the kind of population growth I have envisaged for them. And I am thinking of their devotion - to me. Because I am their god.
I should say at the outset that these are the kinds of thoughts that make me relieved there aren't mind-readers standing next to me at Starbucks. There is something slightly mad about trying to figure out how to fortify your worshippers against invaders while simultaneously deciding between a venti and a grande Frappuccino. But it is a madness that any regular videogamer will instantly recognise. Because, of course, my worshippers are virtual ones.
I have been playing Black & White 2, the new game from acclaimed designer Peter Molyneux, in which the player is cast in the role of a deity. The goal, ultimately, is to build your flock: converting whole cities into believers, either through wrath or beneficence, by conquering or clothing them. Or usually, in my case at least, a little of both.
If you are not a gamer, you may be surprised how many otherwise normal-looking adults around you are harbouring comparable thoughts. In the US, the game industry now brings in as much money as Hollywood, and the average gamer is roughly 30 years old. We are fast approaching the point where an ordinary gamer is more likely to have had a child than be one. Since Tetris at the very least, and probably all the way back to Pong, digital games have been lodging themselves in the back of our consciousness, prodding us to think through their puzzles just one more time before going to sleep.
I would submit that the primary reason for that mental screenburn is this genuinely unsung fact: today's games are exceptionally difficult. They tax the mind in ways that would amaze anyone who last played a game in the age of Pac-Man. In Black & White, for instance, the player must simultaneously track hundreds of shifting and interconnected variables. Some of these are emotional and metabolic in nature: each worshipper - and there can be thousands of them - has a distinct set of needs you must satisfy or risk losing their devotion. Some are militaristic: other villages, worshipping rival gods, may be building armies to attack your strongholds. Some needs are environmental: build too many villas for your population and you will burn through the supply of forests surrounding your growing town.
Crucially, each of these elements connects with the others: protect your forests by building fewer houses, and your villagers won't reproduce at the same clip, thereby limiting the size of the army you can build.
Black & White is a relatively highbrow game, of course - but only in subject matter, not complexity. There are moral values explicitly addressed by the game: you can choose wrath or kindness, swords or ploughshares, to win over your disciples: hence the title. But the mental challenges involved in Black & White are positively routine.
The best-selling PC game of all time, The Sims, involves an equally complex tableau of variables to track, while the ever-popular sports simulations force you to run an entire organisation - making trades, balancing budgets, soothing egos, as well as calling the plays from the sidelines. Even the controversial hit game Grand Theft Auto maps a staggeringly large and complex world: one player's guide to all the variables involved in the game clocked in at 53,000 words, the length of a short book.
But does this complexity, on its own, necessarily mean we should take games seriously as works of culture? That they should be reviewed and dissected alongside books, film and ballet? After all, crossword puzzles are mentally challenging, but we don't generally run reviews of them in the culture pages.
I think the answer to that question is a decisive yes - but taking games seriously requires that we develop new aesthetic criteria appropriate to the medium. Many games take the player through some kind of narrative arc, but I think, in general, storytelling is one of the least interesting things about gaming. Where psychological depth is concerned, most games are laughably simple. The great majority of gamers, I suspect, don't engage with games because they want to find out what happens, or because they care about the characters. They engage because they want to figure out how the system of the game works, or because they want to explore the space the game represents.
Banal narratives and one-dimensional characters sound like a critique, but only if you start with the criteria we use for novels or films. But if you think about games as closer to architecture or environmental art, then it doesn't seem like such a failing. We don't look down on buildings because they don't have strong narrative threads or well-developed characters. The same should be true of games. They are - first and foremost - environments and systems, not stories. The art of making a great game lies in making spaces that are interesting to explore, and systems that are interesting to tinker with - like those teeming villagers in Black & White, with their multiple, interconnected needs.
Game of life
But if games tend to lack storytelling prowess, it doesn't necessarily follow that they lack social relevance. All the complex simulation games on the market - from The Sims, to Civilization, to SimCity, to Black & White - are, in effect, animated theories of how a given society works, whether it is ancient Rome or a modern metropolis. You learn the theory by playing. One of the defining attributes of Grand Theft Auto that has been chronically ignored by critics is how explicitly the game plays as a satire of American inner-city culture - or, more precisely, suburban America's nightmare of inner-city culture. But that satire emerges as much out of the environment of the game - the hilarious radio pseudo-soundtrack, the snippets of dialogue you overhear in the world - as it does from the story that unfolds as you play.
All of this - the economic strength of the gaming industry, the complexity of the games themselves, and their growing relevance as a platform for social commentary - adds up to one inevitable conclusion: ignoring games means ignoring one of the most interesting and innovative cultural forms of our time - not unlike writing off Hollywood in the era of Citizen Kane and Gilda.
In fact, there is more to be said about the connection between early film criticism and the contemporary assessment of gaming. But it will have to wait for another day. I have a flock to tend to right now.
· Steven Johnson is distinguished writer in residence at NYU's department of journalism, and the author of Everything Bad is Good for You, published by Allen Lane. To order a copy for £10 with free UK p&p, call the Guardian book service on 0870 836 0875 or go to theguardian.com/bookshop
Of all the new experiences that Silicon Valley has given us in the past few years, I suspect the most emblematic — for better and for worse — goes something like this: You’re standing on the street corner in a busy metropolitan area, and you reach into your pocket and pull out a supercomputer the size of a pack of cigarettes; using a touch interface, you load a map showing your exact location and virtually hail a vehicle that takes you to your destination, all without ever exchanging cash or swiping a credit card. We can have a reasonable debate over whether these breakthroughs were as significant as those from other technological revolutions in the past, but just about every step in this scenario would have been astonishing 15 years ago: the computing device, the user interface, the location technology, the payment mechanisms.
If you read Paul Graham’s essay “Economic Inequality” — or just tune in to some of the standard Silicon Valley mythology — you might reasonably assume that we owe these miracles of modern technology to the dynamic and disruptive forces of startup culture. Uber, Apple, Google, and Android all began as startups, two of them literally in garages. And you might also reasonably assume, as Graham argues, that one of the necessary byproducts of creating those miracles is an increase in economic inequality, given that the founders of those startups and their key innovators were amply compensated for their intellectual labor. Steve Jobs, Jony Ive, Larry Page, Sergey Brin, Travis Kalanick—these people didn’t just accumulate wealth inventing these technologies; they accumulated dynastic wealth, wealth that could keep their descendants in the one percent for centuries. The “exponential curve” of technological acceleration inevitably carries with it an upward curve of oversized economic reward, according to Graham. If you try to rein in the acceleration of inequality, you’ll rein in the technological acceleration, or more likely, the tech acceleration will move elsewhere.
When I first read “Economic Inequality” a few weeks ago, I found myself irritated by the hint of extortion in the way Graham phrased his argument: That’s a nice 50,000-year curve of technological progress you got there; would be a shame if something happened to it. But the more I thought about it, the more I found myself considering all the forces that Graham left out of his startup-centric account of technological progress. Yes, hailing an Uber with your smartphone relies on innovations that made a small group of startup founders extremely wealthy. But think of all the other innovations that also make that experience possible, and the different economic models behind them. The Android operating system is a fork of the open-source operating system Linux, which was collectively authored by thousands of people all over the world, with no traditional ownership model for their creation. An iPhone contains many lines of code taken from open source platforms maintained by nonprofit working groups. The Web and TCP/IP protocols that allow the device to communicate with servers at Uber were developed by Tim Berners-Lee at CERN and by a handful of computer scientists around the world, many of them partially funded by the United States government. The network of GPS satellites that allows you to pinpoint your location on a map was initially created by the U.S. military. The atomic clocks that make GPS work were first built by national laboratories in the United States and the United Kingdom. Cellular networks were originally invented at Bell Labs, a research lab inside a giant corporation, whose innovations were effectively socialized thanks to the anti-trust agreement Bell, and then AT&T, struck to preserve its monopoly.
Those of us lucky enough to live in an age and a society where we can casually pull out an iPhone and grab an Uber are indeed sitting on top of an extraordinary technological curve. But startups are only part of the story of how we got here. In fact, it is one of the strange ironies of modern technology that the more conventionally high-tech the product, the lower the percentage of pure unfettered capitalism you will find in its DNA. From the sweatshops and rubber factories to the Nike store, your sneakers are pretty much market economics from the first stitch, all the way down the chain. But your iPhone isn’t just made up of startup brainstorming and Foxconn labor; it also has in its makeup open source networks, and academic research, and military funding, and government-subsidized science labs, and whatever strange hybrid of socialism and monopoly capitalism Bell Labs was. Some of those systems increased inequality by making their founders rich; some of them decreased inequality by making a valuable resource free. Indeed you could make a strong case that the most important innovations that drove the triumph of Silicon Valley did not emerge inside traditional private corporations at all.
Graham’s essay struck a particularly dissonant chord with me because I have spent quite a bit of my time over the last few years defending Silicon Valley against what I consider to be an inaccurate perception of the place as a kind of libertarian rogue state, driven by the mania of wealth accumulation — a kind of West Coast mirror of Wall Street’s hyper-capitalism, Salomon Brothers in sandals. Those stereotypes, I’ve argued, fundamentally misunderstand precisely what is so interesting about Silicon Valley: that it is both an apex of modern capitalism and one of the most politically progressive sectors in the country; that it celebrates entrepreneurial energy but also has a passion for radically different economic models; that it rewards both garage startups and open source-style projects like Wikipedia; and that its long history of building on government-funded innovations makes it much more appreciative of the role of the state in driving progress than most business sectors. If Silicon Valley were really the Ayn Rand fortress that it is rumored to be, then how did Obama manage to win Santa Clara County in 2012 by 42 percent, more than 10 times his national margin of victory; how is it that more than 90 percent of the political donations from inside Google, Apple, and eBay went to the Obama campaign? It’s the tech sector—particularly the NYC scene, with companies like Kickstarter and Etsy — that is pushing for new organizational forms like Benefit Corporations, which create legal structures that allow founders to prioritize social values (like fighting climate change or inequality) as much as shareholder value. It’s the tech sector that is driving the argument for Guaranteed Basic Income and encouraging aggressive public-private partnerships to fight climate change. There are plenty of things to complain about in the culture of Silicon Valley (and other hotbeds of tech innovation) but naive, greed-is-good libertarianism is not one of them.
I know all these things are both true and important, but I also know that from now on when I make these arguments, someone who only knows Silicon Valley from afar will shrug and say, “I don’t know — didn’t you read that Paul Graham essay where he said if we tried to reduce inequality, all the startups will leave and technological progress will grind to a halt? That’s the way they really think out there.”
Graham does leave room for the idea that some forms of wealth accumulation might be productively checked, in cases of outright fraud, or zero-sum exchanges in the financial sector where one person’s gain is another person’s loss. But most inequality — particularly in startup culture — is non-zero-sum, he argues. One person’s gain is just that: pure gain, no loss.
As Graham puts it: “In the real world you can create wealth as well as taking it from others. A woodworker creates wealth. He makes a chair, and you willingly give him money in return for it. A high-frequency trader does not. He makes a dollar only when someone on the other end of a trade loses a dollar. If the rich people in a society got that way by taking wealth from the poor, then you have the degenerate case of economic inequality where the cause of poverty is the same as the cause of wealth. But instances of inequality don’t have to be instances of the degenerate case. If one woodworker makes 5 chairs and another makes none, the second woodworker will have less money, but not because anyone took anything from him.”
Others have correctly pointed out that the zero-sum logic applies to many modern economic trends beyond high-frequency trading. Over the past 40 years, the average pay ratio — the gap between the highest-paid and the median employee — inside an American corporation has increased from 30:1 to more than 300:1. This is the very definition of a zero-sum game. There is a finite amount of compensation available for the employees of a company. Two generations ago, the CEO would get a sliver of that pie; now he or she gets to scarf down three slices or more. Whether the overall pie has grown or not over that period doesn’t matter, because the additional pie has all gone to fatten the executives.
If you want to defend Silicon Valley in a discussion about inequality, the way to do it is not to reject out of hand any effort to flatten the rewards of tech innovation. The way to do it is to point out that Silicon Valley has a much more sensible pay ratio than most other industries in the United States — closer to 40:1, not so far from the ratio that predominated in the age of Big Labor. A big part of that egalitarian track record revolves around equity participation. I wrote about this at length in another post a few years ago:
The top 100 tech companies granted 19% of their total ownership to non-senior-executive employees (i.e., everyone excluding the CEO and four lieutenants). For the rest of corporate America, that number was 2%. In other words, when it came time to share rewards with ordinary employees, the Tech 100 were ten times more generous than low-tech firms. This is actually one of the hidden strengths of the tech sector in the US: its companies are much more competitive precisely because they are much more egalitarian in how they share their wealth internally. I would be surprised if there were any new industry in the history of capitalism that distributed its economic rewards to its employees as widely as Silicon Valley has. Billionaire founders or CEOs are nothing new. But multi-millionaire middle-managers?
In other words, the tech sector doesn’t have to be the poster child of inequality’s abuses. It could actually be a role model. Take just one potential remedy as a thought experiment. Let’s say we decided as a society that no private company should have a pay ratio above 40:1. That would lead to a radical decrease in income inequality, and it wouldn’t involve a cent of additional taxes. Every private company would be allowed to keep the exact same portion of its income. The government wouldn’t be extracting money out of the private sector; it would just put some boundaries on the way the private sector distributes its money internally. Critics would scream that such a dramatic intervention would be terrible for business, but of course the one sector of the economy that has already voluntarily embraced this ratio turns out to have nurtured the most profitable corporations in the history of capitalism. This would no doubt be fiddling with the natural markets for wages, but we fiddle with these all the time, through progressive income taxes, earned income tax credits, subsidies, and tax incentives. We have a minimum wage. What if we had a maximum ratio?
What would be the point of compressing the range of potential incomes so dramatically? Is this just about some nebulous concept of being “fair”? For Graham, if my dynastic wealth doesn’t take anything away from your middle-class comforts, there’s no point in trying to reduce my wealth, given the value of the innovations bequeathed to society in pursuit of that fortune. Fighting inequality with some kind of ceiling on wealth creation might sound fair, but it forces you, in Graham’s words, to “design your society in a way that’s incompatible with … one of the most powerful forces in history.”
But even in the tech sector there are zero-sum games that must be accounted for: most importantly, the zero-sum game of attention and passion. There is a finite pie of human intellectual talent available to us at any given moment — the pie is growing, but it’s finite nonetheless — and how that attention and passion gets directed has a dramatic impact on the slope of the progress curve. Every market sends out signals encouraging certain kinds of problem-solving, certain kinds of skills. Nine years ago, the loudest signal was telling college students to become experts in credit default swaps and collateralized debt obligations. I think we can all agree on how that worked out. When markets send out signals this conspicuous, attention and interest are like a giant field of sunflowers; they shift, slowly but inexorably, toward the brightest light. Through that shift, other, fainter stars lose their followings. That is zero-sum gameplay, too.
Right now the tech market, even with its admirable pay ratios, is signaling to the world that inventing a new app for teenagers to flirt and banter can be thousands of times more valuable than becoming a high-school principal in a troubled district. That is a ratio with real costs to society. And I say that as a believer in these technologies! I think the world is a better place, on the whole, because Facebook, and Twitter, and Google were invented. If it’s a choice between derivatives traders and Silicon Valley startups, I will take the startups every day of the week. But for every Facebook coming out of Silicon Valley there are a hundred juvenile apps and services that are solving trivial problems, or just wasting people’s time. The kinds of problems that can be fixed by tech startups are not always the most important problems that a society confronts. When a talented young woman graduating from a good school looks out at the landscape of potential economic reward and sees a giant, glittering billboard promoting the Silicon Valley or Wall Street lottery, while all the less lucrative callings lurk at the peripheries holding hand-lettered signs, she’s inevitably going to be propelled toward the light. There is loss and gain in that exchange, too.
It does seem suspicious that an industry so ruthlessly devoted to disrupting the status quo would draw the line at disrupting a market that considers inventing an iPhone game a thousand times more valuable than becoming a city planner or supervising a medical clinic. Reading “Economic Inequality,” you might reasonably conclude that the tech sector’s default attitude is: Everyone else’s status quo is fair game for disruption; just don’t mess with the status quo that may someday let me buy a mega-yacht. Maybe Graham is right; maybe all those open source and government-funded tech triumphs belong to the past, and the only economic model that guarantees continuous innovation now is the one where the winners walk away with billions. But isn’t it worth investigating alternatives? Isn’t the defining ethos of the tech sector its willingness to challenge the existing models, experiment with new ways of doing things? The good news is that Silicon Valley happens to be much more committed to exploring new and more equitable models of compensation than just about any other industry in the country. The bad news is that Paul Graham has made that commitment much harder to see.