The history of videogames maps directly onto the history of computation. At least, that’s how speakers cast it at GDC this year. Chelsea Howe, Chris Crawford, Dave Jones, Graeme Devine, Ken Lobb, Lori Cole, Luke Muscat, Palmer Luckey, Phil Harrison, Raph Koster, Seth Killian, and Tim Schafer (phew) each talked about one aspect of videogame history in which they were personally involved. The keynote was both an homage to GDC, the event, and to GDC’s prime mover, that repugnant, beautiful monstrosity known as ‘the videogame industry’.
At the 30th iteration of an event that has become one focal point for videogames writ large, the stakes for writing the history of the field seem almost too high even to attempt. Representing all the people who made the moves, deals, designs, and mistakes that made videogames what they are today isn’t something one does in an hour or two. And perhaps it’s truly impossible to do in a cavernous ballroom in San Francisco. Then again, telling this story right next to where Apple announces the iPhone each year seems queasily right considering the history that the speakers settled on.
The talk began with what seemed like a leitmotif, but then became a theme before finally unveiling itself as would-be history. It’s about technology, and the fact that chips that make your computer, well, compute have gotten smaller, faster, cheaper, and better, year after year after year. You’ve probably heard of Moore’s Law. It has been one of the tech industry’s waypoints / self-fulfilling prophecies for most of the consumer computing era.
This idea emerged innocently enough. 30 years ago, we used old computers to make state-of-the-art games; now we use new ones. New Macs have 5,000 times the transistors, 86 times the clock speed, and 2,000 times the memory of that laughable (and people did laugh) 30-year-old Mac that some people in this cavernous room used to use to make games. This account trades on the same winking nostalgia we might feel looking back at that scene from Friends where Chandler tells everyone about his “bad boy” of a laptop by rattling off its now-feeble specs.
Many of the speakers at GDC used this move to turn and return to a narrative of technological empowerment that goes something like this: The present is great because computation is the smallest, fastest, cheapest, and best it’s ever been. And the future looks bright, over there in the hazy not-now! That’s the place where the chips are even smaller, faster, cheaper, and better than they are today.
It’s a seductive argument, in part because it has the advantage of being simply true. Games look, sound, and feel different today than they did when GDC began in the 80s, and technology affected that. But histories like this that are reliably true also have a certain disingenuous miasma about them. Sure, they might be true, but that often means they’re also merely true.
After all, the argument for reading technology as progress isn’t made persuasive by how amazing the technology of the present seems. Current technology usually seems slightly shitty, even or especially when it’s “amazing.” Instead of focusing on the achievements of the present, the argument for technology-as-progress persuades by the promise of tomorrow and its distance from yesterday. If you fully buy in, today is always already the best of all possible days. (Remember how terrible yesterday was?) And tomorrow will be even better. It’s a violently optimistic way of doing history, but a great way of doing business. The near-future technological promise is the economic ground on which the Bay Area rests. You’ll note the presence of the San Andreas Fault.
I always think about America’s favorite piece of Concord pond scum, Henry David Thoreau, at moments like this. Thoreau emphasizes that nothing in the world of the mid-19th century United States really needed to run at 30 miles per hour. “We do not ride upon the railroad;” he writes in Walden (1854), “it rides upon us.”
So, yes, as technology changes, new things become possible, from living west of the Mississippi to living in The Witcher 3‘s (2015) expansive kingdoms. But our expectations and assumptions catch up with the technology. The railroad rides upon us. And the best of all possible tomorrows never arrives as promised. Google Fiber will seem like Chandler’s 56k modem before long. But is life better with Fiber than it was with 56k? That’s a serious question, and one these histories recklessly, disingenuously didn’t ask.
I find it strange that this is the story—naive, earnest, predicated on a stable notion of progress—that luminaries in the game industry decided to tell about their lives’ work. Saying that the history of videogames is the history of technological improvement is tantamount to saying that the history of literature is the history of writing implements, from cuneiform to Shakespeare’s quill to David Foster Wallace’s Microsoft Word. The quill didn’t write King Lear; Word didn’t write Infinite Jest (even if it helped with the footnotes). Such a history mistakes tools for art.
Check out our ongoing coverage of GDC 2016 here.