Fair game
350 million do it regularly. It offers levels of complexity and human interaction beyond any other art form. We can’t continue to ignore the cultural impact of online gaming, says Michael Bywater.
Are you Timmy, Johnny or Spike? Timmy likes movies with car chases, explosions, guns, violent conquest and mayhem. Johnny is a more cerebral type, an aesthete with ironic spectacles and a gentle soul, happiest chatting about Žižek and Jean Renoir in the coffee-bar of the Curzon Soho. Spike wants competition, hand-to-hand, the bad guys to triumph: Darth Maul over Obi-Wan Kenobi every time.
Most of us would, if pushed, identify ourselves with one of the three. But how many of us would cavil on learning that Timmy, Johnny and Spike aren’t the creation of film marketeers but of video game designers: they are the names used to define the three archetypal player types. Many would find the very idea that they can be defined in these terms an insult. And why? Because video games aren’t included in conventional definitions of culture.
Which, according to Tom Chatfield’s Fun Inc., published this month, is a serious mistake. If, like many – perhaps most – parents, you spent Christmas exasperated by detached, unblinking children glued to their Xbox or Nintendo DS, you may instinctively feel that Chatfield must be an apologist for the shallow, mechanistic and potentially addictive zeitgeist. That would be a mistake. He’s a literary editor with an Oxford DPhil in post-religious rhetoric in English poetry, not a credential you’d spend three years on if your ambition was to legitimise unsocialised teenagers spending all night shedding virtual blood in massively multiplayer dungeon games to the detriment of their intellects, their lives and their complexions.
Chatfield’s arguments are unexceptionable: video games are a hugely popular pastime. They are a crucial part of the interior landscape of most (western) people born after 1980, and of many born before that. They are much more complex and subtle than the media would have us believe. They do not reward violence – often, mayhem is an initial thrill, to be followed by a more thoughtful and, in the case of multiplayer games, co-operative approach which leads to actual progress. And, of course, they are big business, with budgets approaching or exceeding those of Hollywood movies: Grand Theft Auto IV, the most successful – and controversial – console game of all time, had a $100 million budget, a production team of over 1,500, and grossed $310 million on its first day of release, some 50 per cent more than Spider-Man – The Movie and around four times more than James Cameron’s Avatar (on track to be the most successful movie of all time).
Yet within “the culture”, games are widely treated as a special case, accorded tentative and provisional membership of the culture club. They are intermittently covered in the press but, with a few exceptions like the Guardian’s Charlie Brooker, either presented as scare-stories of moral decay or kept in a ghetto for the special-interest “minority”.
It’s odd that this should be so. The “culture” can no longer legitimately be seen as something high, canonical or circumscribed. It is, as Wittgenstein defined the world, “everything that is the case”. Nor can it simply be nerves about supposed ill-effects of video games on those who enjoy them. Football and alcohol can hardly be separated; rugby is ritualised violence; boxing, two men punching the hell out of each other; yet these are legitimated in a way video games are not.
But why should this be so? Is it that the mainstream media are in the hands of old farts who don’t know what video games are and have never played them? Or is it because we – they – have an ignorant fear and idle dislike of the very idea of the digital “world” which doesn’t exist in the case of novels, films or television?
Video games aren’t like films or books. You can’t dip in. You can’t buy your ticket or your DVD and schlub down in front of the screen to be fed. You have to invest, sometimes several hundred hours. And of course, having made your initial investment, you’re tempted to keep playing.
That investment can, from the outside, look like aloofness and disengagement. The thumbs and fingers twitch on the controls. The head nods in time to inaudible music. The eyes are glazed. The player doesn’t respond to stimuli. But what does someone watching TV look like? Someone engrossed in a book? Complete involvement in any activity is seldom uplifting to the observer. Would you want your friends to see you having sex?
Yet we single out the digital world for suspicion – especially the online worlds, where groups of players wander, explore, conquer, form alliances, accumulate wealth, plot strategies and simply sit around and chat in all-engrossing universes like World of Warcraft or Second Life, a plotless worldscape where players build alternative lives to their “real” world existences.
These, as Chatfield says, are in many ways reality simulators, and offer models of complexity and intricacy that allow us – both the involved participants, and, in some cases, observing economists or epidemiologists – to try out strategies, tactics, and ways of being.
So what are we – what are the media, the church leaders (the same kind who want to ban Harry Potter for Satanism and Philip Pullman for atheism), parents and pundits – so worried about? It’s as if we want to draw a cordon sanitaire around everything that involves digital processing, to exclude the on-screen world from the world of culture.
So Helen Rumbelow, reviewing Games Britannia, a TV programme dedicated to video gaming, in the Times over Christmas, writes scathingly of “Men [who] fiddle with joysticks… a retreat from real life… almost entirely male preoccupation… vacuous… pale youth joylessly clutching his joystick in front of his video game.” So, too, Baroness Greenfield speaks of a dopamine-mediated addictiveness brought on by the allegedly results-driven engagement of video games, which is denuding our identity and, as she wrote in one particularly bizarre piece for Wired, might even be responsible for the credit crunch (because bankers grew up on video games). Her “chilling warnings” have been enthusiastically picked up by the Daily Mail as yet more evidence of our broken society.
Perhaps at the heart of this dismissal lies a category mistake: the belief that because a computer is a digital instrument which, in its internal workings, knows nothing but 1 and 0, everything done on a computer must somehow (impossibly) inherit the same restrictions.
That view is based on the hair-tearing frustrations we all encounter when we become lost in what sociologist Michalis Lianos calls “Automated Socio-Technical Environments (ASTEs)” – outsourced Bangalore call centres where Indians with PhDs are used as human printers, delivering a fixed script; online credit-checkers and automated check-in machines, barriers and ticket dispensers and all those digital analogues of the old-fashioned commissionaire, all authority without power or discretion, which have come to characterise too much of our world. The fundamental point of all these systems, writes Lianos, “is that the user cannot negotiate with the system. What is involved here is a transformation of culture so radical that it amounts to denial. Negotiation is the prime constituent of culture.”
If we assume that the non-negotiability of computer (anti-)culture applies to worlds made using computers, then clearly anyone who wishes to immerse himself in such a world is either mad or in some sort of flight from reality.
But video games are anything but non-negotiable. The “platform games” like Super Mario, or the granddaddy, Pac-Man, are as negotiable as a game of squash. Second Life is almost nothing but negotiation as you find a career, make friends, get somewhere to live, buy stuff. World of Warcraft in the end favours those who negotiate alliances and join forces – because, after all, it is like a colossal telephone network; the players on the other end, on the million other ends, are people too. We cannot dehumanise them, or dehumanise the experience of the game, just because we don’t like the technology used to make it. We spend much of our lives submerged in greater or lesser unrealities. The phone. Our money. The Higgs boson. The neurological (and currently impenetrable) real-time model which is our experience of living.
Still, the most common public position – outside gamers and the few universities running degrees in “game studies” – is that video games are an unfortunate fact of modern life, that the people who play them are losers, berks and tossers with spots, and that the best thing we can do is try to minimise the risk and hope they’ll grow out of it.
In the 1980s, when Chatfield was getting himself born, I worked as a writer on a number of computer games. In some ways it was like working in film: a combination of feeling your way in new storytelling modes, and technological liberations and constraints. The medium has developed, just as filmmakers evolved a grammar and new techniques like moving the camera, panning, zooming, dollying and pulling focus. It can encompass minimalism – play the tiny, lapidary Passage (Google it, or get it from the iPhone App Store) and see if you aren’t moved – and grandeur, like Sword of the New World. It can be reflective and absorbing, like Myst and its sequels. If there is a Tolkienian, pseudo-Mabinogion undertone to many games, that might be a question of heritage (the clever people who created the first games were building on the mechanics of Dungeons & Dragons, ideally suited to the new technology), or simply of a curious geek fascination with closed worlds and coherent, if simplistic, backstories.
It was a minority interest then, though a fast-growing one; but it is not now. Cultural commentators who sideline video games aren’t doing their job and have no more reason on their side than the Victorian mamas who declared that novel-reading led to sexual incontinence and – like Baroness Greenfield now – to vitiation of the brain.
One problem is the paucity of critical tools suited to analysing games. Our conventional ones derive from the idea of an explicit narrative, which video games don’t – can’t – have. A social fear comes with that, too: the person playing a video game is in a world of their own making instead of sitting passively being fed the story. But so is a novel-reader; and the idea of the novel as a manufactured commercial product, an original and self-contained extensive narrative, is as far from the roots of storytelling as World of Warcraft is from the Iliad. Video games’ claim to be a potent cultural force needs no formal submission. They are there; they are varied; they can be sophisticated or vulgar, simplistic or intricately intertextual. Over 350 million people play video games worldwide. Are they all spotty losers?
The vital thing, missed by those who see nothing but mindless violence or reflexive, antisocial idiocy, is that the gamer is, no less than the reader or the chess-player, not interacting with a computer-screen (or pieces of paper, or carved wooden figurines) but with an imagined worldscape created by immersion, focus and mindfulness. Let’s accord this new medium the place it has already claimed in our culture. What on earth is there to be afraid of?
Published 4 February 2010
Original in English
First published by New Humanist 1/2010
Contributed by New Humanist © Michael Bywater / New Humanist / Eurozine