Stuart Moulthrop (re)mediates the interpretation (narrativists) vs. configuration (ludologists) debate by going macropolitical.
This essay got its start in my keynote remarks for the Digital Arts and Culture Conference in the Spring of 2001 and took its present form during the Summer and Fall of that unforgettable year. Those were not easy months in which to think about changes in media and culture, however momentous. With the implosion of the Internet bubble, the future that had recently looked so glorious tumbled suddenly from promise to delusion. A popular song from those days sardonically recalls "the time when new media / was the big idea," suggesting that even nostalgia had come to work on Internet time. Or had that sort of time run out? As the digital generation's ecstatic orbit swung through the dark side of the business cycle, some wondered if the abrupt end of the boom might send the American economy into the same dismal straits as the Japanese (Krugman 2001). To make matters vastly worse, September saw the infamous terror attacks that killed thousands in the U.S. and touched off a conflict of indefinite scope and duration. Much indeed has changed.
With troops on the ground and war talk in the air it has become difficult to give much space to art, literature, and other forms of cultural production, let alone their critical controversies. The going gets even harder if one's concerns run less to the polite than the popular, not to novels or plays but to games and simulations. The subject of play is inherently troublesome in a postindustrial or neo-Taylorist regime, even without economic troubles or terrorist threats. Yet games and play demand serious attention even in such times as these, and perhaps especially now. As Donna Haraway observed in 1985, "we are living through a movement from an organic, industrial society to a polymorphous, information system -- from all work to all play, a deadly game" (Haraway 1991, 161). Some might describe this shift as decline or decadence, a falling away from moral certitude into confusion or relativism -- a weakness hardly to be tolerated in wartime. But such dismissals obscure the primary significance of the change.
As people explore the affordances of digital communications networks, social institutions and practices take on different characters, moving from what Pierre Lévy calls the "molar" or mass form of industrial society toward a more intimately networked "molecular" state (Lévy 1997, 40-42). This transition affects all modes and sectors of the social, from commerce and politics to language and culture; and it continues despite the exigencies of geopolitics. The developments discussed in this chapter -- a shift from narrative to ludic engagement with texts and from interpretation to configuration as a dominant approach to information systems -- are in fact inevitably implicated in the current upheavals of society. This is true enough in the general sense that terror victimizes whole nations; also in the sense that the world's current divisions reflect what Haraway calls an "informatics of domination" (Haraway 1991, 161) that increasingly polarizes the liberal and the fundamentalist. However, there may also be more specific links between the emerging culture of serious play and the crises of the new century.
In the turn from consumption to participation, from interpretative to configurative practices, we find ourselves in a new relationship to media. Since configuration requires active awareness of systems and their structures of control, this turn allows us to resist the assertion of invisibility or transparency in communications systems -- a danger that seems particularly pronounced in these new wars of the 24-hour news cycle. It may happen that in refusing the transparency of media we make ourselves better able to interrogate the nature of the conflict, perhaps even to understand more clearly what we mean when we talk about war and other deadly games. This is to cast the discussion in very broad terms, however, taking up large social concerns, including what in an age of writing came to be called literacy. That term may no longer apply without significant alteration, but it still seems true that a general understanding of media, the prevailing logic of production and reception in our modes of communication, is conditioned by local instances and practices. Or to put this in Lévy's terms, the emergence of molecular society involves scalar similarities: what holds in large holds also in little. Thus even apparently parochial and academic controversies, such as those to which we now turn, may reflect a more significant process.
Let the Games be Games
There is a growing consensus for change among some of those concerned with creative work in digital media. This sentiment can be seen increasingly in interactive art pieces, conference papers, manifestos, web logs, and journals such as the newly launched Game Studies. Like all so-called movements, the partisans of digital game theory or ludology comprise a loose and fractious community, but most share at least one premise. We feel that narrative in certain conventional senses -- mainly defined by the theater, the novel, and cinema -- no longer animates the work we find most interesting as creators and/or critics. Some will insist we reached the turning point long ago, at a time when people such as Jay Bolter, Michael Joyce, George Landow, and I were content with primarily literary models. More radical thinkers began to discard such approaches long ago; Espen Aarseth did so even before his landmark study of "cybertext." However we date the change, we should be able to agree on what Aarseth would call the "ergodics" or pathwork of the moment: we have reached a fork in the road. Beyond this point the traditional narrative interest leads one way, while a second track diverges. We do not yet have a very good name for this other path, though we can associate some concepts with it: play, simulation, and more generally, game.
This parting does not spell the end of electronic literature. Poets, for whom narrative has perhaps always been more affordance than obsession, have turned very playful indeed where the digital interface is concerned, as the work of Talan Memmott (http://memmott.org/talan/works.html) or of Megan Sapnar and Ingrid Ankerson in PoemsThatGo (http://www.poemsthatgo.com) attests. We will no doubt continue to see important projects that call themselves cybertext or hypermedia but retain a deep investment in words, along with a rich sense of symbolism and nuance. Some of these projects may even have major narrative elements, as in the Myst saga or Nick Montfort's interactive fiction, Winchester's Nightmare. Yet it seems clear that some people involved with digital media will be much more strongly drawn to ludic forms, from complex ecological simulations and virtual cosmogonies down to first-person shooting contests. For a few the shift may mark a strong departure from previous work and training. For others it may seem no choice at all, but the only logical response to contemporary media and culture.
In some cases this stance may entail a certain opposition. Markku Eskelinen notes elsewhere in this volume: "If I throw a ball at you I don't expect you to drop it and wait until it starts telling stories." Yet as Eskelinen and others believe, many followers of digital culture do drop the theoretical ball, insisting that so-called interactive forms be engaged in a general project of storytelling. Among Eskelinen's favorite opponents is Janet Murray, whose remarkable insights about digital art come mixed with an oddly antique strain of narrative theory that seems bound to annoy even lapsed poststructuralists. In Murray's view, digital productions, from adventure games to multiuser environments, are all formally deficient because they refuse to sanction singular outcomes. While such texts may provide a type of closure, Murray considers this effect a counterfeit:
...electronic closure occurs when a work's structure, though not its plot, is understood. This closure involves a cognitive activity at one remove from the usual pleasures of hearing a story.... There is no emotional release or perception of fittingness, just a sense of going from the unknown to the known. This is very different from and far less pleasurable than our more traditional expectations of closure, as arising from the plot of the story and marking the end point of an action. (Murray 1997, 174)
The debate between ludologists such as Eskelinen and neo-Aristotelians such as Murray has many dimensions, but few seem as definitive as the insistence on catharsis as the proper source of narrative pleasure. Murray's position does seem debatable on theoretical grounds: her critics complain that she shows little interest in either the modernist agenda of formal experiment or the postmodern critique of literary ideology.
Indeed, any theory based on "hearing a story" seems notably backward after the deconstruction of language as presence. Yet, however controversial, Murray's critical apostasy is really a side issue. In the final analysis the objections to Murray by Eskelinen and company are not theoretical but practical.
At least in the kind of narrative Murray champions, the reader's primary "cognitive activity" consists of interpretation. Our ritual release of pity and fear arrives when we fully understand the relationships among characters and the pattern of causes that constitute a plot; or to expand beyond narrative terms, when we grasp the structure of metaphor and memory that informs a lyric, meditation, or confession. Our engagement with the text is driven by the desire to apprehend the structure in its entirety. As Eskelinen points out, we expect readers to study every word of a literary work, but Web surfers, Multi-User Domain dwellers, game players, and others involved with ergodic texts come under no such obligation. Indeed, game play often involves limiting engagement with the work, avoiding irrelevant or distracting details. One observer of digital culture, Steven Johnson, describes "information filtering" as a primary concern of all electronic discourse (1997, 32). Murray seems not to be interested in such strategies, however. Her notion of "electronic closure," the moment at which the reader of an interactive text understands the rationale of its design and likely limits of its productive capacity, reasserts the regime of interpretation. As Murray sees it, configuration serves interpretation.
Yet the story of gaming may not be so simple even when games are mistaken for stories. Murray herself recognizes that those engaged with electronic texts sometimes fail to read for the plot; indeed, sometimes they cease to be readers and turn into players. Hamlet on the Holodeck begins with an account of Captain Janeway from the Star Trek: Voyager television series, who logs many hours in a simulation called Lucy Davenport, a concoction that resembles the novel Jane Eyre (a kind of Jane Away or Eyre in Space). Janeway's character in the simulation serves as governess to the children of a Rochester-like figure, with all the familiar erotic tensions. Yet as Murray notes, Janeway seems less engaged with the cathartic "moral physics" of the plot than with tending to her simulated charges and maintaining the imaginary household. She subverts the closural design of the "holonovel" by sticking perversely to the middle of things, a behavior Murray finds more intriguing than problematic (Murray 1997, 16). This interesting subtlety surfaces at other points in her study as well. In a later section discussing the adventure game Myst, Murray makes a point I would later echo with respect to its sequel, Riven: namely, that the authorized, successful solutions to the game are less interesting than the more numerous losing outcomes (Murray 1997, 142). Murray argues that the game's main charm lies in protracted exploration rather than end-directed questing, a practice I call misadventure (Moulthrop 1999).
These partial recognitions do not much impress the game theorists. From Eskelinen's perspective, neither Murray nor I saw very far into the matter because we thought about games as distorted or perverse narratives, not as cultural forms in their own right. We therefore missed a crucial distinction. In games the primary cognitive activity is not interpretation but configuration, the capacity to transform certain aspects of the virtual environment with potentially significant consequences for the system as a whole. As Eskelinen says, expanding upon Aarseth, "the dominant user function in literature, theater, and film is interpretative, but in games it is the configurative.... in art we might have to configure in order to be able to interpret, whereas in games we have to interpret in order to be able to configure, and proceed from the beginning to the winning or some other situation." As will be apparent, the difference between "winning" and "other" situations requires further scrutiny; but for the moment it is sufficient to recognize that ludologists set aside narrative because they wish to focus on "configurative practice," as Eskelinen calls it. This shift could be profoundly important for the future of digital culture.
In those classic Infocom games of the 1980s, unparsable commands would sometimes elicit something like this from the program: "I'm sorry, you have used the words profoundly important in a way I don't understand. Please try again." As perhaps we should. The claims made here for digital game culture may seem at odds with the state of the art. In the popular mind and marketplace, the terms video game or computer game suggest products such as Mortal Kombat, Tomb Raider, Half-Life, Evil Dead, Quake, Doom, and Unreal. To be sure, the game business has also delivered better fare: the epic adventures of Infocom and Cyan, the attempts to reinvent game culture by Brenda Laurel's Purple Moon, complex entertainments such as Bad Day on the Midway or Grim Fandango, and triumphs of simulation including Black and White and The Sims. By the same token, the best work in older media depends on vast quantities of disposable output which somehow never come to mind when we use the word literature. We can have no serious drama like Copenhagen without a dozen recycled Producers, no Summer of Sam without a handful of Lethal Weapons, no short stack of Annie Proulx absent mounds of Jack Welch.
Nonetheless, a different standard seems to apply where play is involved. For complicated reasons, computer games seem more keenly exposed to cultural critique than most older forms. This has partly to do with every adult's natural phobic response to teenagers, a complex that stems as much from memory and self-contempt as from fear of the other. It may also proceed from industrial culture's deep distrust of configurative practices. Whatever the reason, many people, especially in the United States, find games at least vaguely antisocial. Noting their capacity to debase and desensitize, the mavens of morality particularly deplore violent games, and perhaps with some reason. As Simon Penny points out in this volume, specialists in military training believe that simulated killing quite literally makes the real thing easier.
Murray registers the need "to find substitutes for shooting off a gun that will offer the same immediacy of effect but allow for more complex and engaging story content" (Murray 1997, 147). Eskelinen and the ludologists may disclaim Murray's concern for story, but it seems very hard these days to defend digital gunplay, even in overtly antiterrorist scenarios like Counter-Strike. Less militant offerings come under suspicion as well, for instance when we learn that earlier versions of Microsoft's Flight Simulator could be used to reenact the attacks on the World Trade Center. Perhaps a proper study of play might lead to Murray's pacifist alternative, but just as arguably any account of game culture must begin with what the market offers, including its worst celebrations of carnage. Thus the attempt to find social significance in games requires a certain intellectual courage -- or at least that is one name for it.
What some call courage, others may consider opportunistic chutzpah; what is ludology if not a professional stratagem? Games of all sorts, not just those invented since the microprocessor, have yet to receive careful academic attention. Cinema has had its Eisenstein, de Lauretis, and Deleuze, literature its Derrida, Foucault, and Cixous, but with a few notable exceptions, the study of games as a cultural form has yet to begin. Games thus comprise an untheorized frontier whose blankness seems very attractive for those who would rather set than follow precedent. Ludologists often characterize the relationship of narrative and games in terms of colonization, casting narrative in the role of cultural empire; but such critiques may ricochet, for as western history demonstrates, rebellious colonies are sometimes empires in embryo.
For those of cynical disposition, the turn from narrative to gaming may seem just another power play in the modern academy's Beirut-of-the-mind. This observation may appear crass, but practical matters often do. Much as we try to separate the work of theory from squabbles over cultural funding, faculty salaries, and tenure decisions, political realities must be acknowledged. Declaring the independence of digital game studies from narratology may mean seceding from literature departments, film studies programs, and perhaps even arts faculties. It could mean forging new alliances both inside and outside the academic community. If such speculations seem rash, consider that profits from computer games surpassed those of popular film before the turn of the century. As Espen Aarseth points out in his introduction to the Game Studies journal, game development is a billion-dollar industry with no clear research agenda (Aarseth 2001).
Cynicism has its uses, but also its limits. A billion-dollar industry is as much a cultural as an economic phenomenon. Or to put this another way, what seems merely a professorial turf war may in fact embody a more profound generational conflict. The turn from narrative forms such as plays, novels, and films to ludic forms such as games and simulations marks the emergence of a younger cohort who acquired their orientation to language as much from dynamic systems as from Aristotelian or even modernist genres. Those who find this group's concerns childish, shallow, or improperly pleasurable may need to examine their premises lest they find themselves on the wrong side of O.B. Hardison's "horizon of invisibility" (Hardison 1989, 5). Of this barrier Hardison notes: "Those who have passed through it cannot put their experience into familiar words and images because the languages they have inherited are inadequate to the new worlds they inhabit." What we have here may indeed be a failure to communicate, with much to learn from the breakdown.
Turmoil in the academy often mirrors fundamental social shifts. Games are profitable because they are broadly popular, and this popularity does not depend entirely on simulated violence or other crude wish fulfillments. Just as there are more generic possibilities than first-person shooters, so there is more to game culture than simple aggression. Consider the case of the promotional game produced in the summer of 2001 to publicize Steven Spielberg's A.I. (see sidebar). Consisting of an elaborate system of puzzles distributed over numerous web sites, the game was only tenuously connected to the content of Spielberg's film, and while its main premise was narrative and interpretive (a murder mystery), its primary appeal was procedural. Players had to decipher obscure references and codes, in some cases involving messages buried in the infrastructure of web pages. Many participants in the game found it considerably more interesting than its cinematic pretext. As one put it: "The game is great. The movie is garbage" (Gallagher 2001).
While this development may seem highly salient, it should not be surprising. Games -- computer games in particular -- appeal because they are configurative, offering the chance to manipulate complex systems within continuous loops of intervention, observation, and response. Interest in such activities grows as more people exchange e-mail, surf the World Wide Web, post to newsgroups, build web logs, engage in chat and instant messaging, and trade media files through peer-to-peer networks. As in various sorts of gaming, these are all in some degree configurative practices, involving manipulation of dynamic systems that develop in unpredictable or emergent ways. More importantly, as Aarseth says, they may only be fully understood as active enterprises: in order to know what they are truly about, we must become involved in production or play (Aarseth 2001).
It might be absurd to suggest that all interactive media are species of game, but games do seem to offer a useful way of thinking about such media. Games model or inculcate a crucial set of cognitive practices. In the older cultures of print and broadcasting, the term literacy came to represent the fundamental capacity to process information -- that is, primarily to interpret. It may be possible to expand the concept of literacy to cover digital systems, as in Nancy Kaplan's argument for hypertextual "E-Literacies" (Kaplan 1995). On the other hand, the shift from interpretation to configuration may require something more than revision, perhaps even a fresh conceptual start, as in Greg Ulmer's "electracy" (Memmott 2001). By looking at cyberspace through the lens of game play, scholars may find it easier to resolve this question, modifying or supplementing the old concept of literacy. Or as one particularly astute critic sums up: "The more we see life in terms of systems, the more we need a system-modeling medium to represent it -- and the less we can dismiss such organized rule systems as mere games."
The source of this quotation should give us pause. It comes not from Aarseth, Eskelinen, Jesper Juul, Marie-Laure Ryan, Gonzalo Frasca, or some other master of digital games. It was written by the eminent Aristotelian herself, Janet Murray (Murray 1997, 93).
This confluence of sentiments, if not of doctrine, suggests that ludology and narratology may not be absolutely antithetical. In some respects, Murray seems to value configurative practice quite highly. She defends the aesthetic of the "multiform story" against critics who find such work simply incoherent (89); she points out that the computer is an "engine," not a broadcast receiver (72), and holds that the key to future artwork lies in "procedural composition" (275); she argues that interactive design must find "formats" appropriate to digital technologies, rejecting those inherited from earlier media (64). In her work following Hamlet on the Holodeck Murray emphasizes the uniqueness of digital forms, insisting that "[w]e do not need designers who can produce more-attractive interfaces with the same formats of communications. We need designers who can re-think the processes of communication, exploiting the capacity of the digital environment to be more responsive to human needs" (Murray 1999, 4).
We might wonder how Murray can square these quite sophisticated views with an apparently naïve classicism. If we need new formats, why re-impose the traditional architectures of fiction and drama? If digital technologies take us so far from writing, print, and broadcasting, how can we resort to the ostensible simplicity of the Poetics? Eskelinen charges that Murray lacks interest in theoretical insights developed since the Second World War, but it would be unfair to accuse her of simple negligence. In fact Murray's positions represent an American cybernetic pragmatism whose serious engagement with technological realities deserves a measure of respect, even if one takes issue with some of its implications.
Murray adopts her Aristotelianism at least in part from Brenda Laurel, a figure whose theoretical contributions are deeply informed by her experience as a commercial software developer (Laurel 2001). Murray herself designed groundbreaking simulations at MIT, pioneering educational multimedia. Her apparent lack of regard for contemporary theory may reflect a countervailing emphasis on practice which perhaps fulfills George Landow's prediction that emerging media will provide an empirical testing ground for textual theories (Laurel 2001, 11). In other words, Murray's approach to theory comes via application rather than contemplation -- making her, in Aarseth's terms, a legitimate player. She knows better than most that in the so-called new media it is not enough simply to describe or postulate differences: they must be produced in a marketplace of objects as well as ideas. She thus declares that the future of digital creativity depends on popular as well as elite efforts:
...the shape of narrative art and entertainment in the next few decades will be determined by the interplay of these two forces, that is, the more nimble, independent experimenters, who are comfortable with hypertext, procedural thinking, and virtual environments, and the giant conglomerates of the entertainment industry, who have vast resources and an established connection to mass audiences. (Murray 1997, 252)
With a few modifications -- say, striking "narrative" and substituting "cybertext" for "hypertext" -- this forecast might pass among the ludologists without strong objection. Certainly their focus on popular entertainment suggests they might see themselves in Murray's equation of forces, no doubt among the iconoclasts and pioneers. But while this analysis may narrow the division between games and narrative, the separation cannot be entirely erased. At least one salient issue remains. Although Murray and the game theorists might agree that the future depends on a contest of established and emergent interests, they probably do not see the same outcome for this struggle. Indeed they should not, if game studies have any critical value.
Do Not Immerse
Murray's approach to new media seems both culturally and technically conservative; for some indeed this may be its main virtue. Like the design theorists she most admires, Laurel and her mentor Donald Norman, Murray assumes that new media should provide highly efficient, minimally obtrusive tools. She seems to agree with Norman that the best computer is an invisible computer, at least where narrative is concerned:
Eventually all successful storytelling technologies become "transparent": we lose consciousness of the medium and see neither print nor film but only the power of the story itself. If digital art reaches the same level of expressiveness as these older media, we will no longer concern ourselves with how we are receiving the information. We will only think about what truth it has told us about our lives. (Murray 1997, 26)
Murray's claims about "immersive" technologies seem accurate enough. Whether we are considering a fantasized holodeck or an actual computer game, interactive media tend to envelop the player in consistent rule systems, if not virtual realities. Indeed, this immersiveness or "holding power" is a major aspect of game experience, as Sherry Turkle pointed out long ago (Turkle, 64). Murray's theory might thus be useful even to game studies, at least up to a point. However, it remains to be seen whether configurative media such as games will necessarily follow the same logic of disappearance that has governed print and film. Configuration after all differs fundamentally from interpretation. Although their behavior may be constrained by the arbitrary construct of a game, players are obliged to know the rules and repeatedly to consider a range of possible interventions, which leads to Murray's controversial distinction between "electronic closure" and catharsis. Arguably the immersiveness of games differs crucially from that of narratives, and much may depend on this difference.
Reducing any medium to transparency confines it to a fixed and usually limited range of function. Consider an example from recent history. We have heard many times that the U.S. craze for citizen's band (CB) radio died out at the end of the 1970s when users found nothing of importance to say on the air. This version of the story appears to confirm Murray's assumptions about media. Ostensibly, people stopped taking an active part in their audio entertainment and went back simply to listening. After briefly dropping its cloak of invisibility, radio reverted to its transparent state, a commercial, one-to-many technology under stable corporate control. The CB fad thus apparently reconfirms the general rule that media are private properties, not civic services. Never mind that commercial radio began to switch to talk formats even as CB faded, squeezing some of the upstart's crude popularity into narrower and more profitable channels. Disregard the very plausible connections from CB through Internet relay chat to instant messaging, where a popular form of communication has once again been devoured by oligopolies. Ignore the suggestion that private control of media may be an aberration, not the natural order of things. Transparent media may not bear much scrutiny, but happily for business elites, they do not present themselves for inspection.
If Murray's process of disappearance is truly inevitable, it seems clear that the turbulent "interplay" of forces she posits must eventually subside into the steady state of mass markets dominated by a few major interests. An invisible computer is most likely a monopolist's best friend -- a dictum that seems as true at Sun Microsystems, home of the NetPC, as in any precinct of the Redmond campus. Molecular society emerges in a paradoxical moment, as great transformations always do. The irruption of popular empowerment coincides with the climax phase in the evolution of oligopolies, a final division of very great spoils.
One can address this double logic more or less dialectically, as does Jay Bolter in his theory of remediation (Bolter and Grusin 2000), or one can learn from the story of Napster and other recent popular assaults on traditional profit centers that the major interests are less likely to seek compromise than continued dominance. The development of media in this century seems inevitably fraught with controversy, and as citizens of intensively informed societies we have great stakes in these oppositions, even when they appear to concern mere entertainment or play. Therefore if narrative forms play any role in the process by which "we lose consciousness of the medium," there may be good reason to turn away from storytelling as the prime agenda of art.
This schism could have significant consequences. Eskelinen defines "configurative practice" rather narrowly as the player's strategic operation upon the elements of a game, but it is possible to broaden this term significantly. If we conceive of configuration as a way of engaging not just immediate game elements, but also the game's social and material conditions -- and by extension, the conditions of other rule-systems such as work and citizenship -- then it may be very important to insist upon the difference between play and interpretation, the better to resist immersion. Any analogue of literacy for interactive media would probably need to encompass such resistance.
It remains to be seen, though, whether game theory will help in this social and cognitive undertaking. As promised, we return to Eskelinen's assertion that game players must "proceed from the beginning to the winning or some other situation." If digital game theory concerns itself primarily with choices that lead to winning situations -- solutions circumscribed within a narrow calculus of outcomes -- then it may be just as inimical as any narratology to a proper understanding of configurative practice. Limiting the definition of games to systems with simple distinctions between winning and losing could restrict this study to zero-sum antagonism, a domain that seems every bit as constrained and potentially obscuring as narrative. It might also lead the study of digital games into uncritical acceptance of existing genres. To be blunt, if we tie configuration inflexibly to some duelistic protocol, we might produce a game theory whose insights are limited by its gunsights.
In fact, Eskelinen complains at length about the limited conceptions of mass-market games, asking why they insist on simpleminded visual realism and leave so little room for strategic variation even within competitive contexts. He has proposed hypothetical games where players manipulate data gathered from external reality, or where game elements intrude into real space. In one of these scenarios, "[t]he Pokémons on the screen and in your living room... team up and steal your credit card numbers to order reinforcements" (Eskelinen 2001).
Although this last example is oppositional if not militant, Eskelinen's probes seem generally to lead away from simple contests toward the sort of complex, open-ended play that is more often called simulation than game.
Improvisations and simulations seem even further removed from teleological narrative than agons, whose general principle might still be confused with cathartic closure. Moreover, the kind of configurative practice involved in these activities offers an excellent countermeasure against the transparency of media. Eskelinen's hypothetical Pokémon uprising, for instance, nicely reverses Murray's disappearing act. While the experience would no doubt teach some "truth about our lives," that truth concerns the autonomy we grant to increasingly dynamic media. It might remind us that cyberspace is not a storybook or a moving picture but a complex virtual environment that should never be allowed to become second nature -- or which, at any rate, ought never to be given free access to our charge accounts.
Molecular Society and the State of War
This chapter began, however, by alluding to a different sort of credit or credibility: our need for information and perspective in a continuing crisis, a theme to which we must now return. It is one thing to dream up mischievous invasions by surrealist software, but what can such fantasies tell us about a world where terribly real attacks have murdered thousands? The events of late 2001 have been variously described as the end of postmodernism, the death of irony, and a very bad time for comedians. No doubt they must also overshadow any attempt to speak seriously about games; however, that shadow need not become a total eclipse.
At this point we might want to think beyond games in their most literal sense, or rather to see them as elements of a larger process, Lévy's transition from "molar" to "molecular" social orders. Molar technologies "manage objects in bulk, in the mass, blindly, and entropically" (Lévy 1997, 40), whereas molecular technologies embody both a miniaturization of control (literally down to the molecular level in nanotechnology) and a devolution of that control throughout the human community. This pattern manifests itself as much in media and culture as it does in manufacturing or material technologies. Lévy notes:
Information technology is molecular because it does not simply reproduce and distribute messages... but enables us to create and modify them at will, provide them with a finely graduated capability of reaction through total control of their microstructure.... Thus digitization reestablishes sensibility within the context of somatic technologies while preserving the media's power of recording and distribution. (Lévy 1997, 48-49)
Lévy's allusion to the "somatic" here is particularly important. The term means for him the embodied, the personal, that which requires particular human intervention or presence. Molecular media do not simply invite configurative practice, they require it. At their best they do not vanish into perfect transparency but present themselves as "a system-modeling medium," to borrow Murray's phrase, in which we must place configuration before interpretation. Increasingly, ordinary life requires us to apprehend these systems, to understand their rules and develop effective strategies for managing their effects and affordances. Thus, in their capacity to teach what Aarseth calls "ergodics" or multidimensional pathfinding, games represent an enormously powerful, perhaps a fundamental form of molecular culture. Games may be our surest route from the old world of "molar" literacy into electracy, ergodics, e-literacy, or whatever we finally call the new regime.
But how does this regime square with the latest new world order, with Mr. Bush's "first war of the 21st century," or as skeptics have it, the First Crusade? Since we have set aside Murray's values of transparency and immersion, we can assume that molecular media will not lend themselves readily to any general mobilization. On the contrary, cultivating more conscious, active engagement with information systems implies at least the possibility of opposition. Indeed, as we have already suggested, the political economy of media in the 21st century seems to demand dissent and intervention. The declaration (or acclamation) of war may distract attention from preexisting conflicts inherent in information culture.
The legal scholar Lawrence Lessig explains that "[w]e can build, or architect, cyberspace to protect values that we believe are fundamental, or we can build, or architect, or code cyberspace to allow those values to disappear. There is no middle ground" (Lessig 1999, 6). As Lessig sees it, a confluence of government and commercial interests must severely impede any movement toward Lévy's molecular society unless a constituency develops to oppose them. This resistance must be active and deliberate. Lessig insists that "[t]here is no choice that does not include some kind of building. Code is never found; it is only ever made, and only ever made by us" (Lessig 1999, 6).
It is tempting to compare Lessig's semi-utopian "us" to Lévy's equally hypothetical notion of "the just," the quiet minority of believers in human potential who embody "both the necessary condition of the universe and the superfluity that makes it worthwhile" (Lévy 1997, 39). To the extent that a culture of digital play increases the reach of this always dubious pronoun -- producing more people who refuse to play the accepted storylines, or who know enough about the techne of code to rewrite it -- then it may set a limit on the power of purely interpretive media. Such may be the dream.
No limit once set, though, ever escapes testing. Writing back in the old century, before the present terror war boiled up, Lessig worries primarily about efforts to foreclose on the development of open-source software. Access to source code, he rightly notes, allows users to gain control of system architecture, the most powerful regulating factor in cyberspace. Indeed, the ability to intervene at that architectural level must be another key component of molecular, configurative culture. He might see it as the most serious practical opposition that can come from the ethos of games and simulations; but its status remains very much in doubt.
Unlike Lévy, Lessig is a most reluctant utopian. He points out that the communitarian ethic of the early Internet has been largely eroded by dot-com profiteering, even as government and press have systematically demonized autonomous code writers as outlaw "hackers" (Lessig 1999, 8). These observations are sobering. No doubt the struggle over open code will become a major battlefront when the full implications of the war on terrorism present themselves. As government defines its concept of extended, perhaps unlimited conflict, controversies seem sure to erupt over privacy of communications, access to effective encryption, freedom of association and speech. If terror attacks recur or escalate, decision makers may move toward more rigid, unconfigurable architectures -- in effect locking down the terms of the national security state. Such developments would push back toward a more centrally managed, homogeneous, "molar" organization of media and society.
As one considers the possibilities for repression, the attraction of the old zero-sum game becomes strong, especially if one is among the majority of U.S. citizens who voted for someone other than George Bush in 2000. Yet if a configurative approach to media teaches anything, it is an appreciation of complex and multiple situations. As Thomas Pynchon noted, They-systems tend to breed We-systems; but to place absolute faith in either is to embrace delusion (Pynchon 1973, 638). The best kind of game culture must teach us when to jump outside the game. The contest of suits-versus-hackers, libertarian idealists against the military-infotainment state, is only part of a larger, and in Haraway's words a "deadly" interplay.
As Lévy notes, the climax of automation renders unto machines the machinic and leaves to us the unavoidable work of human beings: "the production of the social bond" (Lévy 1997, 21). The current state of war -- whatever anyone means by that term -- results not from one but many ruptures of this bond. Emergence of Lévy's molecular society is threatened just as desperately by international terror as it is by reactionary instincts of the new world order. Indeed, our culture's very capacity for change, its headlong rush into new forms of experience, seems a major cause of our enemies' resentment. These facts demand consideration no matter what sort of ballot one cast in recent elections. Production of a new, more just social bond goes far beyond the reinvention of media, or of literacy and its sequelae. We know that this task involves blood and anger as much as stories and games: all the more reason to be very, very serious about the subject of digital play.
Aarseth, Espen (1997). Cybertext: Perspectives on Ergodic Literature. Baltimore: Johns Hopkins University Press.
---. (2001). "Game Studies Year One." Game Studies 1, no.1 (2001). http://www.gamestudies.org/0101/editorial.html.
Bolter, Jay David, and Richard Grusin (2000). Remediation: Understanding New Media. Cambridge, MA: The MIT Press.
Eskelinen, Markku (2001). "The Gaming Situation." Game Studies 1, no.1 (2001). http://www.gamestudies.org/0101/eskelinen.
Gallagher, David F. (2001). "Online Tie-In Outshines Film It Was Pushing, Fans Say." New York Times, July 9, 2001, C1, C6.
Haraway, Donna J. (1991). Simians, Cyborgs, and Women: The Reinvention of Nature. New York: Routledge.
Hardison, O.B. (1989). Disappearing through the Skylight: Culture and Technology in the Twentieth Century. New York: Viking.
Johnson, Steven (1997). Interface Culture: How New Technology Transforms the Way We Create and Communicate. San Francisco: Harper Edge.
Kaplan, Nancy (1995). "E-Literacies: Politexts, Hypertexts, and Other Cultural Formations in the Late Age of Print." http://iat.ubalt.edu/kaplan/lit/.
Krugman, Paul (2001). "The Fear Economy." New York Times Magazine, September 30, 2001, 38-41, 54-55, 84-85.
Landow, George P. (1992). Hypertext: The Convergence of Contemporary Theory and Technology. Baltimore: Johns Hopkins University Press.
Laurel, Brenda (2001). Utopian Entrepreneur. Cambridge, MA: The MIT Press.
Lessig, Lawrence (1999). Code and Other Laws of Cyberspace. New York: Basic Books.
Lévy, Pierre (1997). Collective Intelligence: Mankind's Emerging World in Cyberspace. New York: Plenum.
Memmott, Talan (2001). "Toward Electracy: A Conversation with Gregory Ulmer." Beehive 4, no.2 (June 2001). http://beehive.temporalimage.com/content_apps34/app_a.html.
Moulthrop, Stuart (1999). "Misadventure: Future Fiction and the New Networks." Style 33, no.2 (1999): 184-203.
Murray, Janet (1997). Hamlet on the Holodeck: The Future of Narrative in Cyberspace. Cambridge, MA: The MIT Press.
---. (1999). "Interactive Design: a Profession in Search of Professional Education." The Chronicle of Higher Education 45, no.33 (1999): 4-6.
Pynchon, Thomas (1973). Gravity's Rainbow. New York: Viking.
Turkle, Sherry (1984). The Second Self: Computers and the Human Spirit. New York: Simon and Schuster.