

Emergence: The Connected Lives of Ants, Brains, Cities, and Software

Author: Steven Johnson
Publisher: New York: Scribner, 2001
Review Published: October 2002

 REVIEW 1: Laura Kertz
 REVIEW 2: Nils Zurawski

Emergence may be the hot topic today across a wide range of disciplines, but in his new book Steven Johnson describes how a penchant for understanding systems in terms of structures and hierarchies has long skewed understanding of various natural phenomena and constrained development in certain technological areas. Subtitled The Connected Lives of Ants, Brains, Cities, and Software, Johnson's highly readable report on the topic offers accessible examples of emergent systems and suggests the importance of incorporating "bottom-up" organization into everything from city planning to political protest to entertainment programming.

A co-founder of the now defunct Feed and a regular commentator on technology and media, Steven Johnson is well known in Silicon Alley. In this, his second book examining digital culture, Johnson begins with an analysis of slime mold, or more accurately, an analysis of the analysis of slime mold. Johnson explains how early study of the organism, which can exist as discrete single-celled units or, under the right conditions, aggregate into a slow-moving colony, was waylaid by the vain pursuit of a "pacemaker." Pacemaker cells are common in biology; they send out biochemical signals to instruct their neighbors to carry out some activity -- in the case of the slime mold, presumably, ordering the various cells to aggregate and disperse.

After years of observation failed to identify a slime mold pacemaker, it took a biomathematician who had been reading Alan Turing to solve the slime mold mystery. Readers familiar with the concept of emergent phenomena will already have a hunch: slime molds aggregate not on the signal of a single pacemaker cell; rather, micro-behaviors carried out by every cell (in this case the release of cyclic AMP and the response to encountering concentrations of the same) produce the macro-pattern of colony aggregation and dispersal.
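The logic is easy to sketch in code. The toy simulation below is a loose sketch in the spirit of agent-based slime mold models, not Resnick's actual program; the grid size, evaporation rate, and movement rule are invented for illustration. Each cell follows just two local rules: deposit a signal where it stands, then step toward the strongest nearby signal. No pacemaker exists anywhere, yet the population clumps together:

```python
import random

SIZE = 20  # toroidal grid

def step(cells, field, evaporation=0.9):
    # Rule 1: every cell deposits signal (standing in for cyclic AMP) at its position.
    for x, y in cells:
        field[(x, y)] = field.get((x, y), 0.0) + 1.0
    # Rule 2: each cell moves toward the neighboring position with the most
    # signal (ties broken randomly by shuffling before taking the max).
    new_cells = []
    for x, y in cells:
        neighbors = [((x + dx) % SIZE, (y + dy) % SIZE)
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        random.shuffle(neighbors)
        new_cells.append(max(neighbors, key=lambda p: field.get(p, 0.0)))
    # Signal decays, so only trails reinforced by many cells persist.
    for p in list(field):
        field[p] *= evaporation
    return new_cells

random.seed(0)
cells = [(random.randrange(SIZE), random.randrange(SIZE)) for _ in range(50)]
field = {}
for _ in range(100):
    cells = step(cells, field)

# The 50 cells now occupy fewer distinct positions than they started with:
# aggregation emerges from purely local behavior.
print(len(set(cells)))
```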

Johnson illustrates the sway that top-down models of organization hold over our interpretation of phenomena with a story involving prominent AI researcher Marvin Minsky. It seems Minsky was peering over the shoulder of colleague Mitch Resnick, who was viewing a video image of a slime mold simulation Resnick had developed based on the chemical signaling model. The estimable Minsky mistook the chemical trails for food, assuming that the organisms were converging on an available resource, not communicating with one another.

Johnson's work takes on a journalistic style, peppered here and there with anecdotes like the Minsky incident. He relates the details of his interviews with various researchers and describes his own experience testing out biofeedback. The effect is an extraordinarily accessible account that challenges the reader to adopt a new interpretive framework and rewards her with plenty of background story and interpretive support. Johnson rarely resorts to the practice of analogizing away intricate concepts; rather, he provides a clear description and example after example.

Leaving the slime behind, Johnson turns to the Ant Queen, challenging our grade school understanding of the monarch insect and her teeming colony of subjects. He segues from Deborah Gordon's revealing studies of ant colony behavior to an historical account of the development of Manchester to software engineering and principles of cybernetics. Here, Johnson reiterates the centrality of feedback in an emergent system: positive feedback can send a system spiraling out of control; negative feedback can keep it finely tuned.
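The two regimes are easy to see in a toy update rule (purely illustrative; the gains and set point are arbitrary numbers): the same loop with a corrective gain settles onto its set point, while an amplifying gain sends every deviation spiraling outward:

```python
def run(gain, steps=20, setpoint=10.0, value=12.0):
    # Repeatedly feed the system's deviation from its set point back into it.
    for _ in range(steps):
        error = value - setpoint
        value += gain * error  # negative gain corrects; positive gain amplifies
        yield value

damped = list(run(gain=-0.5))   # negative feedback: error halves each step
runaway = list(run(gain=+0.5))  # positive feedback: error grows by half each step

print(round(damped[-1], 4))   # hugs the set point
print(round(runaway[-1], 1))  # the deviation has exploded
```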

With these themes established, Part Two focuses on a more thorough analysis of emergence in various settings: the city street, the World Wide Web, electronic media, and computer gaming.

Johnson's faithfulness to Jane Jacobs' The Death and Life of Great American Cities (1961), an analysis of urban development that emphasizes the information exchange of the sidewalk encounter, does away with well-intentioned simplifications (like the notion that being exposed to 'poor people' is somehow good for the rest of us) and explains rather that "encountering diversity does nothing for the global system of the city unless that encounter has a chance of altering your behavior." For example, one might decide "to move out of the neighborhood after you pass the hundredth dot-com kid on a cell phone" (96). Johnson's own love of great cities like New York and Paris and his infatuation with his own West Village neighborhood, also home to Jacobs, infects his discussion of urban patterns. He cites that hallmark of the LA lifestyle, the traffic jam, as an emergent pattern, indicative, however, of a poorly tuned system run amok.

Johnson's examination of older cities like Manchester, and especially Florence, as interfaces for storing information harks back to his previous work. Interface Culture (1997) offered a review of developments in interface design through the years as engineers and average people struggled to devise schemes for organizing and representing vast amounts of information. Himself a notable figure in electronic media, Johnson hits his stride in the section of Emergence addressing the World Wide Web. He describes the lack of feedback inherent in the one-way hyperlink and approaches that constant conundrum: the development of communities online. He describes how the limited bandwidth of online discussion caters to the "crank," filtering out, as it does, the social cues of eye-rolling, toe-tapping, and room-leaving that shield us from such figures offline.

My own experience in developing online community confirms his assessment. At Web Lab in NYC, I worked with Barry Joseph and Marc Weiss who together developed "small group dialog" software for managing asynchronous online discussions that imposed certain constraints on the process. Limiting group size and encouraging accountability, the system in most cases provided the necessary feedback to strike a balance: a rewarding discussion that permitted natural leaders to emerge and inhibited asocial activities like flame baiting and "drive by posting" [1].

Johnson also examines collaborative filtering systems, like the engine driving Slashdot, and addresses perhaps the number two concern regarding civility online: that every netizen might eventually subscribe to the "Daily Me" (as envisioned by Nicholas Negroponte), that is, fine-tune his or her filters for all incoming information based on personal preferences. Under such a scheme, what eventually emerges is a world wide cooperative of narrow-minded clods -- perhaps more diverse, but only marginally more palatable than the perennially reviled teeming masses. Johnson is willing to concede that the masses are only teeming because of the top-down organization of the mass media, and, he proposes, the masses won't persist much longer, not once we start forming bottom-up communities, along the way filtering our media, yes, but also forming ad hoc alliances with the like-minded, fracturing the entertainment markets, and turning the tables on Madison Avenue. (Johnson even suggests that the "Daily Me" phenomenon might be countered by the number of thoughtful media consumers who fine-tune to include a "diversity filter.")
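The core of such a filtering engine can be sketched in a few lines (the users, items, and ratings below are invented, and real systems like Slashdot's moderation are considerably more elaborate): score the items a user hasn't seen by the ratings of similar users, with similarity measured over the items both have rated:

```python
# Hypothetical ratings matrix: user -> {item: rating on a 1-5 scale}.
ratings = {
    "alice": {"jazz": 5, "opera": 4, "punk": 1},
    "bob":   {"jazz": 4, "opera": 5, "folk": 4},
    "carol": {"punk": 5, "metal": 4, "jazz": 1},
}

def similarity(a, b):
    # Cosine similarity over the items both users have rated.
    shared = set(ratings[a]) & set(ratings[b])
    if not shared:
        return 0.0
    dot = sum(ratings[a][i] * ratings[b][i] for i in shared)
    norm_a = sum(ratings[a][i] ** 2 for i in shared) ** 0.5
    norm_b = sum(ratings[b][i] ** 2 for i in shared) ** 0.5
    return dot / (norm_a * norm_b)

def recommend(user):
    # Score each unseen item by similarity-weighted ratings from other users.
    scores = {}
    for other in ratings:
        if other == user:
            continue
        w = similarity(user, other)
        for item, r in ratings[other].items():
            if item not in ratings[user]:
                scores[item] = scores.get(item, 0.0) + w * r
    return max(scores, key=scores.get) if scores else None

# Alice's tastes track Bob's (jazz, opera), not Carol's (punk, metal),
# so the filter surfaces Bob's pick.
print(recommend("alice"))  # → folk
```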

It is here that Johnson's analysis begins to falter. His story of emergence plays particularly well as a description of phenomena in biology and sociology and as a formula for software engineering. (I haven't mentioned his engaging accounts of evolutionary software development and similar efforts in game design.) But his effort in Part Three to translate his theory of emergence to a predictive analysis of media is less sound. His vision of the future depends heavily on collaborative filtering; that is, copious feedback from user/consumers about their preferences and tastes regarding the media they encounter. It is also a future that is all pull.

The trouble (and here one must presuppose there's trouble) is that the current feedback loop connecting media producers with media consumers is unbalanced. An avalanche of programming and advertising comes down on the average user. The media programmer, however, conducts some focus groups, tests out pilots, and peruses some tables of Nielsen ratings -- it's an asymmetric information exchange. To counter it, according to Johnson, first we've got to time-shift our media consumption. This process is already underway with TiVo and its cousins, and when the shift kicks in full gear, the "prime time" death grip on media programming will be broken. Throw in some collaborative filtering, so consumers can identify and ally with those with similar tastes, and soon the whole media sphere will crack wide open. New networks will form from the bottom up, as communities pick and choose their media from a variety of sources, and we the user/consumers will never again have to simply accept what's pushed at us.

It's a great story, and likely much of this will come to pass in the next couple of decades, but some important details have been glossed over. Johnson seems to suggest that if there were just enough avenues available for feedback, we might have better media. But even without a sophisticated filtering system, we do have a network for sharing our preferences with one another. And we all seem to agree that the mass media churn out garbage. Importantly, however, it is garbage that is extraordinarily easy to use. My own recent work looks at the symbiosis between the mass (push) and the niche (pull) markets. Mass market media cull themes from niche markets (say, research on anti-aging or S&M sex) and recycle them, puffing them up and smoothing out the edges to ensure palatability for the largest possible audience, and voila: a Dateline segment or an episode of Law and Order. Many of us can muster some passing interest in cutting edge biomolecular research or even whips and leather, but not as many of us are willing a) to seek out the information from a niche source or b) to identify ourselves (whether publicly or just as a personal acknowledgement) as members of those particular niche markets. Furthermore, our passing interests are just that, passing. We need our mass media every bit as much as the mass media need the niche. And on top of that many of us are lazy.

Johnson acknowledges as much in Interface Culture, where he offers a more compelling account of a media future in his rumination on intelligent agents. (Perhaps this is why he only devotes a few pages to advertising and predictive systems in Emergence.) There (in Interface Culture) he posits a new future for push, but a predictive push: intelligent agents that learn your tastes based on pattern matches with countless other users and suggest programming to you. They push what they think you'll enjoy, based on past pull. He describes the difficulty, however, in fine-tuning such a feedback mechanism. Johnson calls it the "Beatles and Bach syndrome," where a smart prediction from your agent (everyone likes Bach and the Beatles) is of little use to you. Johnson offers a few tips for filtering out "low-information" picks and suggests a more likely scenario of wild turbulence in the music industry/system.
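One plausible remedy for the syndrome, sketched here on invented numbers (Johnson doesn't specify a mechanism; this borrows the inverse-popularity weighting familiar from TF-IDF), is to discount a predicted pick by how universally it is liked. A Beatles recommendation carries almost no information about you in particular:

```python
import math

# Hypothetical data: how many of 1000 users "like" each item, and the
# agent's predicted affinity for this particular user.
liked_by = {"the_beatles": 990, "bach": 950, "obscure_band": 40}
predicted = {"the_beatles": 0.9, "bach": 0.85, "obscure_band": 0.7}
TOTAL_USERS = 1000

def information_weight(item):
    # Rare likes carry more information, just as rare terms do in TF-IDF;
    # an item liked by nearly everyone scores close to log(1) = 0.
    return math.log(TOTAL_USERS / liked_by[item])

scores = {item: predicted[item] * information_weight(item) for item in predicted}
best = max(scores, key=scores.get)

# Despite the lower raw prediction, the rarely-liked pick wins because
# it tells the agent something specific about this user's taste.
print(best)  # → obscure_band
```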

It's easy to find other quibbles with Johnson's examination of trends. He speaks favorably, for example, of the feedback mechanism inherent in the eBay site design, where partners in a transaction rate one another and community members develop a reputation, available for scrutiny by all. Phil Agre, for one, has denounced the eBay mechanism as wildly unreliable, suggesting that users have an incentive to praise one another -- my good marks for you mean more good marks for me [2]. It happens I come out on Johnson's side in this debate, but it's worth noting that the inherent value of the system can't be presupposed.

I find it particularly odd that Johnson insists on tough spam legislation as a bedrock of our media future. The stance is inconsistent with his optimistic view that users will invest considerable energy rating all the media they encounter (from music to advertisements). Users so avid about filtering for the media they want could just as easily develop filters to avoid what they don't. After reading a piece by John Gilmore on the topic [3], I finally conceded my own personal spam war and began investing my energy not in reporting religiously to SpamCop, but in tweaking my Eudora filters ever finer.

Not that I believe the average media consumer could be bothered to do the same. But there are other spam solutions on the horizon that don't involve legislation and don't depend on user participation. SpamAssassin and Vipul's Razor are both tools that resonate with the other systems Johnson describes. SpamAssassin detects spam using textual analysis, and it integrates with Vipul's Razor, a collaborative filtering database.
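The two approaches complement each other, as a toy sketch shows (the rules, scores, threshold, and messages here are all invented; real SpamAssassin rules and Razor fingerprints are far more sophisticated): score a message against textual rules, and also check its fingerprint against a shared database of user-reported spam:

```python
import hashlib

# SpamAssassin-style textual rules: pattern -> score contributed on a match.
RULES = {
    "FREE": 2.5,
    "click here": 1.5,
    "unsubscribe": 1.0,
}
THRESHOLD = 3.0

# Razor-style collaborative database: fingerprints of messages that other
# users somewhere on the network have already reported as spam.
reported = {hashlib.sha256(b"FREE money!!! click here").hexdigest()}

def is_spam(message):
    # Collaborative check first: an exact fingerprint match is conclusive.
    if hashlib.sha256(message.encode()).hexdigest() in reported:
        return True
    # Textual analysis: sum the scores of the rules the message triggers.
    score = sum(points for pattern, points in RULES.items() if pattern in message)
    return score >= THRESHOLD

print(is_spam("FREE money!!! click here"))  # True: fingerprint (and rules) match
print(is_spam("Lunch on Thursday?"))        # False: no rules fire
```

Neither check needs the recipient to lift a finger, which is what makes this family of tools a better fit for the lazy majority than either legislation or diligent personal filter-tweaking.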

More importantly, however, I am skeptical that the average media user will even be bothered to provide all the feedback that powers Johnson's vision of the future. I'm more inclined to consider a future that looks a lot like the present, where the masses consume whatever's most accessible, and the few dig for the rest. That could mean the difference between fine tuning your own individual agent to prowl the media sphere for you and simply using an off-the-shelf version preloaded with preferences for the next big thing.

But prognostication is a tricky business. It's far easier to critique a prediction than to formulate one from scratch, and Johnson's work is a significant contribution. For one, he offers a story of emergence, one that ties together developments across a variety of disciplines. In any case, the value of trend spotting lies not so much in the verity of the predictions as in the handling of the themes that underpin them. For his part, Johnson offers a compelling vision. A true believer in the function and the sway of the avant garde (whether of music, city planning, or software design), he supposes a future that depends entirely on the creativity of the present.

1. For more about the "small group dialog" software and the technique, see http://www.weblab.org/sgd/

2. Phil Agre, "The market in marketplaces: Some notes on the dubious case of eBay," Red Rock Eater Digest, May 2000. [July 21, 2002].

3. John Gilmore, "What to do about spam? Use smarter mail readers," post to Politech list, February 28, 2002. [July 21, 2002].

Laura Kertz:
Laura Kertz is a recent graduate of the City University of New York Graduate Center. Her Master's thesis examines media deviance, contrasting the obscenity wars of print and broadcast with the suppression of DIY technologies in the digital age. Laura currently manages development of a research library on data privacy.  <kertz@earthlink.net>

©1996-2007 RCCS         ONLINE SINCE: 1996         SITE LAST UPDATED: 12.10.2009