“We are living through a movement from an organic industrial society to a polymorphous information system – from all work to all play, a deadly game.”
Donna Haraway, A Cyborg Manifesto
In his final book, published in 1964 at the height of the industrial boom under the title of God & Golem, Inc., the scientist Norbert Wiener asked a question: “Can God play a significant game with his own creature? Can any creator, even a limited one, play a significant game with his own creature?”1 The example he used was trivial: a computer program for playing checkers, written by A.L. Samuel of the IBM corporation. As for the definition of “significant,” Wiener never makes it very clear; but he does observe that just as in the contest between God and Lucifer, the programmer may well lose the game.
He had reason to be nervous. During the war he had worked on electronic targeting mechanisms and had come to conceive of the feedback loop as a model for every kind of purpose, whether of animals or machines. In December of 1944, acting jointly with his colleagues Howard Aiken and John von Neumann, he invited a select group of researchers to join a “Teleological Society” to study the intersections of neurology and engineering.2 The name made use of a term that had previously been reserved for the final causes of speculative philosophers and theologians. Soon after its first meeting, the Teleological Society transformed into the famous Macy Conferences on “Circular Causal and Feedback Mechanisms in Biological and Social Systems” – a title summed up as “Cybernetics” after Wiener had coined the word in 1947.
In the course of that year he publicly renounced any direct collaboration with the military brass and the giant corporations. He was repelled by his wartime experience and sought to exercise his mind against nature alone, a passive, transparent, Augustinian nature harboring no hidden intentions, and not some Manichean universe full of opaque bluffs, evil designs and dissimulations. He did not want his new science to develop as a calculator’s battle against an unseen, calculating enemy.3 This anti-militarist stance placed him at odds with the fiercely anti-communist von Neumann, a mathematical genius and a central figure in the creation of the atom bomb. Von Neumann, who attended Atomic Energy Commission meetings in a wheelchair, is thought to have been among the models for Stanley Kubrick’s Dr. Strangelove.4 One of his theories, developed extensively by the mathematicians at the RAND corporation, sought to identify the most rational strategies for any two-person game by relentlessly calculating all the possible moves of each player.
Wiener saw von Neumann’s game theory as deterministic and scientifically outdated. He preferred the statistical analysis of stochastic processes, and a policy of continuous error-correction rather than any quest for absolute certainty. By the 1960s he was increasingly concerned that decision-making might be taken over by game-theoretical robots, capable of learning checkers and many other things – until one day, like the Golem, they would run amok and unleash some kind of Doomsday Machine. In the face of that final cause, every human game would become insignificant.
Today Dr. Strangelove has receded into the never-never lands of science fiction and game theory no longer unnerves the general public. But for an understanding of the God and Golem equation in the postindustrial information age, one need only look closer into the nature of Wiener’s own research during WWII. Here, in effect, lay the origins of his revulsion. Beginning in 1940, he set to work on a closed-loop information system called an antiaircraft predictor. This was a three-part problem: use radar to record the zigzagging path of an airplane performing evasive maneuvers; calculate the probabilities of its future course based on its past behavior; and convey this information to a servomechanism that would correct the firing of the gun – an operation to be repeated in a continuous, circular fashion. Yet more was at stake than a sensor, a calculator and a servomotor, because the gun, like the enemy airplane, was also connected to a human being. This, for Wiener, was fundamental:
It does not seem even remotely possible to eliminate the human element as far as it shows itself in enemy behavior. Therefore, in order to obtain as complete a mathematical treatment as possible of the overall control problem, it is necessary to assimilate the different parts of the system to a single basis, either human or mechanical. Since our understanding of the mechanical aspects of gun pointing appeared to us far ahead of our psychological understanding, we chose to try and find a mechanical analogue of the gun pointer and the airplane pilot. In both cases, the operators seemed to regulate their conduct by observing the errors committed in a certain pattern of behavior and by opposing these errors by actions deliberately tending to reduce them…. We call this negative feedback.5
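Purely as an illustration – Wiener's actual apparatus relied on statistical filtering far more sophisticated than this, and every name below is invented for the sketch – the predictor's circular logic can be reduced to a few lines: observe the target's past behavior, extrapolate its future position, and correct the aim by feeding back the observed error.

```python
# Toy sketch of a closed-loop predictor (illustrative only, not Wiener's mathematics):
# observe a moving target, extrapolate its next position from its past behavior,
# and correct the "gun" aim by a fraction of the observed error (negative feedback).

def predict_next(history):
    """Linear extrapolation: assume the target keeps its last observed velocity."""
    if len(history) < 2:
        return history[-1]
    velocity = history[-1] - history[-2]
    return history[-1] + velocity

def run_loop(path, gain=0.5):
    """Track a target path; return the absolute aim error at each step."""
    aim = 0.0
    history = []
    errors = []
    for position in path:
        history.append(position)          # the sensor's record of past behavior
        estimate = predict_next(history)  # probable future course
        error = estimate - aim            # the error committed by the gun pointer
        aim += gain * error               # act deliberately to reduce it
        errors.append(abs(position - aim))
    return errors

# A target moving at steady speed: the feedback loop locks on within one step.
errors = run_loop([float(t) for t in range(10)])
```

The point of the sketch is simply that the loop needs no model of the pilot's mind, only his recorded trajectory: the "human element" enters the system as data.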
The upshot of Wiener’s prediction research was a double inscription of the “human element” into the system: on the one hand, as a servomechanism, pointing the gun or steering the plane, and on the other, as a source of information for the feedback loop. The historian of technology Peter Galison stresses the mechanical side of the equation: “The core lesson that Wiener drew from his antiaircraft work was that the conceptualization of the pilot and gunner as servomechanisms within a single system was essential and irreducible.”6 Philip Mirowski, in his study of the cybernetic model in economics, lays the emphasis on the informational aspect of the paradigm: “The physical and the human both had to undergo ontological metamorphosis into ‘messages with noise’ in order to be combined into a new synthesis.”7 But Galison and Mirowski are speaking of the same thing: the infomechanical being that emerged from the Second World War.
Its double constitution could be felt in the uncanny identity of the strange new creatures that fired the guns and piloted the planes: both seemed to waver between machinelike, implacable humans and intelligent, humanlike machines. Where did this uncanniness come from? Galison’s insight was to realize that the closed-loop information machine, in its circular, self-correcting unity, was ultimately defined by the opaque maneuvers of the dodging pilot in the plane, whenever he was pursued by the aggressive eye of the gunner. In other words, cybernetics was a Manichean science, permeated by the violent interrogations of its subject and the dissimulating absence of its object. This founding relation makes up what Galison calls “the ontology of the enemy.”
The systemic unity of man and machine, split at its heart by an ontology of the enemy, is what I will explore in this essay, in order to gain a new understanding of surveillance. But the concept of surveillance itself will have to be expanded far beyond its traditional range. Here is the thesis in a nutshell. The automated inspection of personal data can no longer be conceived as a purely negative function, an all-seeing eye, a hidden ear, a baleful presence behind the scenes. The myriad forms of contemporary electronic surveillance now constitute a proactive force, the irremediably multiple feedback loops of a cybernetic society, devoted to controlling the future. Conflict lodges within these cybernetic circles. They knit together the actors of transnational state capitalism, in all its cultural and commercial complexity; but their distant model is Wiener’s antiaircraft predictor, which programs the antagonistic eye into an obedient killing machine. Under the auspices of a lowly servomechanism coupled into an informational loop, we glimpse the earliest stirrings of the Golem that matters to us today, in the age of data-mining and neuromarketing. And this Golem is ourselves, the cyborg populations of the computerized democracies.
Our movements, our speech, our emotions and even our dreams have become the informational message that is incessantly decoded, probed, and reconfigured into statistical silhouettes, serving as targets for products, services, political slogans or interventions of the police. Each of us, paradoxically, is at once the promise and the threat of the future, which itself is our Telos, our God, our Creator. And so, under the incessant scrutiny of today’s surveillance technologies, Wiener’s philosophical question returns in an inverse form. Can a creature play a significant game with her creator? Can we play a significant game with the cybernetic society that has created us?
To set up the context of this question, I would like to introduce four characteristic technological systems, which together trace out the contours of our society. These systems are all of North American origin. They illustrate how the hegemonic power spends its immense defense budgets on “dual-use” technologies, both civil and military, which continually intertwine with each other even as they reshape the emerging global order.8 You might think of these four systems as cardinal points, or even mapping instruments: they exemplify the way that concentrated computing power charts out the present, in order to wipe clean the slates of the past and colonize the future.
The Joint Helmet-Mounted Cueing System is a semi-opaque visor set into a magnetic helmet that tracks where the pilot’s head is pointing.9 It functions as a display surface, replacing the traditional control panel and allowing the pilot to read aircraft performance, targeting information, weaponry status and threat predictions from the greenish letters of a computational scrim that remains constantly within his field of vision. At the same time, he can lock a Sidewinder missile onto its target simply by looking at it. The helmets are made by Vision Systems International, a joint venture between Rockwell Collins and Elbit Systems of Israel. The fighter-plane cockpit places the human being at the junction between information-delivery systems and a whole battery of controls and launch mechanisms, to be operated in quasi-extraterrestrial environments. It is the ultimate man-machine interface, something like the cyborg’s natural home.10 It is here that new answers are constantly found to the question raised by military psychologist John Stroud at the Sixth Macy Conference in 1949, way back at the dawn of cybernetics: “So we have the human operator surrounded on both sides by very precisely known mechanisms and the question comes up, ‘What kind of a machine have we placed in the middle?’”11
InferX privacy-preserving real-time analytics is a data-mining tool based on previous research carried out by the parent company, Datamat, for the targeting of missile interceptors.12 It works by inserting an “InferAgent” program into an entire range of computer systems – banks, airports, ticketing agencies, harbor authorities, etc. – then using encrypted transmissions to perform real-time pattern-recognition analysis on the data that circulates through those systems. The software is promoted by Michael Brown, the disgraced former head of the Federal Emergency Management Agency (FEMA): “What these algorithms do is they look at what’s the normal pattern for any given set of data points, and if those veer off by any fashion, then the protocol says you need to look at that.”13 InferX is designed to hunt around the world for “unknown unknowns”: those things that “we don’t know we don’t know,” as Donald Rumsfeld put it. Because the data is not physically warehoused, it escapes the restrictions placed by Congress on DARPA’s Total Information Awareness. Indeed, the company has actively marketed its system for the US military’s TANGRAM project, which effectively replaces TIA.14 And InferX is a dual-use technology, including a marketing application: “InferCluster uses the same distributed architecture as InferAgent to send agents over networks for the clustering of groups of objects with similar features from multiple data sources. InferCluster can be used to group customers with similar purchasing behavior, or to even discover patterns of who is not buying and why.” In that last phrase, one begins to sense the disquieting pervasiveness of what Peter Galison calls “the ontology of the enemy.”
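Brown's description – a normal pattern, and a protocol triggered when the data "veer off" – is the textbook idea of statistical anomaly detection. InferX's actual algorithms are proprietary; a minimal stand-in for the idea, with all figures invented, might look like this:

```python
# Minimal stand-in for "veer off from the normal pattern" (illustrative only;
# InferX's real algorithms are not public): flag any value deviating from the
# baseline mean by more than a chosen number of standard deviations.
from statistics import mean, stdev

def flag_anomalies(baseline, stream, threshold=3.0):
    """Return the values in `stream` lying more than `threshold` sigmas from the baseline."""
    mu = mean(baseline)
    sigma = stdev(baseline)
    return [x for x in stream if abs(x - mu) > threshold * sigma]

baseline = [100, 102, 98, 101, 99, 103, 97, 100]  # the "normal pattern" of data points
stream = [101, 99, 250, 100, 12]                  # incoming observations
suspicious = flag_anomalies(baseline, stream)     # the ones "you need to look at"
```

Note what such a detector cannot say: only that a value is improbable, never why. Everything else – terrorist, fraudster, reluctant customer – is interpretation layered onto a statistical deviation.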
The Personicx customer relationship management system, developed by the Acxiom corporation, divides the entire US population into 70 demographic clusters, according to “age, estimated household income, presence and age range of children, marital status, home ownership status, estimated net worth and population density.”15 The system is built on Acxiom’s InfoBase, which is the largest continuously updated consumer database in the United States, containing public tax and census information as well as innumerable bits of data culled from the records of corporate clients. The database covers some 110 million households – basically the entire marketing universe of the United States – and, unlike the geodemographic systems of rival companies such as Claritas, it provides direct-mail, telephone and email access to individual households, not just zipcode groups. It profiles the cultural background, lifestyle, hobbies and aspirations of each cluster, and it also tracks them through life-stage changes, allowing for what Acxiom calls “preemptive marketing,” or the chance to begin pitching products and services to households shortly before they enter a new phase. The resources of companies like Acxiom are increasingly used by politicians. As Democratic campaigner Terry McAuliffe said: “If I want to sit at my desk, pull up on the screen the state of Ohio, and say, ‘Who in Ohio says that education is going to be the number one issue they’re going to vote on,’ six seconds later, 1.2 million names will pop up. I then have the ability to hit buttons and do telemarketing to them immediately, or to send emails to them immediately, send direct mail to them immediately, or actually send someone to their door to talk to them.”16 The technology of the “panoptic sort,” studied in the early 1990s by Oscar Gandy,17 has taken a quantum leap forward – and it will take another one very soon, when the lifestyle information offered by social-networking sites like MySpace starts being exploited by the data-miners.
Orbit Traffic Management Technology, sold by the ShopperTrak corporation,18 is the last of the four cardinal points. It consists of an unobtrusive ceiling-mounted video camera that compiles records of customer movement through the store and correlates them with both sales figures and labor-force data. Up to 254 units can be networked to cover large areas, and cameras can also be installed outside to compare how many people pass by and how many actually enter. The data is transmitted to ShopperTrak’s processing center, where it is analyzed and presented on a web platform for remote access by management. The point is to use the information as a guide for adjusting in-store traffic flow, product placement, signage and advertising. The effectiveness of the design changes can then be checked against the hard data of sales. The cash-register results of individual stores can also be compared with macro-trends at the regional and national levels, allowing for performance benchmarking. Even more crucially, real-time data on regional and national sales of a given product line can be used for hour-by-hour adjustments in the size of the retail labor force, by means of an application called ESP, or “Easy Staffing Planner.” In this way, businesses are expected to move toward “customer experience management,” which consists of an ability to reconfigure both the built environment and the reception staff in real time, in order to more efficiently capture the client’s desire and convert it into sales. The ideal seems to be a situation where a single look leads inevitably to a purchase.
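ShopperTrak publishes no details of how the Easy Staffing Planner actually works; but the underlying logic – a labor force continuously resized from traffic feedback – can be sketched schematically, with every number and name below invented for the example:

```python
# Schematic illustration (not ShopperTrak's actual ESP) of staffing-by-feedback:
# convert hourly visitor counts from the door cameras into clerk counts,
# at a target ratio of shoppers per clerk, clamped to the store's limits.

def staff_schedule(hourly_traffic, shoppers_per_clerk=25, minimum=2, maximum=12):
    """Map hourly visitor counts to clerk counts, within floor and ceiling."""
    schedule = []
    for visitors in hourly_traffic:
        clerks = -(-visitors // shoppers_per_clerk)  # ceiling division
        schedule.append(max(minimum, min(maximum, clerks)))
    return schedule

traffic = [10, 40, 130, 260, 400, 90]  # visitors per hour from the counters
plan = staff_schedule(traffic)         # clerks to put on the floor each hour
```

Trivial as it is, the sketch shows the servomechanical principle at work: the customer's body, passing under the camera, becomes an input variable for the real-time reconfiguration of the store itself.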
Each of these four technologies represents a major innovation in its class. But at the same time, they are only a tiny part of a vastly wider range of surveillance techniques, all integrated into larger control systems which increasingly rely on predictive algorithms. When surveillance develops to this degree you can say goodbye not only to privacy, but to the entire public/private divide on which individual choice in a democracy was founded. Today, what Habermas called the “structural transformation of the public sphere” has crossed another threshold.19 In the twentieth century, it was a matter of large-scale news and advertising companies distorting the public sphere in which ideas are exchanged. Now we are heading toward an entirely different kind of society, based not on informed debate and democratic decision but on electronic identification, statistical prediction and environmental seduction. A society whose major professional preoccupation is preemptively shaping the consciousness of the consumer. In this kind of society, the ciphers of opportunity presented by marketing data are never very far from the targeting information thrown off by an evasive enemy.
The four examples I’ve presented take us from looks that kill, with the helmet-mounted cueing system, all the way to looks that consume, with customer experience management. In between, they show how data-mining provides the power to identify probable criminals or terrorists, but also probable buyers of a product or voters for a candidate. This kind of “future mapping” via the combination of data collection, predictive analysis and environmental simulation could be found in dozens of other realms, from traffic control to finance. In every case, the tracking and analysis of human beings helps to configure a man-machine interface. The classical example is the explicitly cyborg form of the pilot inside his molded cockpit, which has led to extensive development of flight-simulation devices for both testing and training.20 But the most extensive condition of interface arises from the relation between mobile consumers and what the architectural critic Sze Tsung Leong calls “control space,” i.e. urban design shaped by real-time information on the aggregate behavior of individuals.21 The word “control” has a precise meaning here: it refers to the continuous adjustment of an apparatus, or in this case, an environment, according to feedback data on its human variables. The environment is overcoded with an optimizing algorithm, fed by data coming directly from you. This notion of continuous adjustments to an overcoded environment is key, if we want to understand the pervasiveness of surveillance in today’s societies – a pervasiveness that goes well beyond military, police and secret-service functions. To understand contemporary surveillance, however, requires abandoning two commonly held ideas: the literary image of Big Brother peering out from a screen, and the more complex architectural image of the Panopticon.
What’s interesting is that both these images correspond to comprehensive models of society and subjectivity. The world of Orwell’s 1984 is not only defined by a camera hidden in a telescreen, manned by secret police watching out for crimethink. It’s also a regime of absolute identification with the dark-haired, mustachioed image of Big Brother, and absolute rejection and hatred of the Jewish traitor Goldstein. 1984 depicts a totalitarian state, regulated by arbitrary trials, torture and spectacular executions, and articulated by the language of Newspeak that allows for no internal contradictions, indeed, no difference whatsoever in society or the inmost conscience of the individual. But in that respect it’s an archaic image, one that corresponds very little to the world in which we live, even if there are thousands of NSA operatives devoting all their time to spying on specific persons, and even if there are orange-suited prisoners held in the spectacular torture centers of Guantánamo.
Similarly, the Panopticon is not just a circular building with windowed cells and a central tower outfitted with venetian blinds, where a functionary can watch a prisoner’s every move without himself being seen. It’s also a world of proliferating files, dossiers and case histories, each administered by professionals who seek to reform and retrain the individual, to ingrain a discipline into his mind, emotions and reflexes, a discipline that will operate even without the all-seeing eye. Panoptic society is a bureaucracy that individualizes its subjects through the imposition of a regular and codified system of differences, creating functional categories of able-bodied men and women whose actions and gestures can be articulated into a productive whole, and whose truth can be distilled into the discourses of specialists. Despite their inexorably ramifying knowledge, these specialists always retain something of the warden, the doctor, the educator, shaping pliable personalities within the stable framework of all-encompassing institutions. But as we know, such clearly defined institutions with their carefully molded subjects are increasingly hard to find in present-day society, even if we do not lack schoolmasters, sergeants and psychiatrists in the pay of the state.
It’s obvious that both Big Brother and the Panopticon are dated, though they have not entirely disappeared. The question, then, is how do we characterize a surveillance regime that is neither totalitarian nor disciplinary, but depends primarily on the statistical treatment of aggregate data in order to shape environments in which populations of mobile individuals can be channeled and controlled? How, in other words, do we understand the political economy of surveillance in a cybernetic society?
It’s astonishing to see how Foucault, in his 1978 lectures at the Collège de France, immediately begins to distance himself from the image of the Panopticon and the concept of a disciplinary society that he had advanced only two years before, in Discipline and Punish. The 1978 lectures are entitled Security, Territory, Population. They deal with what Foucault calls “security devices,” or the regulatory mechanisms whereby the economic activity of a population is both optimized and protected against disruption.22 The first example is a mid-eighteenth-century redevelopment plan for the city of Nantes, which involves cutting out new streets to serve four overlapping functions: the aeration of unhygienic neighborhoods; the facilitation of trade inside the city; the direct connection of the streets to long-distance transportation networks; and the surveillance of traffic in an urban environment that is no longer walled or subject to curfew. The keyword here is circulation. Instead of developing closed, precisely defined spaces for exclusive uses, as in a disciplinary architecture, the plan creates an open series of multifunctional devices that can expand in various directions according to patterns of future growth that can only be foreseen as probabilities. Further examples include the treatment of the plague by an identification of its transmission vectors, or the mitigation of famine by economic adjustments that discourage the hoarding of grain. In each case, the nature of an existing phenomenon and its effects on a population are carefully analyzed before any measures are taken. The aim of the liberal art of government is not to punish, transform or even save individuals, as in a disciplinary regime, but instead to arrive at the optimal distribution of certain phenomena in society, “to reduce the most unfavorable, deviant normalities in relation to the normal, general curve.”
All of this is quite unlike a sovereign upholding an arbitrary and terrifying law (which was the role of the ancient kings, or of Big Brother). But it is equally distinct from an administration imposing disciplinary routines on an individual (which is the effect of panoptic surveillance, whether in prison or on the factory floor). It is now a matter of political economists adjusting the parameters of an open environment so as to stimulate and channel the probable behaviors of a population, and to manage the risks entailed by its free and natural mobility, or indeed, by the expression of its desire. The problem of governments under this liberal paradigm, Foucault explains, “is how they can say yes; it is how to say yes to this desire.”
What’s impressive here is the about-face in Foucault’s theory of the panoptic order – a rethinking motivated by the rise of neoliberalism, amid the shift to a post-industrial society. He goes so far as to say he was wrong when he claimed in his work on the prison that the disciplines were the coercive “dark side” of Enlightenment liberties, the fundamental mechanisms of power lying beneath the formal surface of liberal theory. Instead, he now maintains, “freedom is nothing else but the correlative of the deployment of apparatuses of security.” The two, in other words, evolve as a function of each other. Developing that same idea a year later, he declares with a certain irony that the liberal art of government “consumes freedom” – “freedom of the market, freedom to buy and sell, the free exercise of property rights, freedom of discussion, possible freedom of expression” – and therefore, “it must produce it, it must organize it.”23 It must provide the institutional environment for the exercise of certain freedoms, including the conditions under which one person’s freedom can be prevented from limiting another’s, or indeed, from threatening the entire mechanism of economic exchanges. The liberal art of government, for Foucault, consists in intervening not on the players but on “the rules of the game.”
From here it would have taken just one more step to foresee how the statistical interpretation of computerized surveillance data would open up entirely new possibilities for the governance of mobile populations circulating through the world space. In effect, the analysis of liberal economic regulation allows us to understand the tremendous incentives for the global deployment of feedback environments since the close of the Cold War. Cybernetics – whose etymology means both “steersman” and “governor” – has become the applied social science of control at a distance, the necessary correlate of American aspirations to global free trade, and indeed, to liberal empire. This relation between classical liberalism and technological control would have been faintly visible some three decades ago, for someone trying to look into the future.24 But Foucault was not a social forecaster, as the sociologist Daniel Bell claimed to be. Instead he worked as a genealogist, examining the successive historical strata that combine in the present. He conceived the security devices as an eighteenth-century addition to the disciplinary procedures of the sixteenth century, just as those procedures had been superimposed on the juridical forms of medieval sovereignty: “There is not a series of successive elements, the appearance of the new causing the earlier ones to disappear. There is not the legal age, the disciplinary age, the security age… In reality you have a series of complex edifices… in which what above all changes is the dominant characteristic, or more exactly, the system of correlation between juridico-legal mechanisms, disciplinary mechanisms, and mechanisms of security.”25
It is the complexity of such an architecture that we must take into account, if we want to develop an image of surveillance within the wider panorama of the corporate and military order. The difficulty, in a fully fledged neoliberal society, is to see how a wide range of different actors continually attempt to manipulate the environments in which individuals freely take their decisions; and to see in turn how state power intervenes at the highest level, with attempts to readjust the concrete “security devices” of the corporations and the police, along with the broader and more abstract rules of economic governance. The difficulty, in short, is to create the image or the metaphor of a deeply Manichean society where, as Daniel Bell observed, “games between persons” have definitively replaced any kind of collective struggle against nature.26 This society, which displaces so much of its conflict into the future, is nonetheless the present framework in which individuals, groups and populations all become cyborgs, that is, people bound inseparably to machines, struggling to make sense and to achieve purposes within mediated environments that are expressly designed to manipulate them. But this is also the framework that a neoconservative state power like that of the Bush administration seeks to restructure, by reinforcing the earlier paradigms of military discipline and sovereign law. Very few people have sought to theorize this highly unstable condition of governance; but has anyone managed to crystallize it in an image? And has anyone managed to oppose it with what Foucault would have called “counter-behaviors”?
One of the most original images of data-gathering technologies is proposed by William Bogard, in his book The Simulation of Surveillance. Going beyond Big Brother and the Panopticon, he explores an imaginary future – or “social science fiction” – where surveillance outstrips itself to become simulation, a virtual reality in which crime is already vanquished and desire is already satisfied. Bogard is keenly aware of the historical role of cybernetics in preparing the ground for such a society, as he indicates by speaking of simulation as “hypersurveillant control.” But he works in a Baudrillardean vein, with an ecstatic fascination for the synthesized image. Simulation, he writes, “is nothing less than perfect surveillance, surveillance raised to the highest power, where nothing escapes the gaze. Everything already observed, absolute foreknowledge of events grounded in the possession of the codes which generate them.”27 There is something very close here to a game-theoretic vision, in which all the moves are already known and all the strategies have already been played. Bogard probably felt vindicated by movies like The Truman Show, or even better, The Matrix, both of which came out after his book. But what gets lost in the fascination of simulation is the fundamental paradox of control, its Manichean nature.
Another film offers a stranger and more searching image of surveillance, though without quite matching the science-fiction story on which it was based. I’m thinking of Steven Spielberg’s Minority Report, which tells the tale of the experimental “Pre-Crime Department” of the Washington D.C. police in the year 2054. Spielberg is known for special effects, and some of them go straight to the point. The chase scene captures the ambiguity of contemporary identification and tracking technologies by imagining their logical development in the future. Billboard advertisements spring to life, activated by a retinal scan, to call out the name of the central character John Anderton as he strides anxiously through a corridor to the subway. In a bit of poetic justice, American Express, one of the pioneers of the “panoptic sort” studied by Oscar Gandy, gets the highest visibility in this thirty-second orgy of brand-name seductions. Another quick scan at the subway turnstile epitomizes the convenience of biometric identification. And the matching cut to the police, tracking their prey through the transport system, recalls the price we pay for it. Later on, this imaginary vision comes extremely close to Foucault’s notion of enforced optimization, when the inventor of Pre-Crime, police commissioner Lamar Burgess, addresses a crowd of people celebrating the extension of the device to the entire country. He says to them: “Enjoy yourselves! That’s an order.” And everyone seems delighted to hear even the police commissioner saying yes, saying yes to their desire. Still the most powerful, most haunting image in the film is that of the precognitives themselves: strange, misshapen creatures, pumped full of drugs, bathing in some amniotic solution, with electrodes pressed to their heads to read off their visions of the future.
These three creatures are clearly cyborgs. Yet rather than being outfitted with powerful mechanical prosthetics or augmented cognitive faculties, as in fighter-plane cockpits or in movies like The Terminator, here they are merely monitored, probed to their innermost imaginings. It is the sensitivity of their emotional responses to the world that makes it possible for the police to predict the future. Philip K. Dick’s short story is worth quoting here:
In the gloomy half-darkness the three idiots sat babbling. Every incoherent utterance, every random syllable, was analyzed, compared and reassembled in the form of visual symbols, transcribed on conventional punchcards, and ejected into various coded slots. All day long the idiots babbled, imprisoned in their special high-backed chairs, held in one rigid position by metal bands, and bundles of wiring, clamps. Their physical needs were taken care of automatically. They had no spiritual needs. Vegetable-like, they muttered and dozed and existed. Their minds were dull, confused, lost in shadows. But not the shadows of today. The three gibbering, fumbling creatures, with their enlarged heads and wasted bodies, were contemplating the future.27
In the movie, Spielberg has the precogs generate mental images of the future, without any mediation of computer analysis. He makes them self-aware, conscious of their visions and even able to suggest a course of action, as when the precog Agatha tells Anderton that he can change the future. But in that way, Spielberg simplifies a metaphor that was much more brutal and precise in Dick’s short story. There the precogs are pure sensibility, without reason or personal identity – something like the “reptilian brains” that contemporary marketers try to map out in their experimental subjects.28 The precogs, in Dick’s story, are uncanny, Golem-like creatures, wavering between men and machines. They stand in for the populations whose affects and mental activities are relentlessly probed and palpated, so that their aggregate data-image can be mirrored by seductive products and waking dreams.
Other elements from the narrative are also lost in the film. Spielberg and his scriptwriters make the Anderton character into the victim of a plot woven by his hierarchical superior, Lamar Burgess, in order to cover up the killing of Agatha’s mother, who sought to take the precog back from the police. The result is a typical emotional drama, focused on the daughter’s anguished visions of her mother’s death and on Anderton’s parallel memories of his own murdered son. Whereas in Dick’s vastly more paranoid imagination, the plot against Anderton is a way for the Army to abolish Pre-Crime as an independent department and to wrest the control of the future back from the civilian authorities. What’s more, Dick planted a telling detail in the story, having Anderton explain that when he worked out the theory of Pre-Crime he resisted the temptation to apply it to the stock market, where he could obviously have made fortunes. Had Spielberg seized on these two motifs – the relation to finance, and the army’s hunger for power over the civilian state – then the film, which came out shortly after September 11, could have become the metaphor of an entire epoch.
Truth is stranger than fiction. The neocon takeover of the American state effectively transferred power to the President as Commander-in-Chief of the military, and to the Pentagon under Rumsfeld. The oil and arms industries that had taken a back seat to finance in the 1990s now returned to the forefront with a vengeance.29 A financially driven liberal regime regressed to its disciplinary reflexes under a resurgent sovereign gaze, as the “complex edifice” of power suddenly shifted on its foundations. In a world where the speculative futures of the long stock-market boom had collapsed, the fabricated need to invade Iraq became a new kind of self-fulfilling prophecy, a vastly more violent way to shape the future.
In 2002, shortly before the invasion of Iraq, the Defense Advanced Research Projects Agency launched what may have been its most twisted program ever: FutureMAP, or “Futures Markets Applied to Prediction,” developed as part of the Total Information Awareness program under the authority of a convicted criminal, the retired Admiral John Poindexter.30 Here one can observe a precise and yet insane readjustment of what Foucault called “the system of correlation between juridico-legal mechanisms, disciplinary mechanisms, and mechanisms of security.” Even as Minority Report was hitting the movie theaters, consultants for the United States Department of Defense were proposing a computerized “Policy Analysis Market” (PAM) that would mobilize the predictive capacities of investors by getting them to bet their money on civil, economic and military trends in Egypt, Iran, Iraq, Israel, Jordan, Saudi Arabia, Syria and Turkey. Finance, which for twenty years had been at the leading edge of cybernetic transformations, would now be repurposed for the needs of sovereign and disciplinary power. In this way, the distributed intelligence of the market would be harnessed and the price signals given off by these fictional “futures” would indicate the likelihood of given trends or events.
At this point it’s worth quoting from the mission statement of the Total Information Awareness program, because it exemplifies the military interpretation of the kinds of feedback loops that I have been discussing: “The DARPA Information Awareness Office (IAO) will imagine, develop, apply, integrate, demonstrate and transition information technologies, components, and prototype closed-loop information systems that will counter asymmetric threats by achieving total information awareness useful for preemption, national security warning, and national security decision making.”31 The Policy Analysis Market would be a sensing device in such a self-regulating, closed-loop system – like a human thermostat connected to the inferno of American economic, diplomatic and military power. A mockup of the trading interface, prepared by the Net Exchange company, shows Special Event Contracts concerning such eventualities as “Jordanian monarchy overthrown in 4th [quarter] 2004,” or “Arafat assassinated in 4th 2004”; while a Global Contracts section includes ranges of possible bets on “terror deaths” and “US military deaths.” The trading functions are overlaid on a map of the Middle East, like windows of geopolitical risk-opportunity. This interface, and the lure of profit it offered, would be the electrodes attached to the precognitive lobes of the investors. If they produced striking images, then preemptive policies would follow.
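How would such a market actually convert bets into “striking images” of the future? PAM’s designer, Robin Hanson, is best known for the logarithmic market scoring rule (LMSR), an automated market maker he proposed for thin prediction markets of exactly this kind. The following is a purely illustrative sketch of that mechanism – not a reconstruction of the actual PAM software, whose internals are not documented here – showing how the quantities traded on an event translate into an implied probability (the function names and the liquidity parameter `b` are my own assumptions):

```python
import math

# Minimal sketch of Hanson's logarithmic market scoring rule (LMSR),
# the kind of automated market maker proposed for thin prediction
# markets. Illustrative only -- not the actual PAM implementation.

def lmsr_cost(quantities, b=100.0):
    """Cost function C(q) = b * log(sum(exp(q_i / b)))."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

def lmsr_prices(quantities, b=100.0):
    """Implied probability of each outcome: softmax of q/b."""
    exps = [math.exp(q / b) for q in quantities]
    total = sum(exps)
    return [e / total for e in exps]

def trade_cost(quantities, outcome, shares, b=100.0):
    """What a trader pays to buy `shares` of one outcome."""
    after = list(quantities)
    after[outcome] += shares
    return lmsr_cost(after, b) - lmsr_cost(quantities, b)

# Two outcomes, e.g. "event occurs in 4th quarter" vs. "does not".
q = [0.0, 0.0]
print(lmsr_prices(q))          # starts at even odds, [0.5, 0.5]
cost = trade_cost(q, 0, 50.0)  # a bet on the event raises its price
q[0] += 50.0
print(lmsr_prices(q))          # outcome 0 now priced above 0.5
```

The point of the design is that the price itself is the “sensing device”: every trade moves the implied probability, so the interface continuously displays the aggregate foresight (or paranoia) of its participants.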
The PAM trading interface is literally a map of the future. It is also a perfect example of what Foucault calls a “security device,” offering precise insight into the dynamics of surveillance under cybernetic capitalism. It is not a police program, but a market instituted in such a way as to precisely condition the free behavior of its participants. It produces information while turning human actors into functional relays, or indeed, into servomechanisms; and it “consumes freedom” for a purpose. Like all security devices, it serves two functions. One is to optimize economic development: in this case, the development of financial speculation. But the other function is to produce information that will help to eliminate deviant behavior, of the kind that can’t be brought into line with any “normal” curve. This is the double teleology of closed-loop information systems in cybercapitalism. The map of the future is always a promised land to come. But there are always a few enemy targets on the way to get there. The question is, do you hold the gun? Do you watch as the others take aim? Or do you try to dodge the magic bullet?
The heraldic emblem of Total Information Awareness – a sky-blue sphere encompassing an earthly globe caught in the gaze of a radiant eye detached from the summit of a Masonic pyramid – is surely the purest expression of the exorbitant will to power unleashed on the twenty-first century. But all around the world, complex systems are striving to realize the goals of Wiener’s original predictor, which itself had been a practical failure, destined for the closets of useless circuitry and the fevers of theoretical dreams. The sleep of reason under informatic surveillance gives rise to God machines. Yet every new claim to “shock and awe” or “full-spectrum dominance” is ill-conceived, illusory, useless.
The latest financial crisis, unfolding as I write in late 2007, is caused in part by the inability of banks even to know who will take the inevitable losses on subprime loans, since these have been bundled by computer into ultra-complex collateralized debt obligations (CDOs), themselves further collateralized into derivatives called “CDO-squared,” whose monetary value has become almost impossible to assess.32 Meanwhile the “surge” of fresh (or more often, returning) American troops in Iraq effectively defends the Stars and Stripes under the gaze of the media, but only on small parcels of territory and at limited hours of the day. Victory, too, has become hard to calculate. And as the humiliation of anticipated defeat pushes the dollar-economy ever closer to its black hole of unpayable debt, one wonders which inventions of abstract mathematics will allow the insurance men to offer policies against collapse of the system. The hilarious scene in Kubrick’s war room, with the wheelchair-genius calculating the underground survival of selected members of the human race and the five-star general screaming to the president about the dangers of a “mineshaft gap,” suddenly does not look so far away from these horizons. Except, of course, for the subversive humor.33
Our society’s obsession with controlling the future – and insuring accumulation – has at least two consequences. The first is the organization of a consumer environment for the immediate satisfaction of anticipated desires, with the effect of eliminating desire as such. In its place comes an atmosphere of suspended disbelief where entire populations move zombie-like and intellectually silent beneath exaggerated images of their unconscious drives. The second consequence, which we have seen with such violence in recent years, is the simple removal of those who might conceivably trouble this tranquilized landscape with any kind of disturbing presence or speech. What remains in the field of public politics is dampened voice, dulled curiosity and insignificant critique, sinking to a nadir in the period of national consensus over American military intervention after September 11.
In the face of these trends, which have been gathering since at least the 1980s, large swathes of the world’s population have reacted to the colonization of the future by seeking refuge in the distant past of revealed religion, giving rise to fundamentalisms, both Christian and Muslim, whose archaic vision of better days to come can only translate as a violent desire for apocalypse. Any number of national militaries, terrorist groups or guerrilla armies are willing to oblige, particularly in the historical lands of the Sacred Books, but also in places of deadly emptiness like Waco, Texas. The thing to realize is that the prophets of past and future go hand in hand. The computerized trader, the religious zealot, the military pilot and the suicidal terrorist are all protagonists in the “time wars” of the 21st century, whose coming Jeremy Rifkin predicted two decades ago, without being able to foresee the dramatis personae.34 As Maurizio Lazzarato has written more recently: “The West is horrified by the new Islamic subjectivities. But it helped to create this monster, using its most peaceful and seductive techniques. We are not confronted with remnants of traditional societies in need of further modernization, but with veritable cyborgs that articulate the most ancient and most modern.”35
In 1964, the year of Dr. Strangelove, Norbert Wiener tried to conjure away the threat of deterministic game theory, which he saw as a sure-fire path to “push-button war.” He thought that by placing flawless reason on a single continuum with the imperfect human mind and the limited electronic computer – or in other words, by understanding God and Golem to be “incorporated” within human experience – he could open up a more flexible ethical space, unbound to any ideology whether of religion or science. Yet today it is within this interface of God, man and machine that the Manichean games of corporate and military strategy are played, with few significant questions as to the rules, the stakes or the final causes. The cyborgs, like Kubrick’s strategic air commanders, have learned to stop worrying and love surveillance. But through the magic of computer media, their strange kind of love is now distributed much more widely through the population. The telos of humanity – its future map – once again looks like a bull’s eye of blind self-destruction.
The question isn’t one of dodging the magic bullet, or of constructing some fantasy space where you could survive unsurveilled. The question for artists, intellectuals and technologists is how to play a significant game, instead of reclining and declining in a gilded cage, as the PR and development wing for yet more corporate spin-offs of the mainline military devices. The question is how to engage in counter-behaviors, able to subvert the effects of cybernetic governance.
One thing we could do is to create more precise images and more evocative metaphors of the neoliberal art of government, in order to heighten awareness of the ways that intimate desire is predicted and manipulated. Such images and metaphors are desperately lacking, along with a Karl Marx of cybercapitalism. But another, more important thing we can do is to dig into the existential present and transform the everyday machines, by hacking them into unexpected shapes and configurations that can provide collaborative answers to the spaces of control. Critical communities of deviant subjectivity, forming at the site of the eviscerated private/public divide, are not subcultural frivolities but attempts to reinvent the very basis of the political. What’s at stake is the elaboration of different functional rules for our collective games, which in today’s society cannot be put into effect without the language of technology. Distributed infrastructure exists for such projects, in the form of open-source software. Laboratories for this kind of experimentation have been built ad hoc, by people such as Jaromil, Konrad Becker, Laurence Rassel, Natalie Jeremijenko, Critical Art Ensemble, Hackitectura, the Institute for Applied Autonomy, Marko Peljhan and hundreds of others. But what we don’t have is any sustained institutional commitment, any governmental Golems who are willing to wake up from their waking dreams. And that makes it very difficult to bring together, over the middle and long term, the diverse range of people who are needed to help change the culture of the present.
Social interaction is always a game of control, as David Lyon’s work on surveillance shows.36 But everything depends on who writes the rules, and even more, on how you play the game. To find a better way, or even to help raise the problem in its urgency and complexity, we would have to invent new kinds of cultural institutions able to take on more difficult and divisive issues – exactly the ones that the Manichean sciences of the postwar era succeeded in automating and hiding from view. Until artists, hackers and cultural critics are joined by scientists, sociologists, economists and philosophers with a purpose, there will be no deep and distributed critique of military neoliberalism, nor of the surveillance that articulates it. And in the absence of such an exorcism the ontology of the enemy will keep coming back to haunt us, like some undead ghost of the Cold War that never dissolved in the sun. This might even be the significance of the hilarious and supremely subversive ending that Kubrick gave to his film, when he has Vera Lynn’s optimistic, forties-era lyric come billowing up out of the mushroom clouds:
We’ll meet again
Don’t know where, don’t know when
But I know we’ll meet again some sunny day…
1Norbert Wiener, God & Golem, Inc.: A Comment on Certain Points where Cybernetics Impinges on Religion (Cambridge, Mass: MIT Press, 1966/1st ed. 1964), p. 17.
2On the Teleological Society, and on Wiener generally, see Steve Heims, John von Neumann and Norbert Wiener: From Mathematics to the Technologies of Life and Death (Cambridge, Mass: MIT Press, 1980), and Flo Conway and Jim Siegelman, Dark Hero of the Information Age: In Search of Norbert Wiener, the Father of Cybernetics (New York: Basic Books, 2005).
3Cf. Norbert Wiener, The Human Use of Human Beings (New York: Da Capo, 1954/1st ed. 1950), pp. 34-35: “The scientist is always working to discover the order and organization of the universe, and is thus playing a game against the arch enemy, disorganization. Is this devil Manichaean or Augustinian? Is it a contrary force opposed to order or is it the very absence of order itself?… The Manichaean devil is playing a game of poker against us and will readily resort to bluffing, which, as von Neumann explains in his Theory of Games, is intended not merely to enable us to win on a bluff, but also to prevent the other side from winning on the basis of a certainty that we will not bluff. Compared to this Manichaean being of refined malice, the Augustinian devil is stupid. He plays a difficult game, but he may be defeated by our intelligence as thoroughly as by a sprinkle of holy water.” Also see pp. 190-93 for explicit considerations on the Manichean nature of interstate politics, which Wiener considered “a bad atmosphere for science.”
4Cf. William Poundstone, Prisoner’s Dilemma: John von Neumann, Game Theory, and the Puzzle of the Bomb (New York: Anchor Books, 1992), p. 190, n. 3. But there are other models for Dr. Strangelove: Teller, von Braun, Kissinger and above all the game theorist Herman Kahn, famous for “thinking the unthinkable.”
5Norbert Wiener, I Am a Mathematician (Cambridge, Mass.: MIT Press, 1956), pp. 251-52.
6Peter Galison, “The Ontology of the Enemy: Norbert Wiener and the Cybernetic Vision,” in Critical Inquiry 21/1 (Fall 1994), p. 238.
7Philip Mirowski, Machine Dreams: Economics Becomes a Cyborg Science (Cambridge University Press, 2001), p. 61.
8For an excellent discussion of dual-use technologies, see Jonathan D. Moreno, “DARPA on Your Mind,” in Mind Wars: Brain Research and National Defense (New York: Dana, 2006).
9See the product page at http://www.vsi-hmcs.com/pages_hmcs/02_jhm.html.
10For the origin of the word, see Manfred E. Clynes and Nathan S. Kline, “Cyborgs and Space,” in Astronautics (September 1960); facsimile at http://web.mit.edu/digitalapollo/Documents/Chapter1/cyborgs.pdf.
11John Stroud, “Psychological Moment in Perception,” in Heinz von Foerster, ed., Cybernetics: Circular Causal and Feedback Mechanisms in Biological and Social Systems, Transcriptions of the Sixth Conference (New York: Josiah Macy Foundation, 1950), pp. 27-28.
12Corporate homepage at http://www.inferx.com.
13From a video interview with Brown on Dan Verton’s Homeland Defense Week, at http://link.brightcove.com/services/link/bcpid1078673197/bclid1111449543....
14See the TANGRAM Proposer’s Information Packet, at http://www.fbo.gov/spg/USAF/AFMC/AFRLRRS/Reference-Number-BAA-06-04-IFKA... and the White Paper by Jesus Mena, “Modernizing the National Targeting System,” available in the “Expert Insight” section of the InferX site. The firm Booz Allen Hamilton, which won the general contract for the TANGRAM project, is located in McLean, Virginia, alongside Datamat and InferX; it is not clear whether InferX has actually been hired for the project.
15Acxiom corporate homepage at http://www.acxiom.com.
16Terry McAuliffe, quoted in the PBS documentary by Douglas Rushkoff, The Persuaders, 2004; the transcript can be accessed at http://www.pbs.org/wgbh/pages/frontline/shows/persuaders/etc/script.html.
17Oscar H. Gandy, The Panoptic Sort: A Political Economy of Personal Information (Boulder: Westview, 1993).
18ShopperTrak corporate homepage at http://www.shoppertrak.com.
19Jürgen Habermas, The Structural Transformation of the Public Sphere (Cambridge, Mass.: MIT Press, 1991/1st German ed. 1962).
20 For an insightful study of how the cockpit model has served for the retooling of public education in the US, see Douglas D. Noble, “Cockpit Cognition: Education, the Military and Cognitive Engineering,” in AI & Society 3 (1989). In conclusion Noble writes: “The means and ends of education are being reshaped within a massive military/industrial research and development enterprise, ongoing since World War II, to engineer appropriate human factors for high performance technological systems.”
21Sze Tsung Leong, “Ulterior Spaces,” in Chuihua Judy Chung et al., eds., The Harvard Design School Guide to Shopping (Cologne: Taschen, 2001). Also see Stephen Graham, “Spaces of Surveillant-Simulation: New Technologies, Digital Representations, and Material Geographies,” in Environment and Planning D: Society and Space 16 (1998). Graham writes: “Computerised simulation and modelling systems now allow the vast quantities of data captured by automated surveillance systems to be fed directly into dynamic facsimiles of the time-space ‘reality’ of geographic territories (neighbourhoods, cities, regions, nations etc), which can then, in turn, be fed in to support new types of social practices, organisational change, and urban and regional restructuring.”
22The French phrase dispositifs de sécurité could equally well be translated as “safety devices,” or even (catching the ambiguity that I will explore later on) as “safety-and-security devices.” The published translation speaks of “security apparatuses.” See the opening chapters of Michel Foucault, Security, Territory, Population: Lectures at the Collège de France, 1977-78 (Palgrave Macmillan, 2007/1st French ed. 2004). The following quotes are from pp. 62, 73, 48.
23Michel Foucault, The Birth of Biopolitics: Lectures at the Collège de France, 1978-79 (Palgrave Macmillan, 2008/1st French ed. 2004), p. 63; following quote on p. 260.
24The historian of technology Otto Mayr has documented the pervasiveness of simple feedback mechanisms (thermostats, governors) in liberal Britain during the eighteenth century, at a time when such devices remained rare among the authoritarian societies of the Continent. More importantly, he shows that these mechanical devices were commonly used as metaphors for such characteristic political-economy notions as supply-and-demand, checks-and-balances and self-regulation. However, Foucault never cites Mayr’s groundbreaking work of technical history, The Origins of Feedback Control (Cambridge, Mass.: MIT Press, 1970). The more explicit comparative study only came later: Authority, Liberty and Automatic Machinery in Early Modern Europe (Baltimore: Johns Hopkins University Press, 1986). Cf. Galison’s discussion in “The Ontology of the Enemy,” op. cit. pp. 262-63.
25Michel Foucault, Security, Territory, Population, op. cit., p. 8.
26Daniel Bell, The Coming of Post-Industrial Society: A Venture in Social Forecasting (New York: Basic Books, 1999/1st ed. 1973). Bell writes: “The ‘design’ of a post-industrial society is a ‘game between persons’ in which an ‘intellectual technology,’ based on information, arises alongside of machine technology” (p. 116).
27William Bogard, The Simulation of Surveillance: Hypercontrol in telematic societies (Cambridge University Press, 1996), p. 55.
28See the interviews with Clotaire Rapaille in Douglas Rushkoff’s PBS documentary The Persuaders, op. cit.
29For an understanding of how this kind of economic shift occurs, see Shimshon Bichler and Jonathan Nitzan, “Dominant Capital and the New Wars,” in Journal of World Systems Research 10/ 2 (Spring 2004), available at http://bnarchives.yorku.ca.
30The TIA program was shut down by the US Congress, in part because of the outcry over the PAM interface; however, all of the information concerning PAM has been archived by its proud inventor, Robin Hanson, at http://hanson.gmu.edu/policyanalysismarket.html.
31From the reconstruction of the original TIA website at http://infowar.net/tia/www.darpa.mil/iao.
32The anthropologist and finance expert Paul Jorion, who detailed the mechanisms of the subprime crash over a year before it actually happened, quotes a remark from a specialized document issued by the Union de Banques Suisses: “To analyze a simple CDO ‘squared’ constituted of 125 different securities… we would have to know the information pertaining to 9,375 securities.” http://www.pauljorion.com/blog/?p=174
33Apparently there is a little humor left in Washington, to judge from what seems to be the work of a Beltway insider. In July of 2007 a series of articles appeared by a certain “Herman Mindshaftgap” of “The Bland Corporation,” casting doubt on some of the numbers and concepts used by the administration war machine. Cf. “Why In Truth There Is No Surge,” in Counterpunch, July 13, 2007; available at http://www.counterpunch.org/mindshaftgap07132007.html.
34Jeremy Rifkin, Time Wars: The Primary Conflict in Human History (New York: Holt, 1987).
35Maurizio Lazzarato, Les révolutions du capitalisme (Paris: Les empêcheurs de penser en rond, 2004), p. 101.
36See, among others, David Lyon, Surveillance Society: Monitoring Everyday Life (Buckingham: Open University Press, 2001); Surveillance Studies: An Overview (Cambridge: Polity Press, 2007).
published @ Continental Drift - http://brianholmes.wordpress.com/