The Age of Spiritual Machines: When Computers Exceed Human Intelligence
Author: Ray Kurzweil
Publisher: New York: Viking, 1999
Review Published: September 1999
Ray Kurzweil is Norbert Wiener in a positive mood. An artificial intelligence expert and MIT graduate, Kurzweil is one of the great scientists, inventors, and visionaries of the second half of the twentieth century: he has spent a lifetime teaching computers how to act like human beings. He taught them to see, developing the first Charge Coupled Device (CCD) flatbed scanner in 1975. He taught them to read, creating the first omni-font ("any" font) optical character recognition software in 1976. He taught them to listen, with the first commercially marketed large-vocabulary speech recognition software in 1987. And he taught them to speak, with the first print-to-speech reading machine for the blind in 1976.
In 1990, Kurzweil shook the world of computer science with the publication of The Age of Intelligent Machines. What is going to happen, Kurzweil asked then, when computers go beyond mere input/output formulas and begin to actually think for themselves? They will clobber the world chess champion by 1998, he predicted (this happened in 1997), and people will be able to visually navigate a global network of interconnected computers, he said (five years before the World Wide Web). Lauded as a visionary, the recipient of nine honorary doctorates, and honored by two U.S. Presidents and foreign dignitaries, Ray Kurzweil, this "restless genius," has dropped another bomb on the scientific community by asking: "What happens when machines exceed human intelligence in every measurable way?"
The answer: We enter The Age of Spiritual Machines, an age of machines that not only see and feel and speak and think, but that surpass human intelligence -- machines that have consciousness, agendas of their own, and the ability to achieve their goals without human assistance. What will happen to us when evolution replaces us as the dominant species on the planet?
Imagine a world where the difference between man and machine blurs, where the line between humanity and technology fades, and where the body, the soul, and the silicon chip unite. This is not entirely science fiction. Kurzweil guides us, in a rather peculiar way, through the advances that will result in computers exceeding the memory capacity and computational ability of the human brain. In his view, the computer, now dubbed a "spiritual machine" and taken as a metaphor for the pervasive reality of a highly integrated form of life, constitutes a challenge for all the social actors: scientists and producers, technologists, and users are all faced with the "architecture of complexity" induced by fundamental changes and unprecedented novelties that go beyond the conventional theoretical tools and methods for understanding the growth of technology. In this new book, Kurzweil tells us what to expect from the next twenty years of development of "intelligent and autonomous machines": if he is right (and he might very well be), we have only until about 2020 before computers outpace the human brain in computational power. By then, we will begin to have relationships with automated personalities and use them as teachers, companions, and lovers. By 2030, ten years later, information will be fed straight into our brains along direct neural pathways; computers, for their part, will have read all the world's literature.
The distinction between us and computers will have become so blurred that, when the machines claim to be conscious, we will believe them. But beware: this argument is not new. It is actually consubstantial with the growth of computers, and can be traced back to Norbert Wiener and John von Neumann. What is new is Kurzweil's aim of finding a way to "reverse engineer" the human brain so that we can download everything about ourselves -- our memories, our dreams, our personalities -- into a computer, a process he calls "reinstantiation." It is too bad this part of the book is not developed enough, since it is certainly the most compelling and promising.
Kurzweil's new book is thus both a logical follow-up to his previous and current technological and entrepreneurial endeavours and a source of many insights and interesting reflections on the future of the integration of computers. But, unhappily, it is also very chatty and fairly non-linear (not to mention rather messy and uneven). The very extensive and impressive bibliography (both classical and Web-based) seems rather disconnected from the main body of the text. Kurzweil does not engage all that much in discussion with other scientists and authors; rather, he prefers to discuss and confront mainly the theories and achievements of . . . Ray Kurzweil. All this is acceptable enough, though it would have been much better tempered by a larger dose of doubt and self-questioning. What follows, prompted by Kurzweil's book, is a schematic outline of five problematic areas in which an "ecology of the spiritual machines" could be developed.
Symbols Without Decision
The technology of information and artificial intelligence now makes it increasingly possible to formalize the knowledge and ability required for any one job, or rather any class of work, favoring its integration into a program which can command a machine or a series of machines. But the importance of this change does not seem to have been fully appreciated. The more activities are planned in advance and contained in information processes, the less need there is for a decisional system at every single level of work, whether in its formalization or in its execution. Moreover, the use of information machines entails an increase in new symbols, which are not only difficult to learn but also call for a special effort to be correctly attributed to what they conventionally designate. The problem lies in the fact that for the moment we possess new symbols but not a new language, which could come about only if, in the designing of information machines, we took into consideration new behavioral and ecological patterns and new collective cultures. Work, then, despite its technological possibilities, becomes a binary type of activity, with stimulus-reaction pairs and information-rules to apply, which are already contained in the information program. At all levels of power, responsibility, and integration within the technological system, managers, technicians, and employees find themselves occupied with the control of work in which decision-making is less and less in evidence.
Abstraction and Solitude
Information technology makes most operations and gestures in work abstract and immaterial, both those springing directly from the technological revolution and any others connected with it. Thus symbols, figures, and languages of a "varied nature" become the essential mediating factor between workers, work, previous knowledge, and the very "community" of work. Beyond distinctions in category and other differences (in salary, company status, or career), the center of gravity in all working practices is shifting towards a series of functions which call for intense mental activity, the actual cognitive mediation of work and of its social and organizational context. Moreover, the stereotypical opposition between an intelligent, creative, and rewarding activity, on the one hand, and a repetitive and intellectually boring job, on the other, is now giving way to a vision which is the hybrid product of the information revolution: despite differences in job status and culture, there now exists a series of tasks, both inside and outside the office, which are marked by the same standards, which use the same symbolic mediation, and which create the same sense of loss of identity in dealing with the intelligent processes of the machine.
The Involuntary and Paradoxical Recomposition of Work
Today the machine and work procedures require more than a certain amount of mental commitment, which may vary with the complexity of the machines and the operator's knowledge and experience. A new component seems to have been added to the work load -- the "cognitive-organizational load," that is, the component dealing with the effects of the variables that define the organization of social relations and work. At the center of an ever-increasing number of activities we find the processing, checking, and sometimes the analysis of the symbolic data and mediating information produced by information-based systems. The borderline between jobs and their respective cultures now tends to disappear, giving place to a much vaster group of activities in which the work is carried out in similar conditions, with processes of the same kind, content, and intelligibility, and above all in similar organizational contexts. This "recomposition of work," often presented as one of the promises of automation, nevertheless carries with it an unforeseen consequence: one's perception of the conditions of work and of the weight of the organization -- once its "mechanical" and material side disappears -- vanishes from one's immediate view, while the abstraction inherent in the new conditions of work alters one's psychological sense of the work itself. With the new information technologies the work may indeed recompose, but the meaning of each activity becomes murkier and more inaccessible both for individuals and for the organization.
Cognitive Pressure and Accelerating Tempos
Despite the widespread belief underlying much research into the social consequences of automation, the process of symbolic abstraction and mediation of work is not an "unexpected consequence" but an intrinsic element of information technology. The alteration of the experience, contents, and finality of a job takes place independently of the way in which the office system is conceived, planned, and introduced. The most dramatic example, perhaps, is the rapidity of access to information and the speed of its processing, which make possible a considerable increase in the number of operations and lead to an intensification of the work tempo. Given the formalization and abstraction of work processes inside programs, employees are now more isolated and often find it impossible to ask a colleague for advice or information.
Modifications in the Social and Emotional Life of Humans
A kind of silent revolution is now producing profound changes in the social life of our communities as it affects the identity of individuals, organizations, and groups. One of the fundamental bases of the ideology of modernity, therefore, may become nothing but an empty myth: the community, the group, which is still absolutely necessary for the efficient running of social organization, is being detached from the technological base on which all human interactions rest. Take the example of the most central human activity, work. The form of collective organization inherited from a developmental phase based on the need to gather management, machinery, and workers in the same space has now been fundamentally altered by the possibility of office automation. What will happen in the workplace (not to mention to the workplace), then, if, for example, employees no longer need or have no chance to intermingle there, or if they no longer have any control over the deeper significance of both individual and collective work? In such a context the new problems have shifted away from the positive or negative myths of automation and focus instead on how to plan and manage an organization and work which have become abstract at all three of the most important levels: the individual-cognitive, the social, and the managerial.
In God and Golem, Inc., Norbert Wiener, concerned with the moral and technical predicaments of automation with respect both to cybernetic technique, which he discovered and promoted, and to the social aftermath of this technique, started by saying that, in the advanced industrial societies, automated machines are more ecologically and mentally hazardous than those of the industrial past, because they have pervaded the fields of technique and communication, and the way the human mind works: "I find myself facing a public which has formed its attitude toward the machine on the basis of an imperfect understanding of the structure and mode of operation of modern machines. It is my thesis that machines can and do transcend some of the limitations of their designers and that in doing so they may be both effective and dangerous . . . [If, as is the case,] machines act far more rapidly than human beings . . . even when machines do not in any way transcend man's intelligence, they very well may, and often do, transcend man in the performance of tasks. An intelligent understanding of their mode of performance may be delayed until long after the task which they have been set has been completed."
Paradoxically, it seems to me that the foundations of my remarks were already laid down more than a quarter of a century ago; this notwithstanding, dispassionate assessment seems no more possible now than it was in the 1960s, not because we do not know enough but because we know too much, and because the polarization between technological optimists and ecological and sociological pessimists has become even more pronounced than it was twenty-five years ago.
One of the most striking qualities of the literature presenting the opposing judgements and opinions of the social sciences on ecology, on the one hand, and on technology and automation, on the other, is the variety of totally conflicting findings reported by various authors and schools. Technology is described both as a great liberating agent and as a device which will reduce human freedom. The findings on changes in tasks, job profiles, relations among employees, the way work is supervised, career paths, and the structure of organizations -- and again on the role of management and of "designers" -- contrast dramatically according to which authors one reads, leaving the reader with a growing confusion, reflected in the vagueness of the terminology employed to define the "Society of the Future" and its nature, structure, and purposes. Nonetheless, "new horizons" appear as these "great debates" on the nature and the consequences of technology settle down: the role, both real and phantasmatic, of "ecology" is already increasing dramatically and in the next few years will expand in importance, requiring some substantial mutations in the nature, the scope, and the aim of ecology itself.
If one of the main characteristics of the new technologies is their pervasiveness and their adaptability, then office work -- as defined earlier -- becomes the dominant and structural form of work in post-industrial societies and the risk society. The fast technological shift from human- or machine-mediated work to computer-mediated work leads to changes in how workers adapt to their particular work environment. Simultaneously with these technological shifts, even more important societal and cultural transformations appear, reflecting the change from a society based on manufacturing and producing material goods to one based on a service economy, i.e., non-material goods (from industrial to post-industrial to "largely symbolic" and "risk-centered" societies). Under these new conditions, ecology suddenly becomes a more complex and multidisciplinary operation than it has been in the recent past. Time and technology now become two of the central foci of the new science of ecology, arising from interdisciplinary research with contributions from many different fields; at its center, and as one of its goals, is the management of the architecture of complexity, the most important and vulnerable element of a society increasingly dominated by computer-mediated work and immateriality. Take, just as a token example, the emblematic case of the Three Mile Island accident: since then, all nuclear power plants have been provided with a "Technical Support Center," where experts in very different areas (from plant designers to experts in the management of environmental disasters) meet and work together with the control-room operators in the case of a serious anomaly in the functioning of the plant. These people hardly ever see each other before an accident (except during infrequent training sessions) and must reach a high level of coordination in a very short time. It is obvious that in this case social competence and the "cognitive-organizational load" play a crucial role.
Naturally, not all unfamiliar events in the field of automation reach these levels of risk, nor are they all so emotionally demanding. Nonetheless, there are many analogous situations, and their number is growing steadily.
In conclusion, for both social science and ecology, the real problem is much larger than making workplaces more productive or humans more "connected." The real problem is to decide right now what human qualities and freedoms are worth fighting for and which ones are not. Because when we are through with this digital transformation there won't be a stone left standing on another stone of our social edifice. There may still be a future, but there won't be a place to put it.
Marco Diani is a Professor and Senior Research Fellow at the Centre National de la Recherche Scientifique (CNRS) in Chambery, France. He is the author of four books, including The Immaterial Society (Englewood Cliffs, NJ: Prentice Hall, 1992) and, with Catherine Ingraham, Restructuring Architectural Theory (Evanston, IL: Northwestern University Press, 1989). During the last two decades, he has served as a Visiting Professor at a dozen institutions including Massachusetts Institute of Technology, Harvard University, the University of Chicago, Northwestern University, and the Istituto di Psicologia in Rome. Please direct correspondence to: Marco Diani, Senior Research Fellow at CNRS; ERA-European Research Agency; 180, rue du Genevois F-73000 Chambery France. <firstname.lastname@example.org>
|©1996-2007 RCCS ONLINE SINCE: 1996 SITE LAST UPDATED: 12.10.2009|