Revised April, 1999
Part One. Information Technology and Society
Chapter 1. Basic Concepts
To understand the way computers and communications technologies
are contributing to the transformation of the postmodern world,
we must understand the social context in which they
are created and used. In order to do this, we will have to begin
with some basic concepts of information, society, and technology.
Information is not just one thing. It means different things to those who expound its characteristics, properties, elements, techniques, functions, dimensions, and connections. Evidently, there should be something that all the things called information have in common, but it is not easy to find out whether it is much more than the name. If we have failed and are still at sea, it may be our fault: Explorers do not always succeed in learning the language of the natives and their habits of thought (Machlup and Mansfield, 1983:4-5). In the following discussion, that something called information is defined in a way that is useful for understanding the social consequences of computers and communications technology.
A computerized phone book (such as those used by directory assistance operators) displays a page when a person's last name is typed in. When the operator selects the desired number, the computer plays a recording of the digits. Information is transmitted to the caller, but no new information is produced. In order to make information, new connections must be made among data. This can be done by physically rearranging the data, as when a person or a computer sorts records into some kind of meaningful order. As another example, if we put data on people's smoking habits together with data on lung cancer, we can produce information about the risks of smoking.
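The smoking example can be sketched in code. The sketch below, with invented names and data, shows how joining two data sets on a shared key produces information that neither data set contains alone:

```python
# Two separate data sets, keyed by person (all data invented for illustration).
smoking = {"Alice": True, "Bob": False, "Carol": True, "Dan": False}
cancer = {"Alice": True, "Bob": False, "Carol": False, "Dan": False}

# Connecting the records person-by-person creates new information:
# the rate of illness among smokers versus nonsmokers.
smokers = [p for p in smoking if smoking[p]]
nonsmokers = [p for p in smoking if not smoking[p]]

rate_smokers = sum(cancer[p] for p in smokers) / len(smokers)
rate_nonsmokers = sum(cancer[p] for p in nonsmokers) / len(nonsmokers)

print(rate_smokers, rate_nonsmokers)  # 0.5 0.0
```

Neither dictionary alone says anything about risk; only the connection between them does.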
The expression garbage in; garbage out applies to computerized information in several ways. If the data are not valid, no amount of careful organization can make them represent reality. If the data are valid, but our arrangement is at fault, then we will not have accurate information. Finally, if we have valid data organized to provide excellent information about things of no interest or value to us, we have not contributed positively to our knowledge.
this dog (a particular object)
all dogs (a class of similar objects)
the word dog (a symbol representing a set of properties that defines a class of objects)
concrete nouns (a set of words defining properties of classes of objects)

High-level computer languages allow programmers to refer to complex sequences of machine instructions or arrangements of data with commands that make sense in human conceptual terms. Advanced information processing systems allow users to create useful structures of relationships among different levels of information. At higher levels of abstraction much of the detail of data is lost, but important information is preserved. A road map, for example, preserves relationships among routes and distances between towns. It does not contain all the detail of an aerial photograph, but is much easier to read.
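The point about high-level languages can be illustrated with a short sketch (the distances are invented): a single high-level command stands for the same sequence of steps a programmer would otherwise spell out one at a time.

```python
distances = [12, 30, 7, 45]  # distances between towns, invented data

# Low level: explicit step-by-step accumulation, close to machine instructions.
total = 0
for d in distances:
    total += d

# High level: one command, in human conceptual terms, expresses the same operation.
assert total == sum(distances)
print(total)  # 94
```

The high-level form loses the detail of how the sum is computed, but, like the road map, preserves the information that matters.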
Another form of information hierarchy is a metalanguage, such as SGML (Standard Generalized Markup Language, http://www.sil.org/sgml/sgml.html). Metalanguages provide a set of standards for the formal description of other languages. SGML defines a descriptive markup language for electronic documents. HTML (HyperText Markup Language) is an application of SGML used to write web pages like this one. For an example, use the view source option on your browser to look at this page.
Hypertext and hypermedia were data base architectures of growing importance in the 1990's. These are organized to connect text, graphical images, film, and sound data in ways that permit computer users to navigate through complex networks of information. An example of hypertext is the Electric Cadaver data base in use at Stanford University's School of Medicine. Students can browse through x-rays and slides, look up related text, and even simulate the manipulation of body parts.
The explosive growth of the World Wide Web in the late 1990's has created the potential for linking most of the world's information. It represents a serious challenge to the designers of search engines and data bases to move beyond keyword and page metaphors for information storage and retrieval. The web also represents a social challenge to our ways of creating, sharing, and controlling information.
For all the history of grief
An empty doorway and a maple leaf
(MacLeish, 1962:51)

In his capacity as Librarian of Congress, Archibald MacLeish expanded its collections to include film records of the Great Depression. These photographs of impoverished farm families, some of which can be seen in James Agee and Walker Evans' Let Us Now Praise Famous Men (1960), are powerful examples of visual symbols.
Computers can now make it easier for people to preserve culturally important
images. Work in the field of computer vision has expanded the capacity of computers to
process video input. Developments in pattern recognition programs can now
recognize letters, numbers, geometrical shapes, and even fingerprints. But computers
have not been very successful at
handling abstract symbols, especially those referring to the emotional
qualities of human experience.
Sacred and Secular Knowledge

Historically, our culture has
conceptually divided reality into the realms of the secular and the sacred.
The secular is the ordinary reality of science and everyday life; the
sacred is the realm of religion, magic, and the supernatural. As
scientific understanding developed, the domain of the secular expanded to
include first astronomy and physics, then chemistry and medicine. With
each expansion, there was a social struggle to replace mysterious
explanations with rational ones.
For most of the history of human knowledge, the secular and the sacred were not so differentiated. Myths traditionally explained the relationship of human beings to time, life, death, and the sacred (Campbell, 19**). According to Giorgio de Santillana and Hertha von Dechend (1969), myths were the form in which our earliest scientific knowledge was passed from generation to generation. Although a myth has the form of a story, it is a high-level symbolic expression of the workings of the universe.
Myths today still have the power to explain the human condition in
symbolic terms. An example is the Greek myth of Prometheus, who was
punished for bringing technology to humanity. Although we do not believe
the story to be true in the scientific sense, the Promethean theme has been
used by scholars to symbolize the unforeseen consequences of technological
change. In applying it to computers, Patricia Warrick (1980) argues that
the myth is a warning that nature has placed limits on humanity's ability
to create and control.
The Two Cultures
C.P. Snow (1963) introduced the term "the two cultures" to describe the twentieth-century split between people seeking
knowledge through scientific inquiry and those interested in knowledge of
religion, the arts, and the humanities. Although cultural information is
not neatly divided between the two, there can be misunderstanding between
technically and humanistically oriented people, as if they came from
different societies rather than sharing the same culture. Humanistic
knowledge is a way of understanding what the world means. It is a source
of wisdom based on our society's whole range of experience. Scientific
knowledge is an understanding of how the world works, validated through a
careful process of experimentation. We expect it to change over time.
Although cultural symbols also change, we tend to experience meaning as
absolute truth validated by inner faith.
Although no one can know their entire culture, people who learn only
one way of understanding are ignoring an entire dimension of the world. If
they knew only how things worked, their lives would be without meaning. If
they were entirely ignorant of technological culture, they would find
modern society full of mysterious phenomena and incomprehensible machines.
When the two cultures are merged, we find people who are intrigued by both
the how and the why of the universe. Some modern scientists and poets have
bridged the gap between the two cultures (Hoare, 1987). Today in the
computer field there are philosophers, musicians and historians. There are
also a growing number of applications of computers to the work of writers,
artists, and performers. Computer science, by adding a new dimension to the
question "What does it mean to think?" is making a new contribution to an age-old philosophical question as well as expanding the realm of scientific knowledge.

Computers and Cultural Values
The selection criteria that define what
information is relevant to human purposes are part of our culture's values.
Values are the oughts and shoulds of society. American values include
respect for individuals, freedom of speech, property, and equal opportunity
for all. We also value things like automobiles, health, money, and fresh
air. Sometimes, as in the case that we ought to be able to drive
automobiles and we ought to be able to breathe unpolluted air, cultural
values are contradictory.
Cultural analysts agree that computers themselves are highly valued in American society, and that this evaluation will have consequences for the rest of culture. Sherry Turkle (1984) predicts that the experience of using computers will cause us to devalue calculation and logical reasoning. In other words, the ability to calculate and reason logically will become less important for people as it is done more and more by machines. Instead, she finds computer users placing higher value on emotion and feelings to define what it means to be human. Daniel Bell (1980b) believes that information will become more highly valued, with the ability to use it becoming our most important skill. Joseph Weizenbaum (1976) has suggested that computer-based data will become so important that we will neglect our cultural traditions and fail to explore new nontechnological areas of human experience. Echoing the theme of the Promethean myth, he fears our fascination with the power of the computer to let us design and control imaginary worlds will lead us to tragedy in the real world of social cooperation and conflict.
Information measurement in the computer field owes a great deal to Norbert Wiener (1948) and Claude Shannon (1948). They defined the quantity of information in a system as a statistical measure of its organization. Shannon defined the information content of a message in terms of the probability of its being transmitted. Improbable signals contain more information than highly probable ones. For example, if someone tells you something they've told you many times before (and that you expect them to keep on telling you over and over), there's not much information in the message. From this perspective, a new artistic expression has more information than a repetition of a traditional cultural form.
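Shannon's idea can be made concrete with a short sketch: the self-information of a message, measured in bits, is the negative base-2 logarithm of its probability (the probabilities below are invented for illustration).

```python
import math

def information_bits(p):
    """Shannon self-information, in bits, of a message with probability p."""
    return -math.log2(p)

# A message you fully expect to hear carries almost no information...
print(information_bits(0.99))   # a small fraction of a bit
# ...while a surprising, improbable message carries much more.
print(information_bits(0.01))   # several bits
```

A fair coin flip (probability 0.5) carries exactly one bit, which is why the bit is the natural unit of this measure.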
Shannon's approach to measuring information is now most commonly used in the fields of telecommunications and electronics. The signal-to-noise ratio can be used as a measure of information. Noise is the highly probable, randomly-generated part of the transmission (for example, the static during a telephone conversation). The signal is the non-random, information-bearing part of the transmission (for example, the voice you are listening to in a noisy room). In studies of information transmission, the focus is on the speed and accuracy with which data can be communicated through a variety of electronic media. This approach is invaluable for the development of computer hardware and communications software. It does not, however, really consider the meaning of information to humans. Thus linguists have criticized Shannon's definition by pointing out that a sentence like "Fred is a dog" contains more information than the less frequently heard sentence "Fred is a mammal", where Shannon's theory predicts it should contain less information. However, his theory was intended to deal with message transmission, not with symbolic meaning. Nor is it suggested in Shannon's theory that more data transmission through computer networks will automatically add to our cultural knowledge.
When we pick up a room, we scan the situation, locating objects in space and comparing their distribution to our mental pattern for "clean room". We select each object that is out of place and put it where it belongs. As we identify, select, and relocate objects, we are using information to identify objects and feedback to observe and control our cleaning activity. As we work, the arrangement of objects in the room gets closer to our mental goal. To appreciate the way you use information to create order, try cleaning a room in the dark. Unless you are blind and used to identifying objects by touch and sound, you may find it quite difficult in the absence of visual information.
Computer novices often have similar problems keeping track of their files. Without the visual feedback they are used to from books and papers, they have trouble imagining "where things are" in the computer. Even experienced programmers find it helpful to draw "pictures" of their data structures. This illustrates the indispensability of mental concepts. We cannot create order unless we have in our minds a set of criteria for selecting and arranging the objects we are trying to organize. These nonrandom mental criteria for identification, selection, and action are themselves a form of information stored in the biochemical processes of our brains. In the computer, they can be made part of information processing software or hardware. In writing software or building hardware, they are an essential element of design. The entropy concept also underlies the need for computer hardware and software maintenance. Computer systems will become disorganized unless we continue to use energy and information to keep them functioning properly. Although maintenance jobs are sometimes viewed as unexciting, they are a large part of computer system costs and an essential ingredient in their success (Couger, 1985).
A social fact is a cultural phenomenon that has consequences for human behavior. Cultural values, for example, are social facts. So is the common sense wisdom of "what everybody knows". Social facts may be true in the scientific sense, for example if students majored in computer science rather than history because they believed that starting salaries were lower for historians. Often, however, social facts are not based on accurate knowledge. As an example, so many people believed that AIDS could be spread by casual contact that individuals with the disease were fired, evicted, and banned from school. Our scientific information indicates that AIDS is spread only through sexual or direct blood product contact, yet the social fact of erroneous medical knowledge produced very real patterns of fear and discrimination. When we explain something using social facts we refer to the way human society is organized and what people believe, rather than looking at physical phenomena.
Although the astronomers and navigators of Columbus' time had good scientific evidence that the earth was a globe, his expedition suffered from public fear that ships would fall off the edge of a flat earth. Among the social facts about computers are:
Besides the ethical and legal problems involved in the above example, it should be clear that the cost of making an information product is not proportional to its size, but depends upon how much effort and expense is involved in locating, selecting, and arranging the data. Once made, an information product can be copied at little cost. Size is usually directly related to the costs of reproducing information products, and it is often exponentially related to the time it takes to search for information in a library or data base. It is not, however, a good measure of the original cost of making the product.
The value of information products to consumers depends upon what they want to know and how difficult it would be for them to get the information elsewhere. Because people expect some kinds of information to be freely available, they may resist having to pay for it. If the costs of information products are high, people will be tempted to copy them, feeling that "stealing" information doesn't really harm the original. As we will see in Chapter 8, the difficulties with protecting information property present a new challenge to our legal system.
The field of computer science known as artificial intelligence, or AI, involves the design of computer programs and automated equipment, such as industrial robots, with a limited capacity to behave in ways that at least resemble human thought processes (for a technical survey, see Barr and Feigenbaum, 1982, Hayes-Roth, 1983, or Coombs, 1984; for a sympathetic popular history, see McCorduck, 1979). Information from the outside world can be sought, interpreted, and used as the basis for "heuristic" decisions which in humans would be called "best guesses." The programs can, within the narrow range of the world to which they are applied, draw inferences, suggest solutions to previously unsolved problems, select relevant information according to their own internal criteria, and modify their own behavior as a result of the outcomes of their previous actions.
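The kind of rule-based inference described above can be sketched in a few lines. The sketch below is a minimal forward-chaining engine with invented rules and facts; real expert systems are far more elaborate, with uncertainty handling and much larger knowledge bases.

```python
def infer(facts, rules):
    """Forward-chain: repeatedly apply rules until no new facts can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            # A rule fires when all its conditions are known and its
            # conclusion is not yet among the facts.
            if set(conditions) <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# Invented illustrative rules, in the spirit of a diagnostic expert system.
rules = [
    (["fever", "cough"], "possible flu"),
    (["possible flu", "short of breath"], "see a doctor"),
]

print(infer(["fever", "cough", "short of breath"], rules))
```

Note how the second rule can fire only after the first has added "possible flu": the program draws an inference from its own earlier conclusion, which is what gives such systems their limited resemblance to human reasoning.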
Automated programming, industrial planning by machine, and mechanization of the professions were topics on the agenda of a 1958 international conference on the emerging field of artificial intelligence (National Physical Laboratory, 1959). In addition to saving labor, managerial control and profitability were among the reasons advanced for why AI should be supported. During the next twenty-five years, artificial intelligence was transformed from academic research projects to widely publicized commercial applications (Feigenbaum and McCorduck, 1983; Hayes-Roth, 1984).
Knowledge is a scarce resource whose refinement and reproduction creates wealth. Traditionally the transmission of knowledge from human expert to trainee has required education and internship years long. Extracting knowledge from humans and putting it in compatible forms can greatly reduce the costs of knowledge reproduction and exploitation...skill means having the right knowledge and using it effectively. Knowledge engineering addresses the problem of building skilled computer systems, aimed first at extracting the expert's knowledge and then organizing it in an effective implementation (Hayes-Roth, Waterman, and Lenat, 1983:5, 13).

The theoretical possibility of representing human knowledge and decision-making processes in computer programs has been fiercely debated on both scientific and moral grounds, with the strongest objections coming from the philosopher Hubert Dreyfus in What Computers Can't Do (1972) and the artificial intelligence expert Joseph Weizenbaum in Computer Power and Human Reason (1976). One important issue is the degree to which human decision-making is believed to be rational and logical. Intelligent software has been most successful for those applications in which the knowledge of human experts is very well understood and rather routine. Critics of knowledge engineering doubt that computers can actually be designed to handle any but the simplest symbolic meanings.
While the debate between those who argue that machines can think and those who argue that they can't continues (Boden, 1977; Haugeland, 1981), the practical success of "intelligent" programs which play chess, infer chemical structures from molecular data, and diagnose illnesses indicate quite clearly that artificial intelligence is being "put to work" at industrial and professional tasks, despite the reservations of many theorists.
The most ambitious practical proposals of the 1980's involving expert systems were those for the new 5th generation "supercomputers" (Feigenbaum and McCorduck, 1983; "Supercomputers: The High-Stakes Race To Build a Machine that Thinks," 1983). Promising higher industrial productivity and greater national security, the proposals called for many areas of military and civilian expert decision-making to be turned over to the faster, soon-to-be smarter machines. In his critique of the fifth-generation idea, Weizenbaum questions Feigenbaum's assertion that computers will produce the future knowledge of the world, asking how we are to understand just what information the computer produces and how it produces it (Weizenbaum, 1983). But if information itself is seen as a product made for profit by efficiently organized employees, then information can be produced by the computer in the same way that products were made by the factory machinery of the first industrial revolution.
The development of computer programs called intelligent agents represents a use of expert systems to empower information seekers. Much of their use by corporations is in the areas of marketing and entertainment. It remains to be seen how these programs will be applied in the long run to the work of librarians and other information experts. Intelligent information processing can mean the automation of intellectual work as well as the enhancement of information retrieval.
Matter, energy or information may flow through a system. System processes describe the way components act on one another and on the material flowing through. In the above example, energy flows through the system; the bulb processes electricity to produce light. The function of a system is a description of what it does; our example provides light. The function of a component is what it does within the system; the switch controls the flow of electricity and "remembers" if the light is on or off.
The state of a system is one possible arrangement of parts, with each part in a particular condition. For example, the bulb, switch and battery system has six states:
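The enumeration of states can be sketched in code. The particular component conditions below are assumptions chosen to yield six states (two switch positions times three battery conditions); the chapter's own list of six states may decompose the system differently.

```python
from itertools import product

# Assumed component conditions, for illustration only.
switch_positions = ["open", "closed"]
battery_levels = ["charged", "weak", "dead"]

# Every combination of component conditions is one state of the system.
for i, (switch, battery) in enumerate(product(switch_positions, battery_levels), 1):
    # The bulb's condition is a function of the other components' conditions.
    bulb = "lit" if switch == "closed" and battery == "charged" else "dark"
    print(f"state {i}: switch={switch}, battery={battery}, bulb={bulb}")
```

Listing the states this way makes the next point concrete: a model with only these components cannot represent a burnt-out bulb or a short circuit, because no combination of the conditions we chose describes them.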
Once we understand the functions of a system, we can begin to predict its behavior, as when we expect that the bulb will light when we close the switch. However, even in the simplest of systems like the one above, we cannot predict all of its behavior (such as the conditions under which we can expect short circuits and burnt out bulbs). This is because our conceptual model oversimplifies the system we are observing and fails to take "everything" into account. This is especially so when we have ignored inputs to our system from the "outside" (how did the battery get charged?) or fail to understand lower-level processes (how does the electricity come from the battery and move in the wires?). Dead batteries and short circuits are only understandable if we know more about the situation than we have modeled here.
In large systems there are so many possible states and transitions between them that we cannot predict, except in probabilistic terms. For some large systems that can be formulated mathematically, we may build computer simulations or solve mathematical equations to make predictions. For less well-defined systems, like weather patterns or societies, the mathematics of systems analysis is very difficult to apply. For these, the concept of a system remains a useful aid to thinking, but does not often provide a method of quantitative analysis.
The social constructionist perspective (symbolic interaction theory) describes and predicts how social structures are created out of the interactions of individuals acting within the social system. This is in contrast to other theories, which see human behavior as determined by the external constraints of the social environment.
The exercise of power requires information. We cannot influence people unless we can communicate with them. We cannot offer them material rewards to do our bidding unless we can come to an understanding about the exchange. We cannot even forcibly move people or objects without knowledge of where they are vulnerable to our efforts. Planning long-term actions requires procedures to gather new information, evaluate it in terms of shared goals, and use it in choosing a course of action.
Although coercive power, or force, occurs in social interactions where the will of one person or group is imposed on the unwilling, most power in interactions is of other sorts. We may be influenced by others because we like or respect them. Or we may do what they ask because we think it is legitimate (right or legal) for them to give us orders. This is called normative power, named after norms (the unwritten -- often even unspoken -- rules for how to behave in specific situations). Wearing clothes in public, pausing in a conversation so that someone else may speak, and not eating one another are all examples of norms. We tend to think of norms as human nature, but children have to be taught to dress, not to interrupt, and not to bite. Norms are based on more general moral and ethical principles: cultural values.
Computers are sources of power for those who use them to manage information. In some cases individuals or groups can use computers to increase their power at the expense of others. In other cases, the use of computers can make it easier for people to negotiate and reach decisions. The uses to which computer power can be put are the subject of the remaining sections of this book, especially in the concluding chapter where their effect on social decision-making is explored.
Some norms for roles are formal rules. In the baseball example: "Third base players may not pitch to the batter." Others, such as: "Shortstops cover second base when the second base player goes after an infield ground ball," are informal rules. In the case of formal rules, special social positions often exist (for example baseball umpires) to make judgements and enforce expectations. In the case of informal rules, people apply social pressures (for example, dirty looks, praise, or a shove) to keep others behaving properly. Norms are essential to cooperative human activity. Social interactions to enforce norms are one part of the process of social control. Cooperative forms of social control (such as making sure a church congregation behaves reverently) are generally based on symbolic communication more than on force or economic power.
Institutions like business, government, and the military make up the economic and political structure of society. Law, government, and the other institutions supporting a democratic political process in society perform several functions. They are the way we make decisions affecting all of us, the way we allocate our public resources, and the way we establish official agents of social control. As discussed in Part Four of this book, these institutions are changing as we introduce new information technologies.
Economic institutions produce and distribute society's material goods and services. Computers and communications technology are being used to redefine the tasks expected of employees. Computer applications have begun to alter business management, product design and marketing, and financial record keeping. Computer technology is also being used to alter the basis of our economy -- property.
Computers affect property relationships in two ways. First, information production is changing the kind of industries we have and the sorts of jobs available. Because some individuals and companies are better able to take advantage of these new economic opportunities, there will be some changes in society's distribution of wealth. The issue of how information is to be used is a second way computers affect property relationships. The democratic social values of privacy and freedom of information are often in direct conflict with our concepts of personal and corporate intellectual property.
In societies where individuals can choose their jobs, their religious and other group memberships, and can raise or lower their social rank through education and effort, a person's place in the stratification system is only partly inherited. During the social mobility process people rise or fall from the status they received at birth. Where mobility is possible, the institutions of family and education are where people acquire the skill and training to be "successful" or are judged "failures".
The computerization of work will probably be the major mechanism by which the computer alters social stratification. Because so much of a person's social status in modern societies depends upon his or her occupation, changes in the types of work people do (especially if there are corresponding changes in wages and salaries), can drastically change the stratification system. If many new jobs are created at the "top" of the social structure, more individuals will have opportunities for success and status. If, on the other hand, new jobs are created at the low-wage "bottom", it will be more difficult for individuals to gain social status. If computers are seen as appropriate for use mainly by men, status opportunities for women could be restricted. If educational institutions provide computer science education mostly to middle class children, poor and minority children could experience even greater barriers to occupational success.
The purpose of tools can be as specific as the zax (used for punching holes in slate roof tiles) or as general as a rope (with thousands of uses, from walking a dog to putting up a flag). Tools are often used as extensions of the human body to gather information about and to manipulate the physical world. Microscopes and telescopes extend our vision; hammers and space probes extend the reach of our hands. Tools like cameras or tape recorders store sensory information; tools for writing and painting allow us to make a durable record of our inner ideas and visions that can be shared with others. Information storage media, from stone carvings to data bases, facilitate the communication of information from person to person and from generation to generation.
Computers can be very specific tools (for example, to regulate an engine's performance) or very general-purpose tools such as the programmable digital computer. Although some people still consider the computer to be useful only for computation, computers are tools for communication and control of all types of information. Analog computers handle non-digital processes (like monitoring an electric current or the temperature of a room); multimedia capabilities enable us to process visual and audio information; peripheral devices such as remote sensors can process air pressure, chemical composition of the atmosphere, and a host of other data.
As an information processing tool, the computer's major characteristic is the speed with which it processes extremely large quantities of data organized in complex ways. Although computers are popularly noted for their perfect accuracy, all large and interesting computer systems are prone to error. Hardware and software bugs, human errors in data entry, and the built-in possibilities for less than perfect performance (such as the ability to "guess" or "forget" that is a feature of the heuristic programs used in artificial intelligence) mean that computer technology is not the way to perfection. For many applications, however, computers offer more efficient means of performing tasks than previous methods.
The control over geographically dispersed information is an extremely important feature in business and military applications, as well as in the communications industry. Computer technology provides us with remote controlled extensions of our bodies. Remote sensors used in satellites and space probes extend our ability to gather information on subjects as diverse as the vegetation of Africa or the rocks of Mars. With telecommunications equipment we can hear from any part of the earth and far into space. Via robotics, we can work from a safe distance on the ocean floor or with hazardous chemicals. Also, and more dangerously, computerized weapons have vastly extended our ability to throw deadly objects at one another.
Computer-based decision-making is at the heart of the integrated software systems now being designed for industrial and military uses. These systems coordinate decisions from the purchase of raw materials through automated plant operation to customer billing. Although the expression "computers only do what you tell them to do" has become almost a folk saying, decision-making by computer is becoming increasingly sophisticated.
Perhaps the most striking characteristic of the computer is its extension of the human mind; both our memories and our abilities to calculate and reorganize information have been enhanced by computers. Edward Feigenbaum (reference) believes that we will enter into a "partnership" in which computers perform calculation and memory functions, while humans exercise their analytic capacities. The danger in this, expressed by Joseph Weizenbaum (reference), is that we will neglect those areas of human judgement and reason which cannot easily be computed. He fears that the new doors opened by computer extensions of the mind will close other, more important doors of human thinking.
Some people learn to use computers in ritual ways. Without necessarily understanding what they are doing, they go through a sequence of steps to make a computer "magically" respond. Computer technology appears to them as one of the mysterious forces of the universe. Although it is still possible to transmit technique through ritual, there are more effective ways to learn to use a computer. Also, these private computer user rituals lack the social dimension of shared meaning that makes public ritual a continuing element of human culture.
Popular explanations of creativity often equate it with the free expression of unconscious impulses, with mysticism, or even with insanity (Becker, 1978 ref). Misunderstandings of brain function lead some people to erroneously assume that creative people use the right half of their brain (the part that usually controls the left side of the body) while analytical people are "left-brained" (Calvin, 1983 ref). Instead, creative people seem to be able to use their whole brains effectively. Creative designers imagine gestalts -- whole, complex patterns -- that can be translated into real-world materials and shared cultural symbols. The artist Michelangelo wrote that he "saw" his sculptures in the stone and only had to take away the extra material around them. Karl Marx said that the difference between an architect's building and a bee's hive is this human ability to build in the imagination.
The effects of computer-aided design are controversial. Proponents argue that CAD frees designers from time-consuming drafting and calculating chores, enabling them to try out more imaginative designs. CAD critics question whether such programs really encourage human creativity or whether, like Lego sets and coloring books, they limit the range of possible plans. The effects of CAD software on design are similar to the effects of standard parts on a craftsman's work. If a cabinet maker's standard parts include screws and nails of certain sizes and boards of different thicknesses, they may have no negative impact on his or her ability to design furniture (and they save the tedious labor of cutting trees and forming metal parts). If the cabinet maker's standard parts are preformed cabinet pieces that merely have to be assembled, however, little original design is possible.
Lying is a very human phenomenon. We often present ourselves to others as nicer, smarter, more attractive, or more competent than we secretly feel. People deliberately distort information for their own advantage or to avoid hurting others' feelings. Although we have strong norms against lying to gain power over others, we expect "white lies" in polite conversation. We often say "I'm fine, thanks. How are you?" when we feel terrible.
Lie detectors, according to a review by the Office of Technology Assessment (Saxe, 1986 ref), do not detect lies. They detect the physiological changes that occur when we are emotionally stressed. There are many sources of such stress besides guilt or fear of being caught at lying. If someone lies without guilt or fear, the technology detects nothing, while an honest answer to an embarrassing or disturbing question will show evidence of stress. Yet some people treat lie detectors as if they were a technology to reveal the truth in others' minds without our having to go through the social interaction processes that establish trust in one another. Trust based on social interaction has been replaced by trust in technology.
At its best, the use of computer technology will give us new power to cooperate and realize common purposes. At its worst, the relationships between people and computers will be substituted for social ones. But before going on to examine in detail the effects of the human/computer interface, we must take a closer look at the process of social change and the question of why people began to use computers at all.