PRIVACY AND SURVEILLANCE IN COMPUTER SUPPORTED COOPERATIVE WORK

Privacy as freedom to be left alone has been distinguished from privacy as freedom from the intrusions of formal institutions and authorities. Analysts of the effects of computerized surveillance in the workplace tend to focus on the latter, neglecting the social characteristics of privacy as a small group phenomenon. Privacy is more than the ability to keep information about oneself away from authorities who might use it to control one's actions. It is also the ability to negotiate mutual demands on time, space, and behavior.

Groups depend on information about members to create solidarity and commitment, develop standards of conduct, and perform group tasks. In exerting social control, informal groups develop unwritten agreements about how much and what kinds of freedom members will have. Individuals agree on how to avoid intrusions on one another's time, space, and behaviors without interpreting "being left alone" as isolation or rejection of social relationships.

When groups become more formal, bureaucratic structures of surveillance by persons in formal positions of authority tend to replace the informal negotiations of small group members in defining an individual's sphere of privacy. But even under formal rules of work, organizations developed informal agreements about actual behavior which differed substantially from job descriptions and company regulations. Professional and technical employees in particular were able to avoid the close formal surveillance applied to factory or clerical workers.

The negotiation of privacy in computer supported cooperative work is essentially a small group political process involving critical issues of how organizational power and autonomy are distributed. Computer surveillance, by embedding relationships of power and authority into computer systems, undermines negotiated privacy by reifying formal rules for workplace behavior and distorting basic communication processes.

Workplace privacy cannot be guaranteed by software designers, although they can be concerned with supporting the privacy negotiation process and with avoiding the potential for abuses. Designers and implementers of CSCW systems should keep in mind that privacy is not merely absence of information or isolation from interacting with others. Privacy in small groups is the result of a dynamic process in which people manage mutual intrusions and avoidance while remaining connected to one another. If privacy issues are viewed only as relationships between individuals and formal authorities, the effectiveness and quality of life of working groups will be undermined. As an example of this, consider the difference between what people say to one another in a project meeting among themselves and in their formal progress report.

PRIVACY AND SURVEILLANCE IN COMPUTER SUPPORTED COOPERATIVE WORK

In considering the effects of computerized surveillance in the workplace, analysts have tended to make two assumptions which obscure complex relationships among surveillance, privacy, and group processes. When privacy is implicitly defined as the absence of organizational surveillance of individuals, the importance of mutual surveillance for the functioning of small groups is ignored. When group work is approached as if it were a matter of coordinating the tasks of individual group members, the social processes by which small groups form solidarity and exert informal social control over members' behavior are ignored. Designers of computer-supported cooperative work (CSCW) systems are beginning to create software to facilitate groups of people working together on the same task, rather than simply coordinating groups of individuals working on separate tasks. (footnote 1) In doing so, they have inadvertently created new possibilities for organizational surveillance of employees. (footnote 2) In addressing workplace privacy concerns, it is important for CSCW designers to appreciate the way these issues affect groups, not just individuals.

PRIVACY

Privacy as the freedom to be left alone by other individuals and by social groups has been distinguished from privacy as freedom from the intrusions of formal institutions and authorities into personal life. Analysts of computerized surveillance in the workplace have tended to focus on the latter, neglecting the social characteristics of privacy as a small group phenomenon. Privacy involves more than the ability to keep information about oneself away from authorities who might use it to control one's behavior. It also involves the ability for individuals, small groups, and organizations to negotiate mutual demands on time, space, and actions.

According to Barrington Moore, Jr. (1984), privacy involves both the right for individuals to be left alone by others (family, neighbors, co-workers, etc.) and freedom from the intrusions of formal bureaucratic institutions and authorities. The former is guaranteed in all societies through social rules that specify the situations in which people's behavior is to be ignored -- in other words, when they are to be free of informal social control. Often this does not mean isolation from others, just being left alone. The latter sense of privacy is formally established in law in modern democratic societies and defines limits on surveillance and institutionalized social controls (such as the police). There is also a third sense of privacy in which social groups are left alone by formal authorities and are free to pursue their own goals, including exerting social control over their members. This is similar to what Ferdinand Schoeman (1992) refers to as the "context" in which privacy is an issue. Families and religious groups have traditionally had this sense of privacy in the U.S., although it is now being eroded in such areas as child abuse. This third sense of privacy seems particularly relevant to the topic of computer supported cooperative work, since working groups need some autonomy from their organization in order to engage in the processes by which small groups form and influence their members' behavior.

Privacy as Freedom to be Left Alone

The desire for "protection or escape from other human beings, emerges when an individual becomes subject to social obligations that that individual cannot or does not want to meet" (Moore 1984:268). Pre-industrial societies evolved a variety of mechanisms by which individuals could temporarily, and in more or less socially accepted ways, avoid the demands of their communities. By our own standards, the opportunities for privacy were often limited -- all bodily functions, for example, might be performed in front of other people. "Being sick" is a modern example of privacy in this sense. If we are ill, we can stay home and avoid social interaction. However, the urgings of our friends to "get well", the invasion of our body by medical personnel, and organizations' insistence on "a note from your doctor" are social control mechanisms that limit our freedom to be privately sick in the interests of getting us back to our normal social roles (Parsons 1951: chapters 7 and 10).

When our right to be sick whenever we choose is interfered with by the health care system and by our employer, we experience invasion of privacy in Moore's second sense. This form of privacy involves the concept of individuals' rights against external authorities. In modern societies people seek privacy from organizations and institutions as well as from one another. The privacy debate over computer use for government and corporate record keeping is about this second sense of privacy -- the right of individuals to be left alone by their government and by the economic organizations of their society. This right of privacy is often in direct conflict with the norms governing our obligations to work and to obey society's laws and regulations.

The legal concept of privacy in the United States was based on the idea that the individual, rather than the group, is the basic unit of society. (footnote 3) In 1890, attorneys Samuel Warren and Louis Brandeis published a landmark law review article extending the common law right to life to "the right to be let alone" to enjoy life. The right to property was extended to the right to own intellectual property -- including information about one's personal life. Contemporary analysts have located privacy in the Bill of Rights of the U.S. Constitution. The right to exchange information is based on First Amendment freedom of speech. Private telephone conversations are protected by the Fourth Amendment's ban on unreasonable searches and seizures. The 1973 Roe v. Wade decision extended Fourteenth Amendment protection to personal privacy in intimate decisions. To U.S. Senator Sam Ervin, "privacy is a catchword for the control that the individual exercises over information about himself" (1983:160).

Privacy as Control Over Information about Oneself

Ownership and control are attached to information as with anything else -- through possession legitimated by law and custom. We are now going through a period of social and technological change in our fundamental conception of intellectual property that may transform the social definition of information as radically as the emergence of private ownership transformed land use in the centuries before the industrial revolution (Perrolle 1987). Our concern about surveillance and privacy in CSCW is only a small part of our response to great change in the conditions of work and life in contemporary societies. By asserting our right to "own" information about ourselves, we have already agreed that information is a form of property. And most of us recognize the rights of employers and governments to exert control over employees and citizens. The source of our anxiety is not the principle of formal institutional surveillance and control over us -- that is implicit in our acceptance of life in a modern capitalist nation. Rather, we are upset by the technological possibilities for a vastly expanded exercise of corporate and government ability to keep track of us.

Culture, defined as the entire way of life shared by a people, includes implied social contracts as well as formal rules and laws. Much of a culture's content is public domain information -- ideas, beliefs, languages, history, scientific knowledge, and so forth. Smaller groups have subcultures with their own shared body of knowledge and rules of behavior. By tradition, some personal information about individuals is the public property of their societies or subcultures. For example, few people would consider private their physical location while crossing a street. Most of us agree that this information should be freely available to drivers and other pedestrians without our permission. This sort of surveillance supplies the information we must have about one another in order to live as social beings. Different cultures and subcultures have developed quite different implied social contracts regarding the personal and the public. Some families, for instance, give one another considerable personal privacy; in others, individual actions and communications are subject to constant scrutiny.

Organizations have subcultures that define implied social contracts for employee and management behavior. (footnote 4) Yet, because these corporate cultures differ among themselves, it is impossible to anticipate precisely what sorts of intrusiveness and what areas of personal privacy will be desired by CSCW users. This can be seen in the very different results of experiments with active badges. Information about where you are at all times during working hours is defined as public information in some organizations and as private in others. It seems clear that the introduction of active badges into the latter type of organization will be enormously disruptive, with employees objecting to the change in their tacit social contract. In the former type of organization, active badges will provide technological support for information that already "belongs" to the organization and its subgroups. This shows the futility of trying to analyze privacy issues related to new technologies on the basis of the technologies' inherent properties. It is the cultural context that defines privacy, both through the formal negotiations that result in laws, regulations, and employment contracts and through the informal social interactions that produce implied social contracts.

Privacy as Negotiated Access Restrictions

Privacy is a complex, multilayered set of arrangements, both formal and informal, which regulate the exchange of information about individuals and groups. If the purpose of CSCW is to support the actual activities of cooperative groups, some attention must be paid to the way groups negotiate privacy. This involves investigation of at least two levels of privacy: 1) the individual's privacy from the demands of the group and the organization; and 2) the group's privacy from the organization. The former includes, for example, questions about how accessible individuals must be. The latter includes issues such as how company time and resources may be used by workgroups.

The totally private individual does not exist in the workplace, nor in any other formal or informal social organization. Groups of all sorts depend on information about their members' behavior in order to define group membership, to create solidarity and commitment, to develop and enforce standards of conduct, and to perform group tasks. (footnote 5) But in exerting social control over their members, informal social groups develop unwritten agreements about how much and what kinds of freedom members will have. Individuals agree on how to avoid intrusions on one another's time, space, and behaviors without interpreting "being left alone" as isolation or rejection of social relationships. This often involves contradictions in behavioral norms. Since isolating oneself from the group is a form of deviance, members agree in principle not to do so. But in actual practice members tacitly negotiate the circumstances under which failing to interact with others or neglecting to inform others of one's whereabouts or actions is allowed or even encouraged.

Individually Negotiated Privacy.

Individuals frequently negotiate privacy in the workplace by claiming to be busy at some task of higher priority than interacting with managers or co-workers. They can also claim to be engaged in a culturally agreed upon private activity such as going to the bathroom or being on their lunch hour. In the case of the lunch hour, most organizational cultures define such times as free from official intrusions but vulnerable to the social demands of co-workers. Negotiations over "Shall we go to lunch?" or "Can I call you about it on Sunday morning?" are a non-trivial part of how newly formed workgroups begin to establish limitations on members' access to one another. Non-verbal cues (cf. Druckman, Rozelle, and Baxter 1982), such as an office door closed or looking at one's watch or one's work after returning a greeting, are subtle indicators of a desire to be left alone. People tacitly learn and unconsciously interpret these signals as they negotiate mutual intrusions.

In electronic communications subtle cues are lost (Kiesler, Siegel and McGuire 1991), but technology can sometimes be used as a substitute. For example, answering machines can be used as call screening devices to let you know that someone has sent you a message without letting them know that you have just received it. You can then decide whether to be "at home" or not. But, by putting the power to decide if a conversation will take place in the hands of the receiver, answering machines remove some of the social controls on communication. If you don't answer your phone when someone calls, the penalty is that you don't always get the message. With an answering machine you acquire the power to receive information without the obligation of speaking to others. Telephone tag is a new version of the game of who is more important than whom. It is an example of how the quest for individual privacy involves issues of status, power, and autonomy as well as freedom from intrusion.

Group Negotiated Privacy.

Social interaction theorists sometimes refer to group privacy as a "backstage" area in which the group's behavior is not on public view (Goffman 1961). Social interaction in their shared private spaces provides groups with social control over group resources and activities away from the surveillance of organizational authorities. Limiting access by outsiders helps a group define its membership and boundaries. Having their own resources to allocate to or withhold from members helps a group maintain internal social control. Backstage activities, such as making mistakes and wasting company time in socializing, are necessary and inevitable, but they do not appear in the group's frontstage presentation of itself and its completed work. Eating together, planning recreational activities, and personal conversations are important ways small groups develop solidarity and enhance member commitment. Because they are often formally defined as a waste of company time and resources, they must occur within a sphere of group privacy.

It is important to realize that the implied social contracts negotiated by groups often contradict the formal rules of their organizations. For example, whispered asides do not exist in Roberts' Rules of Order. What makes an aside private is not that other people can't hear it but that it is ignored as "not part of" whatever discussion is going on. One thing asides usually do, however, is indicate a lack of respect for whoever is speaking at the moment. They are part of the informal way audiences communicate with a speaker. In most electronic communications systems we haven't really worked out the conversational problem of turn taking, and we seem to have lost the nonverbal indicators of boredom and disinterest. So we are unable to negotiate an informal end to someone's speech using customary body language and noises. If we formalized group meetings completely, individual participants would lose some of their power to reject demands on their time and attention.

The Case of Electronic Mail.

Electronic mail is a particularly interesting case of negotiated privacy. At the societal level, we have not yet formalized a uniform set of access restrictions for it. If it is defined as like paper mail, the privacy of its contents is guaranteed by law. As with e-mail, actual protection of mail privacy depends on other people's respect for it. Since mailboxes are relatively easy to pry open and since many workplaces deliver mail in open mailboxes in a public room, your colleagues may be physically able to browse through your mail. But, since mail is socially defined as your private property, they would hate to be caught doing it. However, if we agree that e-mail is like notices posted on bulletin boards, then it is acceptable for others in our organization to look through it, though it would be unacceptable for a stranger to wander through our organization reading the walls.

The way individuals are expected to respond to e-mail messages is also negotiable. Paper mail takes time and can be answered after a socially acceptable delay. Express mail and Faxes have shortened expected response time. We have lost one means of negotiating extra time, the polite fiction: "The mail hasn't arrived yet." Telephones, which used to demand immediate attention, have been modified by voice mail and answering machines to lengthen expected response time. The time demands of e-mail are currently being negotiated. I have been experimenting with reading my e-mail once a week; under pressure from my colleagues I am beginning to attend to it more often. Through subtle exercise of normative power my colleagues are gradually wearing me down, and I seem likely to join the ranks of the information overloaded. In many organizations electronic mail runs constantly on employees' desks, and new messages beep for attention; I hope that my organization will not be one of them. The decision about what form electronic mail will take at my university will be made by the administration in consultation with computer vendors and a faculty committee. Unfortunately the choice seems to have been defined by all involved as a technical rather than a social issue.

We have also not settled on a single definition of who owns the contents of e-mail messages. Most companies claim ownership of communications on their systems and reserve the right to examine them. Most individuals and small groups are used to being able to have private conversations in the workplace, and resent employer eavesdropping. Attempts to solve the privacy problem in electronic mail design must be sensitive to the social characteristics of privacy and recognize that actual groups negotiate a variety of different solutions. Designers should also realize that trying to make all communications formal and public can inhibit what individuals say to one another, can make it more difficult for groups to function, and can increase the intrusiveness of organizational authorities into individual working life.

A THEORETICAL BASIS FOR NEGOTIATED PRIVACY

The negotiation of privacy in computer supported cooperative work is essentially a small group political process involving critical issues of how organizational power and autonomy are distributed. An absence of public negotiation of goals and behavior is characteristic of authoritarian, intrusive workplaces. CSCW systems that restrict discussions to technical questions of how to accomplish pre-defined group tasks distort privacy negotiations by embedding relationships of power and authority into the software itself.

Reification

Reification, the embodiment of social relationships in objects, distorts negotiation by making the power of those who design, implement, own, and manage computer systems appear as a natural feature of the working environment. Instead of seeing who is exerting what sort of control over the activities of whom through CSCW, we seem to be in a world of relationships between persons and objects. Our privacy is intruded on by technology rather than by people with whom we can negotiate access restrictions.

Individuals and groups can adapt and learn to function in the ways that new technologies require. After all, a species that learned to repress its biological characteristics in order to work on assembly lines and wake to alarm clocks should be able to get used to even the worst of CSCW designs. Our experiences in the workplace can even be transferred to our non-working lives, just as our patterns of leisure and social life have taken on many of the scheduled, goal-oriented characteristics of modern workplace activity. As an example, I once observed the application of a token ring protocol to the activity of a group which had just met at a conference and had gone to a Chinese restaurant. A piece of paper was passed around the table, and each individual added a choice to the list. The usual small group process did not develop very fully. Instead of beginning to learn one another's preferences and interaction styles while observing the spontaneous emergence of possible group leaders, we obtained food efficiently and talked like strangers. Today I cannot remember those present or what we spoke about. Yet I vividly recollect other conference dinners from many years ago where enthusiasm and emotional warmth began a process that resulted in professional collaborations and friendships.

As the process of reification occurs, rational procedures and technologies begin to replace social interactions. The result is a diminished capacity for individuals to commit themselves to groups and for groups to engage their members in collective endeavors. While individuals may appear to obtain more privacy, they obtain it only by being isolated from groups. At the same time, individuals and groups lose much of their ability to negotiate privacy as a part of their working life. If CSCW designers mistake isolation for privacy, the social basis for cooperation will be reified instead of supported.

Non-Distorted Communication

Jurgen Habermas' (1979 & 1984) theory of communicative action views communication as a fundamental basis for society. In what Habermas calls the ideal speech situation, all participants have an equal opportunity to participate in non-distorted, rational discourse. In situations where some participants use their social status or their power and authority to inhibit the conversation of others, distorted communication occurs. Since computer interfaces remove individuals from the physical presence of others, the social context cues to status and power are obscured, reducing some of the means by which distorted communication occurs. Yet computer interfaces can reify unequal social relationships in their design, making power and authority appear as features of a world of objects. When this occurs, opportunities for computer-mediated non-distorted communication are limited (Perrolle 1991).

In order to approach Habermas' ideal speech situation, individuals in a negotiation make four sets of claims regarding their competence to participate. The first type of claim involves a speaker's linguistic competence -- everyone must be making sense in a language all can understand. In computer-mediated communications, difficulties with typing or otherwise using the system may be interpreted by others as indicators of incompetence. Because typing is slower than speaking, CSCW interfaces using video or voice to replace the keyboard enable individuals and groups to converse more competently. Defining part of the CSCW system (such as data bases, interface characteristics, or even the basic design) as outside the area of a group's expertise reduces that group's ability to negotiate competently. A designer attitude that users are incompetent to talk sensibly about technical issues is a rejection of users as competent negotiators of their own privacy.

The second type of claim made in non-distorted communication involves the nature of external reality. In ideal speech situations, speakers interrogate one another to establish claims about what is true. In bureaucratic organizations, truth is located in formal rules and official knowledge. In computerized organizations, data bases and technology often reify and limit the objective world of working groups and individuals. To the degree that CSCW systems limit the universe of discourse subject to negotiation, groups lose autonomy. Besides reducing opportunities for privacy, this can have a negative effect on groups' decision making and can restrict an organization's ability to solve problems outside of pre-defined possibilities.

A third claim that participants in non-distorted communication must trust is that each intends to have a rational conversation, rather than to intimidate or mislead others to their own advantage. In situations where great differences in status or power distort face-to-face communication, the tendency of computer-mediated communication users to focus on statements of fact and not intentions can facilitate social interaction. In the absence of visual cues, the contributions of low status individuals are not automatically ignored. Yet when conversations take place under the control of organizational authorities, some intentions are reified. Research on the perception of intention (Dasser, Ulbaek, and Premack 1989) indicates that people easily attribute intention to objects. Perceptions of CSCW systems as neutral technology with user-friendly intentions obscure managerial intent, such as getting the most work out of employees for the least expenditure in wages and benefits. In other words, any managerial intentions to exert maximum control over employees by surveillance of any lapses in performance will tend to be reified in CSCW systems.

Finally, non-distorted communication must be conducted in socially appropriate ways. Research indicates that computer-mediated communication alters the social norms governing conversation by removing elements of emotion and social control (Kiesler, Siegel, and McGuire 1991). It also provides the possibility of more equal participation by obscuring the visual and verbal status distinctions that give higher ranking or more aggressive people an advantage in face-to-face speech. For example, in face-to-face conversation, women are expected to allow themselves to be interrupted by men (Zimmerman and West 1975). There are similar conversational norms allowing high status people to interrupt low status ones (Molotch and Boden 1985). In circumstances where opportunities for participation are enhanced and opportunities for one speaker to control another are reduced, computer-mediated communication facilitates privacy negotiations. Emotionally-based arguments, which often sway opinions in face-to-face situations, are less likely to influence the outcomes of computer-based discussions. With ordinary mechanisms of social control missing, participants in computer based discussions are more free to develop new and unconventional implied social contracts. The price that small groups pay is a reduction in their ability to develop strong feelings of solidarity.

Surveillance as Reification and Distortion of Negotiated Privacy

Computer surveillance threatens to undermine the social character of privacy in the workplace by providing a technological means to reify large areas of social interaction and distort the small group processes by which individual and group workplace privacy is negotiated. The reification process begins when informal social interactions and implied social contracts are made more formal, more explicit, and more subject to surveillance.

Whenever individuals are being informally watched or listened to, they are somewhat inhibited by what they imagine others' reactions to be. This is how informal social control works in groups. We tend to try not to upset the people around us, especially if they have the means to retaliate in some way. Gossip and other private communications are an ordinary part of both group and organizational office politics. In informal conversation it's one person's word against another's as to what was said and by whom. A private comment can be publicly denied. However, once a conversation gets recorded (for example in a paper memo or in an e-mail message) it assumes a more formal existence. As people begin to suspect they'll be held accountable for their spontaneous utterances, their freedom to express themselves is inhibited. While this may be an improvement in the case of malicious gossip, it also destroys the backstage area where so much of group work is actually done.

When groups become more formal, as has been a characteristic of workplaces since the industrial revolution, bureaucratic structures of surveillance by persons in formal positions of authority have tended to replace the informal negotiations of small group members in defining an individual's sphere of privacy. But even under formal rules of work, organizations developed informal agreements about actual behavior which differed substantially from job descriptions and company regulations. Professional and technical employees in particular were able to avoid the close formal surveillance applied to factory or clerical workers. Many systems now being developed to provide computer support for cooperative work make it technically possible for companies to monitor information about all activities occurring in the workplace or using the company network, including where everyone is at all times and what they are doing. In the worst case of formalized communications we would get a CSCW version of "work to the rule" in which everyone does exactly their job description and no more, and organizations stop functioning due to lack of informal structure and activity.

In general, technological and bureaucratic forms of surveillance are common in settings where there is low trust. Organizations that use these forms of surveillance also create low-trust workplaces. Conventional computerized workplace surveillance depersonalizes relationships of power and authority, reifying them in managerial technologies. It also relocates trust from the employee's personal reports of his or her own behavior to objective measurements of performance. If employees were not trusted in the first place, the machine may be preferred to the human evaluator. In the most extreme cases -- such as lie detection technology -- employees' accounts of even their intentions are no longer trusted by employers. In its most benign form, computerized surveillance in such workplaces replaces the social indignities of being given orders by another person with the impersonal neutrality of the machine. In its worst form, computerized surveillance establishes relationships of intrusive oppression as part of the external reality of the workplace. It would be technically possible, I imagine, to combine visual surveillance of workplace bathrooms with automated drug testing technology. But for such an Orwellian scenario to be implemented, we would have to be living in a world where social trust had broken down so far that we would find children carrying guns to school to protect themselves. In such a world, the prospects for any sort of cooperative work, computerized or otherwise, seem dim. The challenge for CSCW is to provide computer surveillance systems which support rather than reify social interaction.

CONCLUSIONS: DESIGNING CSCW FOR NEGOTIATED PRIVACY

The ideal of computer support for non-distorted communication among equals in cooperative workplaces has had a powerful appeal to the community of CSCW designers. From the theoretical perspective that privacy consists of negotiated access restrictions, one of the most important human factors in CSCW design is the impact that the system has on users' abilities to negotiate through unreified, non-distorted communications. Negotiating privacy in CSCW is essentially a small group political process involving critical issues of how organizational power and autonomy are distributed. This cannot be determined by software designers, although they can certainly be concerned with supporting the process and with avoiding the potential for abuses.

Supporting Interactional Cues

Designs to support individual privacy negotiations must implement some version of the interactional cues with which individuals make inquiries about one another's accessibility. Bellcore's Cruiser system creates a virtual hallway through which people can stroll and observe who is in their offices. "Privacy blinds" appear as bars across office images, allowing people to mutually observe others' presence while indicating a desire not to be disturbed. There may be other, less expensive ways to implement negotiated intrusions, perhaps by developing simple status indicators of prioritized requests for attention and for being left alone. Other CSCW features, like active badges, can facilitate negotiated privacy if they have "off" switches.
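
To make the idea of status indicators concrete, the following sketch (written in Python, with class names and states invented for this illustration rather than taken from Cruiser, active badges, or any existing CSCW product) shows one way a system might let a user publish a negotiable availability state, including an "off" state, and let that state determine which prioritized requests for attention get through and which are quietly queued.

    from dataclasses import dataclass
    from enum import Enum


    class Availability(Enum):
        """Hypothetical availability states a user can publish to co-workers."""
        OPEN = 1             # "door open": interruptions welcome
        BUSY = 2             # "privacy blind": presence visible, prefer not to be disturbed
        DO_NOT_DISTURB = 3   # urgent requests only
        OFF = 4              # badge/location reporting switched off entirely


    @dataclass
    class AttentionRequest:
        """A prioritized request for someone's attention."""
        sender: str
        message: str
        priority: int  # 1 = routine, 2 = important, 3 = urgent


    class PresenceIndicator:
        """Sketch of a status indicator that leaves the decision about which
        requests get through in the hands of the person being interrupted."""

        def __init__(self, user: str):
            self.user = user
            self.state = Availability.OPEN
            self.pending: list[AttentionRequest] = []

        def set_state(self, state: Availability) -> None:
            """The user renegotiates his or her accessibility by changing state."""
            self.state = state

        def request_attention(self, req: AttentionRequest) -> str:
            """Return what the requester is told; queue anything that does not get through."""
            if self.state is Availability.OFF:
                return "status unavailable"        # nothing at all is reported
            if self.state is Availability.OPEN:
                return "interrupt now"
            if self.state is Availability.BUSY and req.priority >= 2:
                return "interrupt now"
            if self.state is Availability.DO_NOT_DISTURB and req.priority >= 3:
                return "interrupt now"
            self.pending.append(req)               # held until the user chooses to look
            return "request queued"

The point of the sketch is that the receiver, not the requester or the system, decides what each state means in practice; a colleague whose routine request is queued learns only that it was noted, which leaves room for the tacit negotiation described above.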

Privacy for Groups

Group negotiated privacy can be supported by such features as group workspaces accessible only to group members. Flexibility in the design of these backstage areas will give user groups the opportunity to experiment in negotiating their own structure and expectations. Automated meeting schedulers could be made less than automatic, allowing users and user groups to indicate several different states of "being available" depending on the nature of the meeting being scheduled.
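
A minimal sketch of what such a members-only backstage area and a graded notion of "being available" might look like is given below (again in Python, with hypothetical names; it illustrates the design idea rather than describing any existing groupware).

    from enum import Enum


    class MeetingAvailability(Enum):
        """Hypothetical graded responses a scheduler could report,
        instead of a binary free/busy flag imposed automatically."""
        AVAILABLE = "available"
        AVAILABLE_IF_IMPORTANT = "available if important"
        GROUP_TIME_ONLY = "available to my workgroup only"
        UNAVAILABLE = "unavailable"


    class GroupWorkspace:
        """Sketch of a 'backstage' area readable and writable only by members."""

        def __init__(self, name: str, members: set[str]):
            self.name = name
            self.members = set(members)
            self._notes: list[tuple[str, str]] = []   # (author, text), visible only to members

        def post(self, author: str, text: str) -> None:
            if author not in self.members:
                raise PermissionError(f"{author} is not a member of {self.name}")
            self._notes.append((author, text))

        def read(self, reader: str) -> list[tuple[str, str]]:
            # Outsiders, including organizational authorities, see nothing here,
            # preserving the group's negotiated backstage.
            if reader not in self.members:
                raise PermissionError(f"{reader} is not a member of {self.name}")
            return list(self._notes)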

Avoiding Reified Authority

Reification of authority relationships in electronic surveillance could be avoided if users were notified whenever their e-mail files were read or backed up. While this would not prevent surveillance, it would remind all parties involved that the surveillance is being done by people, not by technologies. If e-mail privacy is part of an organization's implied social contract, notification of intrusions could at least serve as a basis for discussion of organizational behavior.
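
As an illustration of the notification idea, the following sketch (Python, with an invented mailbox class and notification callback, not the interface of any actual mail system) shows a mail store that informs its owner whenever someone else reads or backs up the mail file.

    import datetime


    class NotifyingMailbox:
        """Sketch of access notification: every third-party read or backup of a
        user's mail file triggers a message to that user naming who did it."""

        def __init__(self, owner: str, notify):
            self.owner = owner
            self.messages: list[str] = []
            self._notify = notify   # callback, e.g. one that sends the owner an e-mail

        def read(self, accessor: str) -> list[str]:
            if accessor != self.owner:
                stamp = datetime.datetime.now().isoformat(timespec="seconds")
                self._notify(self.owner, f"{accessor} read your mail file at {stamp}")
            return list(self.messages)

        def backup(self, accessor: str) -> list[str]:
            stamp = datetime.datetime.now().isoformat(timespec="seconds")
            self._notify(self.owner, f"{accessor} backed up your mail file at {stamp}")
            return list(self.messages)

Notification of this sort does not prevent surveillance, but it keeps the human agents of surveillance visible instead of letting the technology absorb them.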

Supporting Non-Distorted Communication

If we take Habermas' requirements for non-distorted communication seriously as a basis for negotiated privacy, CSCW systems should be extremely user friendly, approaching the ease of face-to-face discussion, to facilitate user competence. CSCW designers should remain aware that the nature of the world represented by computer software and data bases is a socially negotiated one. Insofar as possible, CSCW should not preempt definitions of the subject matter and conduct of work by restricting group activity to pre-existing data bases and work procedures. Intentions to control cooperative work groups should be implemented explicitly, preferably through discussions with both employees and managers of organizations that will be using the systems. Finally, attention should be paid to the variety of implied social contracts for appropriate behavior in the workplace, so that CSCW communication systems allow group norms to emerge out of group process. This is quite a different problem from trying to develop formal protocols for small group interactions. Even if we could create a standardized group process by capturing and formalizing group behavior, in doing so we would destroy its capacity to generate new social forms out of participant negotiations. There is no technological fix for negotiated privacy.

Conclusion

Designers and implementers of CSCW systems should keep in mind that privacy is not merely absence of information or isolation from interacting with others. Privacy in small groups is the result of a dynamic process in which people negotiate mutual intrusions and avoidance while remaining connected to one another in a context of power and status relationships. If privacy issues are viewed only as relationships between individuals and formal authorities, the effectiveness and quality of life of working groups will be undermined.

BIOGRAPHICAL NOTE

Judith A. Perrolle is an Associate Professor of Sociology at Northeastern University in Boston, where she teaches students in the College of Computer Science and in the Law, Policy, and Society Ph.D. Program. Her research is on the social impacts of computers and communications technologies, environmental policy, and the sociology of risk.

NOTES

1. See Baecker (1993), Bowers and Benford (1991), Conference on Computer-Supported Cooperative Work (1988, 1990, 1992), Easterbrook (1992), European Conference on Computer-Supported Cooperative Work (1991), Greenbaum and Kyng (1991), Greenberg (1991), Greif (1988), Kensing and Winograd (1991), Marca and Bock (1992), Power (1993), and Sharples (1993).

2. See Computer Professionals for Social Responsibility (1991), Deutsch (1986), Hoffman (1980), Law Reform Commission of Canada (1986), Marx and Sherizan (1986), Mendes (1985), Shepard and Duston (1987), and U.S. Congress, Office of Technology Assessment (1987). For discussion of pending legislation see U.S. Senate, Committee on Labor and Human Resources (1993). As of April 1994, U.S. Senate bill S-984 (which restricts workplace surveillance to performance evaluation only) was still in committee.

3. For a review of the legal status of privacy see Seipp (1978), Smith (1981), and Westin (1967). See also National Commission (1976) for a history of the legal status of electronic surveillance (mostly in criminal cases).

4. For an introduction to corporate culture, see Deal and Kennedy (1982), Ouchi (1981), Schein (1985), and Smircich (1983).

5. There is a large sociological literature on small groups, including Homans' (1950) and Olmstead and Hare's (1978) theoretical approaches. Good reviews of the classical literature include Cartwright and Zander (1968), Crosbie (1975), and Hare (1962). Studies of programs to train employees for cooperative work groups can be found in the education literature; see, for example, Dahlstrom (1994) and Slavin (1983).

REFERENCES

This is a conference paper version of an article copyrighted by the University of Wisconsin Press
Judith A. Perrolle e-mail: perrolle@neu.edu
Department of Sociology and Anthropology
Northeastern University, Boston, MA 02115