Computers and Social Change
Part Four. Information, Property and Power in Democratic Institutions


What right do you have to be reading this book? To whom does the information in it belong? What right does anyone else have to keep track of what you read, say, or do? Should people who know the most be able to make decisions for the rest of us? These questions are about information and power. To individuals, they appear to be questions about their personal right to acquire information while protecting their privacy. But they are also questions about power in democratic institutions. There is a tension in democratic societies between individual freedom and social control. In our society, individuals have the right to privacy and to own property; the society has the right to limit individual freedoms in the public interest.

The social concepts of privacy and freedom of information are often in direct conflict with the concept of information as property. The law, which defines and guarantees our rights in a democratic society, is being transformed to recognize and protect information as a form of commodity property. New laws regulating the ownership, taxation, and liabilities of information products are being proposed, and old ones are being reinterpreted. As lawyers struggle with cases involving computer software and data bases, computer law is emerging as a new specialization within the legal profession.

In law enforcement, the protection of information is of growing concern. Computer-aided crime has emerged as a new form of crime against property. Problems of prevention, detection, and prosecution are challenging computer professionals, auditors, and law enforcement agencies. Such problems also challenge the widespread beliefs that white collar crime is not very serious and that computer-aided breaking and entering is a clever prank.

Privacy implies the right to be left alone; mechanisms of social control enforce social obligations to the common good. The belief that liberty and social order ought to be balanced is a basic value of democratic institutions. The Computer Revolution is upsetting the existing balance by shifting information from the public to the corporate domain and by increasing the information available to agencies of social control.


If you bought this book or support a library through tuition or taxes, you used your economic power to acquire an information product. As an author, I receive about a dollar for each copy sold and have an economic interest in selling as many as possible. As a teacher, however, I am interested in making information freely available. Thus I loan my printed copies and make a computerized text file available. My students and colleagues apply normative power by expecting me to give them information in my role as a college professor. The cultural value that production should return a profit to the producer is not the dominant value of nonprofit academic institutions. My personal conflict of interest is reflected in a legal tension between the rights to own and to have access to information.

10.1.1 Information and Change in Legal Institutions

Information has always belonged to people in the sense that those who knew something have used, taught, or traded their knowledge for goods and services. Much of our cultural information, however, is collective property belonging to the whole society and available to anyone who is able to learn it. According to Durkheim, all property rights developed out of religious rituals expressing group solidarity (1957:168). From this perspective, information in the form of communications and shared symbols can be viewed as part of the remaining sacred quality of human group life. To Marx, property rights grow out of a division of labor in which surplus products are unequally exchanged. From both of these perspectives, cultural information that is produced and shared by everyone in society is not private property.

The argument that computers accelerate the trend towards information as property focuses on one particular form of property -- the commodity. As discussed in Chapter 3, commodities are different from personal or group property in that they are made in order to be sold for a profit rather than for other forms of use or exchange. They are usually the property of the corporations which produce them rather than the personal property of the employees who actually do the work of making them.

The Legal Status of Information. In order for information products to be sold, law and social custom must recognize them as marketable commodities. A major difficulty with obtaining and enforcing property rights for information products is the traditional social distinction between material objects (which can be owned) and abstract information (which is an intangible personal possession or a part of shared culture). Although several areas of law define some kinds of information as property, people often act according to the belief that reading, using, or copying information is an individual right. While most people would not break into others' houses or steal their cars, many see nothing wrong with reading their magazines or gossiping about their secrets. Unless social change makes people share a cultural belief in information as property, violations of copyright and other information protection laws are likely to continue.

One area of law in which existing statutes seem inadequate is that in which the property involved is information. Electromagnetic impulses transmitting stolen information have been judged not to be property under the Interstate Transportation of Stolen Property Statute (Ribicoff, 1978:131). Yet the electromagnetic coding on a computer tape (and not simply the tape itself) has been found to be property for the purposes of state taxation (Vt. software tapes..., 1983). In the case of tax benefits for software exporters, the requirement that products be "tangible" may cause the U.S. Internal Revenue Service to deny benefits to software manufacturers (Benoit, 1984). The Library of Congress Copyright Office's 1975 regulations, which required that machine-readable copies of computer programs be accompanied by a reproduction or description that can "be perceived visually or read by humans", made the copyright status of software unclear until later legislative amendments.

The question of who owns the information produced by artificial intelligence software has yet to be addressed by the law. Legal institutions respond rather slowly to rapidly changing technologies:

The new technology is asking questions that the 1976 Copyright Act cannot answer. Eventually, either the courts will answer the questions that the act does not, or Congress must produce new forms of protection that are appropriate to new technologies (Gemignani, 1985:52).

Change in Legal Institutions. The legal system is one of the most conservative of social institutions. In his essay on law and social custom, Thorstein Veblen observed that change in legal institutions occurs slowly, especially when it runs counter to established social principles or vested economic interests (1969:34). Although the vested interests of information producers lie in changing the law in the direction of greater protection for information products, the principles of constitutional law in many ways define information as belonging to individuals or to society. In addition to new legal concepts of property, the social institutions of the legal profession and the administration of justice must adapt to the pressures of the Computer Revolution. Existing law must be applied to cases involving computers, and new legal concepts must be developed where traditional ones prove inadequate.

Computers and the Practice of Law. Law has traditionally been practiced by lawyers who have access to a complex body of information which is relatively inaccessible to individuals without legal training. Although modern law is, according to Max Weber (1978:895), "a rational technical apparatus, which is continually transformable in the light of expediential considerations and devoid of all sacred content", the application of computers to rationalizing the practice of law is likely to be vigorously opposed by lawyers and their professional organizations if that rationalization substitutes information systems management for the activities of lawyers.

In 1958 legal expert Lucien Mehl proposed that:

a machine for processing information can be an effective aid in searching for sources of legal information, in preparing the decision of the administrator or judge, and finally in checking the coherence of the solutions arrived at (1959:757).
But he insisted that, although judicial machines would be suited to conduct legal argument, they could never replace human legal experts because they were incapable of formulating precepts.

While expert systems developers would claim that computers do have the technical capabilities to "replace" many of the functions of lawyers, the trends in computer usage indicate that they are being adopted in ways which facilitate the existing arrangements of legal practice. Although computer programs could be developed to render rational judgments for some sorts of cases, the human quality remains an almost sacred element in the administration of justice; we are thus unlikely to experience computerized judges. Most legal experts would agree with Joseph Weizenbaum (1976) that any conceivable intelligence on the part of a computer would lack the element of human wisdom. Even the use of computers as "informants" or providers of "expert" information is controversial (Marx and Richman, 1984; Jenkins, 1979).

What we can expect is an acceleration of the use of computers to process court cases (now terribly backlogged in most jurisdictions) and to provide legal research services to attorneys. The Lexis and Westlaw systems are examples of specialized database services for legal research (Bander and Sweetgall, 1983). Their use may lead to a concentration of power in larger law firms, which are able to afford legal data base services. An alternative is to make these services available inexpensively to individual lawyers and small firms at government law libraries. Also, we may expect computer law to become a professional specialization. By 1985 there were over 1000 lawyers belonging to national and regional computer law organizations (Connolly, 1985).

Information Product Liability. A new legal concept of information as property seems likely to create new conceptions of product liability law. These are the laws that hold companies responsible for product safety and performance "as advertised". As the law defines information property in order to protect software and databases, new legal liability problems arise. Under the Uniform Commercial Code adopted by most states, people injured by a product may sue the original manufacturer. If computer software is considered "goods" under the law, software suppliers could be held liable for damages caused by program errors. Computer professionals have been held liable for malpractice claims similar to those made against doctors and lawyers. Those whose software does not work as claimed can be found guilty of fraud (Mislow, 1984; Beeler, 1985).

Older industries use liability insurance to cover the costs of damages their products might do. Liability insurance is now being sought to protect hardware and software manufacturers (Cottrell and Weiss, 1984). Although some computer vendors have tried to avoid liability problems by using contracts in which the purchaser agrees not to hold the supplier at fault, courts generally do not recognize these agreements (Miller, 1984; Kutten, 1985). What this means if you are a software customer is that the company is responsible for undisclosed defects no matter what you signed to the contrary. Warranties (money-back guarantees) have been recommended as a way to satisfy customers and cut down the risk of lawsuits (Stewart, 1985).

Liability law has been connected to issues of public access to information in the case of computer bulletin boards. Charging that bulletin board operators are responsible for allowing users to exchange pirated programs, some software manufacturers have sought damages. An additional risk of legal action against data base owners comes from libel law. A large financial services corporation was found guilty of libel for incorrectly reporting a firm as being bankrupt (Rifkin, 1985).

10.1.2 The Changing Status of White Collar Crime

Law is divided into criminal and civil law. Historically the distinction arose according to whether the goal was to punish offenders or to compensate those offended against. Offenses against property can be civil wrongs (called torts), as in the case where you damage someone's automobile and must pay them compensation. Theft of an auto is an example of a crime that results in punishment rather than compensation. High-status people are more likely to appear in civil court, facing fines, while lower class individuals more frequently wind up in criminal court, facing prison sentences.

The term white collar crime was first used by Edwin Sutherland (1949) to describe crimes committed in the course of a person's professional occupation. Most computer crimes fit this description -- they are cases of theft or fraud committed during work as a business executive or employee with access to financial data. All sorts of white collar crime tend to be dealt with in civil rather than in criminal court; even in criminal court the status of the defendants and the non-violent nature of their offense tends to produce lenient treatment. What underlies the leniency is our apparent failure to consider white collar crime a serious criminal offense. Fraud and the theft of money or property are illegal. Yet someone who robs a liquor store of a small amount is more severely prosecuted than a bank employee who amasses a fortune by stealing a fraction of a penny from every customer's interest payment.

Computerized White-Collar Crime. The growing reports of computer-aided crime are alarming to business and financial institutions because they are beginning to realize how far auditing and financial security procedures have lagged behind data processing innovations. The "paper trail" used by auditors to track down many white collar crimes has turned into an "electronic trail" which is difficult for all but the most computer-sophisticated accountants to trace. And that is assuming that the electronic trail has not been completely erased from the computer's memory. Although the loss of money due to computer fraud is impossible to measure in the absence of adequate statistics, it is estimated to be our fastest growing criminal problem (Parker, 1983). Only 5.1% of the Justice Department's 1977 and 1978 resources were devoted to white collar crime of any sort, and there is no statistical equivalent of the FBI's Uniform Crime Reports to cover it (Simon and Eitzen, 1982:21).
The FBI is now planning a computerized file on suspected white-collar criminals, but civil liberties groups are protesting on the grounds that information on suspects and their "associates" often results in the computerization of vague suspicion and unsubstantiated charges and is likely to lead to abuse by legal authorities (Burnham, 1984).

Federal legislation enacted in 1984 makes it illegal to gain unauthorized access to government data or to financial records (such as those of banks) covered by federal laws, though its coverage falls far short of what supporters had sought (Betts, 1984). Senator Abraham Ribicoff had introduced the proposed Federal Computer Systems Act of 1977 to make it a federal crime punishable by up to 15 years in jail or a $50,000 fine to "access a computer for the purpose of perpetuating fraud or obtaining money, property, or service under false or fraudulent purposes" (1978:135). Many states have now enacted similar laws governing computer crime (Bloombecker, 1984). The problem with prosecuting computerized criminals, however, has not been due entirely to the lack of laws against their activities. Instead, it is part of the more basic problem of how white collar crime is treated by society.

Contrary to the popular image of the computer criminal as an electronic wizard matching wits with sophisticated electronic equipment, most of the crime so far detected has been conducted by people with a fairly limited knowledge of computers, using the data-processing systems of their own workplace (Whiteside, 1978). In a survey of computer-related crimes in government, over 60% were found to be cases of entering fraudulent records into data bases (Whiteside, 1978:143). More recent findings of the U.S. Department of Health and Human Services (1985) and the Data Processing Management Association (1985) reaffirm the portrait of the computer-aided criminal as an "insider". Technically, the problem cannot be solved by sophisticated software designed to keep out unauthorized users or to screen incoming data for errors. Better procedures for verifying the data which are entered into the system must be developed. And these procedures must be developed as patterns for the behavior of individuals within social organizations rather than as computer software. As David Dery says in his study of computers in welfare (1981:10), "the chief impediment ... is not technical capabilities, but organization".

The distinction between criminal and civil law may become less significant in the Information Age. As Steve Blum-West and Timothy J. Carter (1983:549) argue, the handling of federal regulatory law violations (which most white-collar crimes are) as civil or criminal is discretionary and cannot be determined from legal concepts. What this means is that the distinction between "white collar" and other crimes against property is a distinction based on the social status of the offenders rather than on formal legal principles. As computerized white-collar crime grows more serious, we may find crimes against information products being treated like electronic breaking and entering, and handled in criminal rather than civil courts.

Corporate Crime. In a newer theory of white collar crime, Simon and Eitzen (1982) define much of it as elite deviance. Elite deviance is illegal, unethical, or immoral acts committed by the members of the highest strata of society for purposes of personal or organizational gain. Simon and Eitzen point out that elite deviants run little risk of detection or prosecution, although they often create great dangers to the wider society. Examples of this sort of deviance by corporations are illegal disposal of toxic waste products or large-scale fraudulent financial manipulations like the First National Bank of Boston "money laundering" operation (Business Week, March 11, 1985:37). A U.S. Government General Accounting Office study of the Defense Department found fraudulent contractors to be the major source of waste (Common Cause, 1983:39).

Another example of corporate deviance is welfare fraud. The use of record matching by police and federal agencies has been widely publicized as a way to detect cheating among the relatively powerless welfare recipients ("Are electronic foodstamps on the way?", 1984). However, the majority of financial losses in the welfare system are to socially "respectable" organizations. Investigations of the Medicaid program identified nursing homes, pharmacies, and hospitals as some of the major culprits (Common Cause, 1983:47). In the Information Age, those individuals and corporations with the most power over information resources are in a position to conduct financial manipulations, fraudulent contracts, deceptive advertising, and other illegal business and political activity. Although some large corporations have been fined for defrauding the Pentagon or illegal banking operations, little of this large-scale organizational crime is prosecuted. As in the old English poem, computer crime legislation is designed to punish those who steal the privately owned information "goose", not the publicly owned information "common".

10.1.3 Protecting Information Products

Despite technical innovations to protect information from employees, from one another, and from outsiders, companies are facing an increase in white collar crime. The failure of technological solutions to the information law and order problem leads companies to seek legal protection. The U.S. laws protecting information products are state trade secrets laws, federal patent law, and federal copyright law. Abstract ideas are protected only if they are trade secrets; ideas for the design of devices and industrial processes can be protected by patent if they meet a set of stringent criteria; the expression of an abstract idea in human-readable form can be protected by copyright. In the absence of other clearly defined laws to protect information products, leases, purchase agreements, non-disclosure agreements, and other forms of agreement among vendors, customers, and employees have become a mainstay of software and hardware protection (McEnaney, 1984; Roberts, et al., 1985). Like other forms of legal guarantee, contracts are based on shared social beliefs in their legitimacy.

The Failure of Technological Protections. Although most computer crime is committed by authorized users, existing security arrangements have not usually been very effective at keeping out the minority of unauthorized users. The reasons for this are not those popularly assumed. Most publicized "piracy" or "computer breaking-and-entering" is performed by people with only moderate technical skill. The reason they are successful is that most data are very poorly protected. For example, a software pirate I've met learned to copy all sorts of personal computer diskettes but did not know how to program. He had mastered the techniques without understanding what he was doing; he acquired software he didn't know how to use.

In making the transition from conventional record keeping to computerized data, most organizations relied on the social patterns that keep most strangers from walking into an office and browsing through filing cabinets. They also depend on shared social values that define theft and fraud as "wrong". With computers, however, ordinary forms of social control that depend upon personal surveillance of offices do not apply. In the case of information products, social respect for property rights is not universal.

As owners of data bases and other information products become more concerned over the vulnerability of their property, more elaborate techniques are being adopted to provide security. Some are as simple as realizing that "erased" diskettes and tapes still contain company data (Raimondi, 1985). Other protection schemes are designed to recognize authorized users. Identification systems include long passwords (which are difficult to determine by experimentation), electronic keys that restrict access to buildings or equipment like Xerox machines, data encoding schemes based on new developments in mathematical cryptography, and even voice or fingerprint recognition (Albert and Morse, 1984; McEnaney, 1984; Wu and Hwang, 1984).

An interesting variety of measures have been developed to make software copying difficult. In the personal computer software industry, techniques have tended to focus on diskette protections. Variations in recording techniques and software that checks its disk to ensure that it has not been copied to a new one are in use. Read-only memory firmware (programs built into the hardware) is also relatively difficult to duplicate. Also, many companies will not sell documentation except to customers who purchase their software. Although these protective measures may discourage the novice, the technically skilled make short work of them. Among both professional programmers and hobbyists, software protection schemes are often considered a form of challenging computer game. This attitude has been reinforced by popular images of home computer pirates as young people engaged in essentially harmless play. It has been estimated that only 10% of the home computer software in current use has been legally sold.

The same experts who design computer security measures are able to bypass them. What high school computer science students do as a game, high-tech criminals can do for profit (Milwaukee Discovers 'WarGamesmanship', 1983). The most striking example of technical expertise in unauthorized access is the record of the National Security Agency's Tiger Teams at ZARF. ZARF was a government project to investigate computer security arrangements. Operatives claim to have broken into every computer system ever marketed. They also seem to be involved in setting standards for private industry "carefully designed to be just secure enough so that corporate spies outside the government could not break a user's code and just vulnerable enough so that the National Security Agency could break it" (Computer Encryption and the National Security Agency Connection, 1977). In 1985, the NSA offered to provide banks and companies with secret protection codes -- so secret that only NSA would know how they worked! (Science, October 4:45-46).

As organizations use them to reduce the vulnerability of their computerized records and proprietary information, we may expect a greater share of the assaults on information property originating outside of companies to be conducted by technical experts. Also, as outsiders are discouraged, an even larger share of computer crime will be perpetrated by authorized users.

Trade Secrets. These are secret ideas, processes, designs, or other proprietary information. Once communicated to the public (by accident or otherwise) they lose their protected status. Companies protect trade secrets with employee and customer contracts prohibiting disclosure of company information. Violators of these contracts may be prosecuted, but anyone else is free to use the information revealed. The advantages of trade secrets for companies are that the information does not have to be filed with the patent or copyright office and that there is no expiration date for the protection.

An example of a trade secret is Kentucky Fried Chicken's recipe of eleven herbs and spices. It can be sold, as it was when Colonel Sanders retired from the fast food business, and it can be widely imitated without penalty, so long as the secret was not stolen from the original inventor. Computer programs and hardware often include trade secrets in the form of design features not protected by copyright or patent. These are difficult to conceal from skilled programmers and other technical people. A better legal protection is the use of non-disclosure agreements which users must sign. These contracts are designed to protect the proprietary information being provided by the company to its customers. In practice, however, design features are commonly copied among companies and by customers.

Trade secrets are a modern version of the way medieval guilds and ancient craftsmen protected their knowledge from competitors. As in the cases of ancient Chinese pottery glazes or Egyptian embalming techniques, trade secrets are vulnerable to being forgotten. In the early years of programming, many skilled software developers avoided documenting their programs. In a personal version of trade secrets, programmers protected their work (and occasionally their jobs) by making their programs unintelligible to others. Although the rationalization of the software industry has led to better documentation of software products, many companies try to prevent copying by making it difficult to obtain documentation for their products. Others supply customers with binary object code (data in the form of ones and zeros) rather than source code (a higher-level language version). This protects products by making them difficult to understand. Some individual programmers still try to protect the secrets of their trade by producing programs that are undocumented and whose logic is difficult to follow. Understandably, this practice is discouraged by technical management. It can also be a problem for the programmer who finds it difficult to debug or modify his or her own work.

Patents. As provided in the U.S. Constitution, patents are granted to the inventors of innovative processes and machines. Mathematical formulas or "obvious" processes cannot be patented. A patent prohibits others from using the design for 17 years, but it is expensive (around $5000) and time-consuming to obtain, and it requires a full disclosure of the process being patented up to several years in advance of the patent's being granted. Until 1981, software was not generally considered patentable; recently a few software patents have been granted where the software was an integral part of a patentable process. The reason that more software is not patented is that algorithms are considered mathematical formulas, and are thus not eligible for patents. Patents protect ideas only when they are implemented in some concrete device. They are not given for abstract intellectual concepts or mental processes.

Legislation introduced by Senator Dole (Congressional Record 131, No. 1, part II, S186, 3 Jan 1985) would assign patent rights for technology developed with public funds to corporations. Critics charge that this would be a change from the original intent of patent laws -- the reward of individual creativity (Frenzen, 1985). Already many high-tech employees sign contracts giving up claims to patents for anything they might invent on company time.

Internationally, patent law has been characterized as inadequate for the microelectronics industry (Braun and Macdonald, 1982:131-132). In a recent case, the Japanese Matsushita Corporation has been sued by Stanford Ovshinsky over the rights to the erasable optical disk (Fortune, June 13, 1983). From the perspective of U.S. firms, international patent law needs to be strengthened to protect their property. From the perspective of developing countries, however, the ownership of patents by the industrial countries and multinational corporations means that their technology remains under foreign control. In a 1964 United Nations study, 89% of the patents being used in five developing nations were foreign owned. In Chile in 1967, 95% of patents in use were foreign (Barnett and Muller, 1974:140). Some developing countries, such as China, have adopted laws protecting U.S. and other international patents. Their intent is to encourage foreign investment and marketing (Braun, 1985).

Some of the more interesting future patent cases may occur in the field of artificial intelligence. Already patents have been issued for manufactured bacteria, thus eroding the social distinction between manufactured devices and natural organisms. It will be interesting to see if the heuristic programs of artificial intelligence (which are not, mathematically speaking, algorithms) will be considered abstract intellectual processes or ideas embodied in an innovative mechanism. At present, however, patents are mainly used to protect computer hardware. Under the Semiconductor Chip Protection Act of 1984, computer chip design masks receive a 10 year protection under the copyright laws (Wartel, 1985).

Copyrights. Copyrights do not protect ideas; they protect the original expression of an idea "fixed in any tangible medium". In the case of computer programs, copyrights are registered by submitting a hardcopy listing (or a tape or disk version which can be made perceivable to the human eye) to the Library of Congress' Copyright Office. The Copyright Office has not decided what to do about commercial data bases that are constantly updated (Betts, 1985f). Under the pre-1976 copyright law, protection would be lost if copyrights were not registered or if a permanent copyright notice was not affixed to the work in the proper place. Although under the new law protection begins at the time of authorship (in theory), lawyers advise that it is still important to make copyright notices part of software and documentation. Copyright protection lasts the lifetime of the author plus 50 years.

Until the 1980 Amendment to the 1976 Copyright Act in the U.S., it was not clear whether binary code was copyrightable, since it had been argued that it was not perceivable to humans. Now copyright protection clearly applies to object code. The 1983 case of Apple v. Franklin made copyrights applicable to ROM chips and operating system software. The legality of modified copies of software has not yet been clearly established by the courts. The U.S. International Trade Commission, for example, has been struggling to determine when a foreign copy of an Apple computer chip was a copy and when it was a modified new creation (Wallace, 1984). Some attempts have been made to develop techniques for comparing the designs of original and pirated software that has been rewritten to disguise the theft (Glass, 1985).

At issue here is more than the question of how much "software pirates" would have to rewrite code before they were safe from prosecution. Software compatibility within the industry and the rights of customers to modify software to meet their particular needs are also involved. If interface and other compatibility software is found copyrightable, customers will be inconvenienced and the setting of industry standards will be more difficult. If modifications to copyrighted software are made completely illegal, it will be difficult for customers to adapt software specifically for their own installations. If a modification such as supporting a non-standard peripheral device were not legal, smaller equipment manufacturers might suffer.

The disadvantage of copyright protection is that it does not cover the algorithm in a program, nor does it prevent unauthorized use. It only makes copying illegal. Finally, copyrights have not been very successfully enforced, although several companies (the makers of Lotus 1-2-3 and Apple, for example) are vigorously pursuing offenders. Although Lotus has received a substantial out-of-court settlement from one company it sued for making multiple copies of its software (McGeever, 1984), the first jail sentence for a software pirate did not occur until 1984 (Bartolik, 1984). Piracy remains a major problem in the industry. According to the Canadian Dealers Association, 90% of the software programs in Canada are illegal copies (Gunter, 1984).

One problem with enforcing copyright laws is based on the special status of electronic information. In many computer media, one must copy the information in order to read it. Since reading information is widely considered to be an individual's right, copyrights on electronically based information seem to many to be an infringement of the right to use information. Within the copyright law there is a "fair use" provision which allows single copies of protected works to be made (under a variety of restrictions) for educational use. Usage is not restricted by copyright law, only the making of copies. However, in the case of many of the new information products, using and copying overlap. This is particularly a problem for networked computer systems. For some network software, each user copies the program (in a technical sense) at every use. It remains to be seen how the courts will resolve this issue. If the case of home audio and video tape recorders is any indicator, we may find the industry abandoning its attempts to prevent copying for personal use and concentrating instead upon preventing copying for resale or by business customers seeking to avoid buying multiple copies.


Because computers facilitate the collection and matching of information about individuals, organizations and governments, they add a new dimension to the many-sided debate over what sorts of information should be the personal property of individuals and what sorts should be considered business products or shared cultural goods. Computers have enhanced the ability of corporations and law enforcement agencies to monitor the activities of employees, consumers, and citizens. Abuse of such information has created a public debate over the computer's threat to privacy. For example, as more sophisticated mailing lists are made and sold, the volume of junk mail and telephone solicitations increases. Many people fear that their privacy will be invaded by government and credit bureau investigations, with possible abuses of their civil liberties. Businesses are concerned about the privacy of their records and seek protection from corporate espionage and what they view as excessive government interference in their activities. Government concerns include the gathering of enough information to monitor business and individual compliance with tax and other laws. Internationally, governments are also concerned with national security and economic competition. Neither absolute privacy nor totalitarian control is a goal sought by participants in the contemporary debate over computers and privacy. Instead, they have serious disagreements about what balance should exist between individual freedom and cultural demands.

10.2.1 Privacy and Social Control

The social historian Barrington Moore defines privacy in two senses. One, a desire for "protection or escape from other human beings, emerges when an individual becomes subject to social obligations that that individual cannot or does not want to meet" (Moore, 1984:268). This concept is found in pre-industrial societies and represents a way for individuals to avoid temporarily and in more or less socially accepted ways the demands of their communities. By our own standards, the opportunities for privacy were often limited -- all bodily functions, for example, might normally be performed in front of other people. "Being sick" is an example of privacy in this sense. If we are ill, we can stay home and avoid social interaction. However, the urgings of our friends to "get well", the invasion of our body by medical personnel, and organizations' insistence on "a note from your doctor" are social control mechanisms that limit our freedom to be privately sick in the interests of getting us back to our normal social roles (Parsons, 1951: chapters 7 and 10).

When our right to be sick whenever we choose is interfered with by hospitals and officials, we experience invasion of privacy in Moore's second sense. This form of privacy involves the individual's rights against external authorities. In modern societies people seek privacy from organizations and institutions as well as from one another. The privacy debate over computer use for government and corporate record keeping is about this second sense of privacy -- the right of individuals to be left alone by their government and by the economic organizations of their society. This right of privacy is often in direct conflict with the norms governing our obligations to work and obey society's laws and regulations.

Since no society can exist without patterns of expected behavior, and since information about individuals is essential to the functioning of modern social institutions, the privacy debate is not about whether information should be collected. It is about what sorts of data are to be gathered, what they are to be used for, and by whom. The privacy debate is ultimately about information as a source of power for people, corporations, and governments over one another's activities.

10.2.2 The Legal Concept of Privacy

The growth of our legal concept of privacy was based on the idea that the individual, rather than the group, is the basic unit of society. In 1890, Attorneys Samuel Warren and Louis Brandeis published a landmark legal opinion in the Harvard Law Review defining the right to privacy (reprinted in Johnson and Snapper, 1984). In it they show how the common law rights to life and property are intertwined in the privacy issue. The right to life has been extended to "the right to be let alone" to enjoy life. The right to property has been extended to the right to own "intellectual property" -- including information about one's personal life.

Senator Sam Ervin, Chairman of the Senate Subcommittee on Constitutional Rights, locates the right to "receive and impart information and ideas" in the First Amendment of the U.S. Constitution. Private telephone conversations are protected by the Fourth Amendment's ban on unreasonable searches and seizures. In 1973 the Roe v. Wade decision of the Supreme Court extended Fourteenth Amendment protection to the right to give and receive information. To Ervin, "privacy is a catchword for the control that the individual exercises over information about himself" (1983:160).

Interpretations of the Constitution that define corporations to be "persons" extend the privacy debate to the issue of how much freedom businesses have to keep information to themselves. Constitutional limits on the privacy of both individuals and corporations exist because of the government's rights to regulate commerce, conduct censuses, tax, and conduct international relations in the public interest.

10.2.3 Who Knows What about You?

In the early 1960's, one of my college roommates was sent a letter stating: "Send us your social security number, and we'll tell you about your sex life." Two decades later that joke on a computer programmer doesn't seem so funny. Despite the provisions of the Privacy Act of 1974, a 1977 report of the government's Privacy Protection Study Commission identified serious abuses of computerized data.

Government Data Collection. Federal use of computers for data collection grew rapidly in the 1960's and 1970's. By the late 1970's, there were about 4 billion records on individual Americans filed with Federal agencies (Bacon and Kelly, 1978). Treasury records (including Internal Revenue Service tax records), Health, Education, and Welfare data (including Social Security information), the Department of Commerce files (including census data) and the data bases of the Justice and Defense Departments were the largest collections. By 1985, according to a federal Office of Technology Assessment report, the government had records on 114 million Americans. Some 109 million of us are recorded in the data banks of the Defense and Justice Departments (reported in The Boston Globe, Oct. 24:8).

During the same time, business and nonprofit organizations have gathered extensive records on Americans. These are often made available to the government. Record matching of all this information has become increasingly possible with computerization, especially with the faster search times of new hardware.
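Record matching of the kind described here is, at bottom, a simple database join on a shared identifier. The sketch below is purely illustrative (the files, names, and numbers are invented, not drawn from any actual agency system), but it shows why a single common key like the social security number makes separately collected files so easy to link.

```python
# Hypothetical sketch: two independently collected files linked on a
# shared identifier -- the essence of government "record matching".
welfare_rolls = [
    {"ssn": "123-45-6789", "name": "A. Smith"},
    {"ssn": "987-65-4321", "name": "B. Jones"},
]
wage_reports = [
    {"ssn": "123-45-6789", "employer": "Acme Corp", "wages": 14000},
    {"ssn": "555-00-1111", "employer": "Widget Inc", "wages": 9000},
]

# Index one file by the identifier, then scan the other. One shared
# key is all that is needed to join otherwise unrelated records.
by_ssn = {rec["ssn"]: rec for rec in welfare_rolls}
matches = [
    (by_ssn[w["ssn"]]["name"], w["employer"], w["wages"])
    for w in wage_reports
    if w["ssn"] in by_ssn
]
print(matches)  # [('A. Smith', 'Acme Corp', 14000)]
```

Faster hardware changes only the scale of such joins, not their logic: with indexed files, matching millions of records becomes routine.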

Credit, insurance and private investigation agencies. The five largest credit bureaus maintain more than 150 million individual credit records. The Fair Credit Reporting Act of 1970 limits access to credit bureau data to those with a court order, the permission of the consumer, or a valid business, credit, employment or insurance request. Yet many abuses are reported. The way in which credit investigations are done is one source of problems. For instance, one bureau gathered data through such means as sponsoring local welcome wagon ladies, who greet newcomers and report back to credit bureaus their former residence (so that a credit file can be obtained from the previous locale) and their "worthiness" for credit. Another credit bureau, the Retail Credit Company of Atlanta (with 48 million personal records), records "any known connection with a 'peace movement' or any other organization of a subversive type." The errors, to say nothing of the ethics, involved in credit bureau investigating and reporting practices are a major concern. Also, the Supreme Court has consistently upheld federal agencies' rights to access such records ("Computers and Dossiers," 1983:230).

Banking and Other Financial Information. The Electronic Funds Transfer Act of 1978 was designed to modify the uniform commercial code regulation of paper transactions for banks in order to facilitate electronic banking. Since 1976 (U.S. v. Miller), it has been clear that bank records are the property of the banks, not their customers. Since financial transactions are among the information which many Americans would most like to keep private, the possibility of record matching by government agencies (like the I.R.S.) has been considered a threat to privacy. Although the E.F.T. Act protects bank customers from federal abuse, it does not protect them from state and local authorities (Zaki, 1983). The U.S. Department of Health and Human Services is now matching its welfare rolls against state files of interest income from banks and other financial institutions ("Federal Computer...," 1984). The U.S. Treasury Department has taken money directly from the bank accounts of Social Security recipients when it believes that they have received overpayments ("S.S. Access...," 1983).

In contrast to computerized surveillance of low income and elderly bank customers, the Securities and Exchange Commission does not even record the ownership of stock in U.S. corporations except by people who are empowered to make investments on behalf of others (U.S. Senate, 1980).

Mailing Lists. The average American household receives 2.3 pieces of junk mail each week. The mailing list business produces information by selecting individuals from available commercial or government lists whose characteristics make them likely to respond to specific advertising. Expert systems are now available to select the "best" prospects ("Tool Identifies...", 1985). To have your name removed from mailing lists, the Privacy Protection Study Commission recommends that you not write to the company that mailed you the ad. Often it has only paid for a mailing and does not actually have your name until you reply. Instead, you can be "delisted" by the Direct Mail/Marketing Association's Mail Preference Service.

Many of us have experienced an electronic version of direct mail advertising in which a computer dials our phone number and plays a recorded message. Even an unlisted phone number may not help if the computer is trying all the numbers in some exchange. In perhaps the most blatant abuse of this technique, hospital patients in an intensive care ward were called by a computer selling life insurance.

Employment, Medical and Educational Records. In 1984, the U.S. Department of Housing and Urban Development began a computer matching of low income tenants with state and federal wage data ("Federal Computer...," 1984). The privacy of health records is also threatened by computerization (Olmos, 1984). The federal government, as in the Baby Doe case (over the issue of treating serious birth defects), is making new demands on hospitals to report on the health status of individuals as part of its efforts to regulate treatment. Insurance companies and employers also try to obtain health information about prospective employees. A series of court cases has gradually eroded the privacy of patient records in the interest of institutional "need to know" (Culliton, 1985). Educational institutions have reported grades, disciplinary actions and current addresses of students and alumni to government agencies and businesses. In the area of workplace safety, however, "Right to Know" legislation in several states places the rights of employees and community residents to have information about environmental hazards above the right of companies to keep the chemical composition of their products secret.

10.2.4 Surveillance and Law Enforcement

In a review of the new technologies of surveillance, Gary T. Marx (1985) illustrates their scope using the lyrics from "Every Breath You Take", a rock song by The Police:

         every breath you take     [breath analyzer]
         every move you make       [motion detection]
         every bond you break      [polygraph]
         every step you take       [electronic anklet]
         every single day          [continuous monitoring]
         every word you say        [bugs, wiretaps, mikes]
         every night you stay...   [light amplifier]
         every vow you break...    [voice stress analysis]
         every smile you fake      [brain wave analysis]
         every claim you stake...  [computer matching]
         I'll be watching you      [video surveillance]

Although citizens in a democratic society are inclined to think "It can't happen here", many of these technologies are being applied by businesses and law enforcement agencies to identify people and monitor their behavior.

Identification. Who are you? Can you prove it? Most of us are familiar with showing a driver's license or credit card to prove our identities. Someday you may carry a computer chip on your identification, as "smart cards" replace conventional I.D.'s (McIvor, 1985). According to James Rule and his co-authors, the use of computers is reducing our opportunities to identify ourselves. Instead, our identity is frequently established by matching records in computer data bases.

When two college freshmen with the same name (whose fathers also had the same name) were identified as a single person by my college computer, they were assigned a single dormitory room but sent only one tuition bill. By graduation time they had straightened out the problem and received two diplomas. A New Jersey woman was not so fortunate with the F.B.I.'s National Crime Information Center computer. She was mistakenly matched with a Texas welfare fraud case, arrested in front of her co-workers, jailed for over a week, and is now suing the government (Babcock, 1985). This is the same computer available to your local law enforcement officials. If you are arrested for a traffic violation, you may find your identity being matched against national crime data. One approach to resolving abuses of record matching is to provide many identification codes to each person, one for each agency or company they do business with. Security could be maintained, but the records would be unmatchable (Chaum, 1985). This was actually what identification was like before widespread use of the social security number as a national ID number.
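The idea of per-agency identification codes can be sketched very simply. The following is not Chaum's actual cryptographic scheme (which is considerably more sophisticated, involving blind signatures), but a bare illustration of the principle: if each agency knows a person by a different derived code, the agencies' files share no common key on which to match records.

```python
import hashlib

def agency_id(person_secret: str, agency: str) -> str:
    # Derive a distinct, stable identifier for each agency from a
    # personal secret. Hypothetical illustration only -- not Chaum's
    # actual protocol, which uses blind signatures for stronger guarantees.
    digest = hashlib.sha256(f"{person_secret}:{agency}".encode())
    return digest.hexdigest()[:12]

secret = "alice-private-secret"   # invented personal secret
irs_id = agency_id(secret, "IRS")
dmv_id = agency_id(secret, "DMV")

# Same person, but the two agencies' records cannot be joined by code.
print(irs_id != dmv_id)  # True
```

Each agency can still recognize the same individual on repeat visits (the code is stable), yet no cross-agency join is possible without the personal secret.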

While some people hope for technical solutions to problems of mistaken identity, professionals familiar with large computer systems realize the impossibility of perfect accuracy. Even if computer identifications could be protected from error, many people find them an invasion of everyone's basic right to privacy. And even if we could have computer-based identification systems without record matching or errors, it would still be a social problem if we answered the question of who people are, not by interacting with them to find out, but by turning to the external authority of technology.

Government Surveillance. A report by the U.S. Office of Technology Assessment found that 25% of federal agencies (not counting intelligence organizations like the CIA or National Security Agency) plan to use electronic surveillance (Holden, 1985). The report concluded that the two main legal protections against surveillance (contained in the 1968 Omnibus Crime Control and Safe Streets Act and in the 1978 Foreign Intelligence Surveillance Act) had been overtaken by technological change. For example, under existing law, cordless telephones have not been protected by wiretap legislation -- they have no wires. Documenting the need for additional protection, the report argued:

Over time, the cumulative effect of widespread surveillance... could change the climate and fabric of society in fundamental ways (Office of Technology Assessment, 1985).
Just as technological forms of protection fail to protect information products in the absence of social norms respecting property, so technologies of surveillance and law enforcement cannot make those who use them "safe." If anything, they make society ultimately less safe, because they reify social relationships of morality and trust into technologies of social control.

Computers in Law Enforcement. The social institutions which maintain public order are using the capabilities of computers to enhance law enforcement processes, but they are also confronted by a rising tide of computer-assisted white-collar crime. These two opposing trends can be summarized as follows: the use of computers is making it increasingly difficult to steal $50 from a gas station but increasingly easy to steal $500,000 from a bank. This is because computerized record systems make it easier for law enforcement officials to track criminals who can be identified or who have previous records (including those of us who neglect to pay our parking tickets), but they also make it easier for those with access to computerized information to embezzle money.

Correctional institutions using computerized record systems can respond quickly to events like the Attica prison riot (Rosen, 1983). To provide physical security, a prototype robot security guard will eventually have artificial intelligence and remote sensing capability (Rogers, 1984). Such robots are planned for use as military sentries and prison guards, and perhaps even night watchmen. A new electronic monitoring technique, the Gosslink, is being tried as a way to keep track of less dangerous criminals. With the Gosslink, prisoners can remain under "house arrest" in their own homes. The computer attached to their bodies reports any violations -- including taking off the computer (Nordheimer, 1985). A similar device was attached to automobiles in Hong Kong for traffic control and road use tax administration (Parks, 1983). It was also designed to keep track of the movements of individual automobile owners as part of a Travel Record and Immigration Control Enforcement System (Pearce, 1985). The resulting public uproar led to Hong Kong's first data and privacy protection law (Westin, 1985). These and other innovations, like the Japanese and San Francisco police's automatic fingerprint identification systems, offer computerized means of more efficient law enforcement but raise the possibility of threats to civil liberties (Batt, 1984a; Serrill, 1985). Abuses have been charged in the form of false arrests due to computer errors in New Orleans (Raimondi, 1985) and illegal police access to lawyers' confidential files on their clients in San Francisco (Sullivan, 1985).

Workplace Surveillance. U.S. trucks are being tracked with the same technology that Hong Kong used on automobiles. In this case the surveillance is used to manage drivers' pickup and delivery schedules ("Tracking the Trucks," 1984). According to interviews with taxi operators, the skills and satisfactions of driving transfer rather easily to the computer/human interface (Greenberg, 1983).
In theory, better information systems could enhance truckers' skills and increase their productivity and pay. But truckers report that this monitoring system takes discretion away from them. They lose control over choice of routes and delivery times; they feel spied upon by their supervisors (Reineman, 1985). When the technologies of surveillance are applied to work, the result can be a loss of employee autonomy.

Yet, when employees have little autonomy, computerized surveillance may be viewed as an improvement. Reports one U.S. postal employee: "If the foreman's behind me constantly, I get nervous. I don't care about the machine" (Miller, 1985). Workplace surveillance by machine depersonalizes relationships of power and authority, reifying them in managerial technologies. It also relocates trust from the employee's personal reports of his or her own behavior to objective measurements of performance. If employees were not trusted in the first place, the machine may be preferred to the human evaluator. In the most extreme case -- that of lie detection technology -- employees' accounts of even their intentions are no longer trusted by employers. In its most benign form, computerized surveillance replaces the social indignities of being given orders by another person with the impersonal neutrality of the machine. In its worst form, surveillance establishes relationships of oppression as part of the external reality of the workplace.

Computerized Repression. There were no computers in George Orwell's novel of political repression, 1984. Censorship, surveillance, and a Newspeak language from which all subversive words had been removed were effective means of information control for the fictional regime. Yet, had the technology been available, "Big Brother" would surely have used it. In countries with less privacy protection than our own, computers seem to be contributing to a "Big Brother is Watching You" climate of surveillance. For example, German law enforcement agencies are considerably freer than our own to gather and use information about citizen activities (Butner, 1982). One result is that political dissidents are denied government and teaching jobs. In South Africa, computers are used to maintain the apartheid registration system limiting the physical movements of the black majority (Conrad, 1982).

Many of the world's military dictatorships are neither wealthy enough nor sufficiently experienced with electronics to monitor their populations by modern means. However, we must expect technologies of repression to be high priority items in the economic development of nations which maintain social control through force. Despite U.S. laws against exporting computers to be used in the violation of human rights, American technology is in use in South Africa and in countries, like the Philippines, that claim to be oppressing only communist dissidents (Klare and Arnson, 1981). But it is easy for those in power to call whomever they choose a communist. For example, among my Filipino acquaintances are a nun and a priest, both anti-communists, who were arrested and tortured for opposing the government on religious grounds.

With computerized repression, we see the worst social consequences of information as power over others. In the next chapter, we will explore the possibilities of computers used to aid groups in agreeing upon and achieving their goals.
