Anonymity allows people to make criticisms that are difficult to state openly, and to share information and support about topics that might be stigmatizing, such as addiction or sexual abuse. Unless online anonymity is protected, whistleblowers who want to criticize their employers, parents who want to criticize the principal of their children's school - and many others - may be afraid to speak out. That would be a loss for our country.
This conjecture was endorsed by the U.S. Supreme Court in Reno v ACLU, which maintained that the Internet is “a unique and wholly new medium of worldwide communication to which First Amendment protection and no lesser standards apply”. As a consequence, there are human rights issues at stake if total anonymity is to be removed by any administration. However, as the number of online users has increased, so has the level of criminal behaviour, due in no small part to the right to online anonymity.
The tension with the Supreme Court's ruling can be seen in the growing number of calls to eliminate not just the more traditional forms of crime, but also some of the newer forms, such as hacking. Corporations have been calling for stricter legislation that can be used internationally to identify anonymous users. Attorney General John Ashcroft (2001 Online) described the challenges that anonymity and the Internet present to contemporary judicial systems in regard to hacking as follows:
The Internet can provide anonymity. On the Internet, it is easy for a criminal to create a fictitious identity to perpetrate frauds, extortions, and other crimes. Since many computer crimes – such as trading pirated software or child pornography – can be committed entirely on-line, this anonymity can significantly complicate an investigation.
Evidently, the boundaries and relationships between the various legal actors, institutions and interest groups are constantly in a state of flux, and any attempt to remove or amalgamate user anonymity into some universally applicable legislation seems not only legally questionable but also improbable. Clearly, user anonymity and hacking highlight one of the main reasons why cyberspace is so difficult to regulate, and show the latent ambiguity between traditional law and the legal challenges this form of behaviour has given policymakers.
The above effectively highlights the lack of international governance, or as Barlow forcefully states, “governments of the industrial world, you have no sovereignty where we gather” (cited in Lessig 1999 p218). Without a centralised authority to make universally applicable sanctions and enforce them, the ungovernability of the Internet is all too evident. Writing on the difficulties of a decentralised Net, Ashcroft (2001) notes, “A criminal anywhere in the world armed with nothing more than a personal computer connected to a modem can victimize individuals and businesses worldwide”. Hans Klein (2002 p193) suggests that “the reasons for this lie both in the characteristics of the technology, which make control difficult, and in the global reach of Internet communications, which creates jurisdictional conflict among government regulators”. In practice, these decentralised conditions mean that hackers can transfer data and software, send viruses, and deny services to other users. The “I Love You” virus released in May 2000 spread like wildfire through the world's e-mail systems, destroying files on millions of hard drives globally; it is estimated that the virus caused $6 billion of damage worldwide. This further raises the question of how much anonymity societies can tolerate. Evidently, traditional forms of law “no longer offer an adequate intellectual framework in which to consider the nature and form of regulation in cyberspace” (Jones & Basu 2003). This dilemma of geographical jurisdiction and legitimacy in cyberspace has been a major cause for concern regarding offenders escaping jurisdiction. The case of R v Homsett illustrates the jurisdictional issues this form of behaviour creates for UK courts in such a decentralised medium as the Net.
A further point of contention concerning the Internet and the potential to monitor criminal behaviour is encryption. Encryption is a generic name for the numerous means of encoding computer, voice or other transmissions of data so as to conceal their contents from unauthorized access. What is most at issue here is balancing the needs of the individual and the private sector against matters of national security. Firstly, the U.S. Center for Security Policy (1997 Online), analysing the risk factors of encryption, argues that:
Even the most sophisticated software currently available, featuring 128-bit coding that is judged to be unbreakable using available decrypting techniques, could be provided to foreign purchasers irrespective of whether they may be: potentially hostile foreign governments' militaries and espionage services, proliferators of weapons of mass destruction, terrorist organizations, drug-traffickers, organized crime or other threats to U.S. interests.
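The scale behind the claim that 128-bit encryption is "judged to be unbreakable" can be illustrated with some simple arithmetic. The sketch below is not drawn from the source; the attacker's speed of one trillion keys per second is a hypothetical figure chosen for illustration.

```python
# Rough illustration of why a 128-bit key is considered unbreakable
# by brute force using available decrypting techniques.
keyspace = 2 ** 128                    # number of possible 128-bit keys

# Hypothetical attacker testing one trillion (10**12) keys per second.
keys_per_second = 10 ** 12
seconds_per_year = 60 * 60 * 24 * 365

# Expected time to find the key: searching half the keyspace on average.
years_to_search_half = keyspace / 2 / (keys_per_second * seconds_per_year)
print(f"{years_to_search_half:.2e} years to search half the keyspace")
```

Even at this optimistic rate, the search would take on the order of 10^18 years, which is why export controls, rather than codebreaking, were the focus of the policy debate.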
The fear of most government law enforcement agencies is that encryption not only protects and masks criminal behaviour, but also makes detection and prosecution practically impossible. Furthermore, wiretaps and electronic monitoring, functions critical to the rule of law in any civil society, are seriously compromised by encryption. This has seen a variety of "key recovery", "key escrow" and "trusted third-party" encryption requirements being suggested in recent years by government agencies seeking to conduct covert surveillance within the changing environments brought about by new technologies.
Against this backdrop, the unavoidable reality is that information networks have changed the way most people do business, access services and exchange sensitive information. Therefore, appropriate institutional and technical safeguards are required for the broad range of sensitive or administrative information being sent across these networks. In this context, encryption used by law-abiding individuals and businesses can help prevent crime. For example, the use of cryptography to ensure confidentiality, provide reliable user authentication, and detect unauthorized tampering with electronic data can help to deter electronic bank fraud and many other types of illegal activity. Furthermore, in Bernstein v. U.S. Dep’t of Justice, a three-judge panel of the 9th Circuit recognized that the First Amendment protects encryption source code, since it is the best means to express cryptographic ideas and algorithms. Countering attempts at “trusted third party” legislation for encryption keys, Schneier (1998 Online) writes:
Encryption systems support rather than hinder the prevention and detection of crime. Encryption helps to protect burglar alarms, cash machines, postal meters, and a variety of vending and ticketing systems from manipulation and fraud; it is also being deployed to facilitate electronic commerce by protecting credit card transactions on the Net and hindering the unauthorized duplication of digital audio and video.
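The claim above, that cryptography can detect unauthorized tampering with electronic data, can be sketched concretely. The example below uses Python's standard `hmac` and `hashlib` modules; the key and the bank-style message are invented purely for illustration, not taken from any source cited here.

```python
import hashlib
import hmac

# Illustrative sketch: a shared secret key authenticates a message.
# Both the key and the message are made up for this example.
key = b"shared-secret-key"
message = b"PAY 100.00 TO ACCOUNT 12345"

# The sender computes a MAC (message authentication code) over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key, message, tag):
    """Recompute the MAC; any tampering with the message changes the tag."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

print(verify(key, message, tag))                         # genuine message
print(verify(key, b"PAY 100.00 TO ACCOUNT 99999", tag))  # tampered message
```

The genuine message verifies; the altered one does not. This is the mechanism behind Schneier's point that encryption systems protect cash machines and commerce from manipulation rather than merely hiding criminal traffic.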
However, the predicaments concerning encryption and hacking are best illustrated in the Universal Studios Inc v. Eric Corley case. The defendant, editor of the Hacker Quarterly magazine, was accused of linking to the DeCSS code, a programme that stripped encryption from DVD movies. This programme was openly available from the magazine's website as a tool created to help Linux users watch legally purchased movies on their computers. The fundamental issue at the heart of this case is who was responsible for the hacking: Corley for publishing the programme, or those who actually used the programme to break the encryption?
The Children’s Internet Protection Act (CIPA) was a U.S. response to the darker aspects of the Internet. With pornography, violence and hate speech just a keystroke away, CIPA requires federally funded schools and public libraries to install filters on all computers. Proponents of the bill claim this is an effective tool for blocking access to obscene material deemed harmful to minors. The Family Research Council suggests that:
CIPA is a necessary and constitutional remedy to a pervasive, nationwide problem in public libraries where children and adults are accessing obscenity and child pornography, adults are exposing children to pornography, and patrons are engaging in indecent exposure and sexual assaults, resulting in a hostile work environment.
This attempted legislation was clearly aimed at paedophiles transferring image files across the Net. Advocates of filtering claimed that the Internet is different from other forms of broadcasting and therefore should not be regulated by the same censorship standards. This premise reasoned that filtering would be based on classifications supplied by ratings services, parents and teachers, using a variety of systems of value, to protect and shield children from sexual predators lurking in cyberspace. This apparently innocuous bill was to meet strong opposition.
Librarians and other free speech advocates claimed that the legislation goes well beyond restricting children's access to the web and violates the First Amendment rights of all those who may use public computers, although the U.S. Supreme Court ultimately upheld CIPA in United States v. American Library Association. The contentious points centred on the impact of some blocking and filtering implementations on the free flow of ideas, and particularly on the potential for censorship that these technologies may offer third parties. Challenging CIPA, the Center for Democracy and Technology (2004 Online) argued that the act raises constitutional problems in the following areas:
It imposes serious burdens on constitutionally protected speech, including materials such as movies and television programs when disseminated through popular commercial Web sites such as PlanetOut also risk restriction under CIPA. It fails to effectively serve the government's interest in protecting children, as it will not effectively prevent children from seeing inappropriate material originating from outside of the US available through other Internet resources besides the World Wide Web, such as chat rooms or email. It does not represent the least restrictive means of regulating speech, according to the Supreme Court's own findings that blocking and filtering software might give parents the ability to more effectively screen out undesirable content without burdening speech. Congress has produced no detailed record refuting this finding or supporting the notion that CIPA provides the least restrictive means.
It seems clear that any attempt to bring about self-regulation, or a government-controlled rating system, in respect of controlling pornography on the Net would unfortunately prove unachievable. Countless hacking techniques could be employed to circumvent any potential threat to this unscrupulous industry. For instance, hackers employed by a pornographic website may hijack a computer by planting a programme on a user's PC and then use it to advertise the illicit material. This form of behaviour makes regulation and detection of pornographers almost impossible.
Attempting to put these issues into some perspective, Lawrence Lessig's book Code and Other Laws of Cyberspace breaks these considerations down into four modalities of regulation or constraint: norms, law, market and architecture/code. It is in the last of these modalities, architecture/code, that Lessig addresses the latent ambiguities inherent in the conflicting claims of competing sovereigns with an interest in behaviour in cyberspace. He reminds government policymakers, in a thought-provoking declaration, that the nature of cyberspace is about to flip from unregulability to regulability through the use of 'architectures of control'. Lessig contends that certain values such as user anonymity, free speech and decentralisation have become rooted in the structure of the Net. However, these values are not innate to the Net; their existence is solely due to the way in which it has been designed. He sees the efficient 'architecture of control' that the Net provides as meaning that we must actually make decisions, and that these decision-making processes are, by definition, political. Lessig then proceeds quite compellingly to show how regulation is possible through a coupling of code and a popular political will designed to choose the very forms of regulation that should constitute Internet regulation.
Moreover, Lessig argues that latent ambiguities result from situations for which the law presents no clear guidance, and which demand that a choice be made between two conflicting answers. The U.S. Constitution was framed in 1791, and the founding fathers “did not have to decide between one way and two way confrontation; [and now] given the conflict of values at stake, it is not obvious how they would decide”. Herein lies the precise meaning of the latent ambiguity inherent in contemporary law concerning cyberspace. It is these latent ambiguities that impede people's ability to understand and act upon some of the more complex issues surrounding the regulation of cyberspace. As such, Lessig pessimistically sees the possibility of software corporations coming to control the Internet.
Drawing these controversies of jurisdiction, regulation and behaviour together, there is no doubt that the global nature of the Internet is reshaping the fixed and firm boundaries between the domestic and international spheres, and changing our conceptions of the proper domain of domestic and international law. Katsh (1995, p.8) asks of international law, “Do these changes make possible new kinds of legal relationships and allow people to interact with the law in new ways?” Clearly, this relatively new means of communicating at high speed and low cost presents an unprecedented opportunity for contact and exchange between people, with almost complete disregard for national frontiers and the ensuing domestic norms and regulations. What seems clear is that the process of law, once perceived as slow and evolutionary, is now in a constant state of flux. The boundaries previously occupied by the various legal actors are undergoing a fundamental transformation, which is changing the very nature of the institutions and the relationships between them. It is precisely here that Lessig fails to grasp the materialist developments that have rendered the traditional forms of jurisdiction more problematic; these neo-liberal social, economic and political factors are the main arbiter of the latent ambiguities existing within cyberspace.
Secondly, Lessig’s analysis has a propensity to reject the continuing importance of the nation state, and leaves no way to account for the nation state as a mediator between the structures of global finance and the transnational corporations that produce the hardware and software propelling the information society. Nowhere is this more evident than in Lessig’s disregard for the Telecommunications Act of 1996, through which the U.S. government handed the Internet over to the private sector. It is this tendency to view cyberspace and technology as causal agents that predetermine social outcomes, while obscuring the political neo-liberal forces effecting the real social change. Lessig’s ‘accept and adapt through reforms’ stance justifies any social consequences that result from the implementation of corporate market agendas. This effectively argues that we cannot decide what type of society we want to develop through cyberspace; cyberspace itself will determine the socio-political forms that arise, and societies must at best resolve the ambiguities that follow. In short, states have not been disabled or gutted by technological change, or prevented from acting as agents for effecting jurisdiction within cyberspace, any more than corporate managers have been prevented from effecting neo-liberal agendas.
Finally, Lessig argues that, as a consequence of the decentralised nature of the Net, the four mechanisms of regulation (authority, law, sanctions and jurisdiction) make regulation in cyberspace impracticable. Countering this view, Klein (2002 p195) states:
ICANN realizes these four mechanisms through its control of the Internet’s domain name system (DNS). Although Internet communication has no central control point, Internet addressing, as realized in the DNS, is centralized. DNS provides the control point from which to regulate users. Moreover, the DNS is also an essential resource, so it provides a means of sanctioning users: denial of access to domain names is the equivalent to banishment from the Internet. The DNS also defines jurisdictions on the Internet. The logical organization of the DNS allows authority to be mapped onto distinct zones. Finally, the contractual foundations of the DNS provide opportunities to promulgate regulations. Taken together, these features render ICANN capable of governance.
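Klein's point that "the logical organization of the DNS allows authority to be mapped onto distinct zones" rests on the hierarchical structure of domain names. The short sketch below is illustrative only, not drawn from Klein; it simply shows how a single name decomposes into a chain of zones, each delegated by the one above it, which is what gives a central body like ICANN its points of control.

```python
# Illustrative sketch: decompose a domain name into its chain of DNS
# zones, from the top-level domain down. Each zone is delegated by the
# zone above it (the root delegates "com", "com" delegates "example.com").
def zone_chain(domain):
    labels = domain.rstrip(".").split(".")
    # Build zones from the top of the hierarchy downwards.
    return [".".join(labels[i:]) for i in range(len(labels) - 1, -1, -1)]

print(zone_chain("www.example.com"))
# ['com', 'example.com', 'www.example.com']
```

Because every name must resolve through this single hierarchy, withdrawing a delegation at any level cuts off everything beneath it, which is the technical basis of Klein's claim that denial of a domain name amounts to banishment from the Internet.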
As Klein indicates, it takes very little thought to see how governance could be brought under a centralised institutional body such as ICANN. The main weakness in Lessig's techno-deterministic analysis lies in ignoring the nation state and its institutions while attempting to locate jurisdiction in the regulation of the Net's architecture (code).
The UK Regulation of Investigatory Powers Act 2000 makes available to the government widespread powers, including the right to monitor people's Internet activities; the right to demand that ISPs provide access to customers' communications in secret; the removal of the need for interception warrants, with no requirement that data collected from surveillance be revealed in court; and the right to demand encryption keys held by third parties.
This UK act was used in the hacking case R v Christopher Pile [1997] M2 Presswire, 24 March 1997
This conflict of individual and societal interests is best emphasised in Melvin v. Doe [2000], 49 Pa. D. & C. 4th 449, 477. This U.S. case removed First Amendment protection and the right to anonymity from Internet users engaged in defamatory behaviour.
The full text of the articles is available at:
Reno v. ACLU [1997] 117 S. Ct. 2329, 138 L.Ed. 2d 874
Prof. Lessig contrasts Olmstead v United States [1928] 277 U.S. 438 and Katz v United States [1967] 389 U.S. 347 to demonstrate latent ambiguity; for further explanation see:
Computer Misuse Act [1990] s4-s9 contains complex provisions relating to jurisdiction and extradition
R v Homsett [1985] Crim L R 369
The cost of the Melissa virus was estimated at $80 million.
Bernstein v. U.S. Dep’t of Justice [1999] 176 F.3d 1132, 1140-41 (9th Cir. 1999), reh’g granted en banc and opinion withdrawn, 192 F.3d 1308 (9th Cir. 1999). For a more detailed account see:
Universal Movie Studios Inc v Eric Corley [2001] App LEXIS 25330 at 73 (2nd Cir. 2001)
Within the UK this form of behaviour is covered by the Protection of Children Act [1978] s1(1), amended by the Criminal Justice and Public Order Act [1994] to encompass pseudo-photographs.
United States v. American Library Association [2003], No. 02-361 (U.S. Sup. Ct. 2003)
For a more detailed account of this see: Lessig, L. (1998) 'What Things Regulate Speech: CDA 2.0 vs. Filtering'. Available at:
Campbell v. Mirror Group Newspapers [2002] EWCA Civ 1373 and [2002] EMLR 30 (QBD) highlights the UK's conflict between freedom of expression and privacy.