Deterministic analysis

Against this backdrop, the unavoidable reality is that information networks have changed the way most people do business, access services and exchange sensitive information. Appropriate institutional and technical safeguards are therefore required for the broad range of sensitive and administrative information being sent across these networks. In this context, encryption, when used by law-abiding individuals and businesses, can help prevent crime.

For example, the use of cryptography to ensure confidentiality, provide reliable user authentication and detect unauthorized tampering with electronic data9 can help to deter electronic bank fraud and many other types of illegal activity. Furthermore, in Bernstein v. U.S. Dep't of Justice10, a three-judge panel of the 9th Circuit recognized that the First Amendment protects encryption source code, since it is the best means of expressing cryptographic ideas and algorithms. Countering attempts at "trusted third party" legislation for encryption keys, Schneier (1998, online) writes:

Encryption systems support rather than hinder the prevention and detection of crime. Encryption helps to protect burglar alarms, cash machines, postal meters, and a variety of vending and ticketing systems from manipulation and fraud; it is also being deployed to facilitate electronic commerce by protecting credit card transactions on the Net and hindering the unauthorized duplication of digital audio and video.

However, the predicaments concerning encryption and hacking are best illustrated by the Universal Studios Inc v. Eric Corley11 case. The defendant, publisher of the Hacker Quarterly Magazine, was accused of linking to the DeCSS code, a programme that stripped encryption from DVD movies. The programme was openly available from the magazine's website as a tool created to help Linux users watch legally purchased movies on their computers. The fundamental issue at the heart of this case is who was responsible for the hacking: Corley, for distributing the programme, or those who actually used it to break the encryption?
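To make the tamper-detection point above concrete, the following is a minimal sketch, in Python using only the standard-library hmac and hashlib modules, of how a message authentication code lets a bank detect unauthorized alteration of an instruction in transit. The key and the messages are invented for illustration.

```python
import hashlib
import hmac

SECRET_KEY = b"shared-secret-key"  # hypothetical pre-shared key

def sign(message: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag the receiver can verify."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(message), tag)

# A payment instruction signed by the sender...
instruction = b"PAY 100.00 TO ACCOUNT 12345678"
tag = sign(instruction)

# ...is rejected if an attacker alters it in transit.
tampered = b"PAY 100.00 TO ACCOUNT 99999999"
print(verify(instruction, tag))  # True:  message is authentic
print(verify(tampered, tag))     # False: tampering is detected
```

A real deployment would combine such an integrity check with encryption for confidentiality and a proper key-exchange protocol; the sketch shows only the tamper-detection property the passage refers to.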

The Children's Internet Protection Act (CIPA) was a U.S. response to the darker aspects of the Internet12. With pornography, violence and hate speech just a keystroke away, CIPA requires federally funded schools and public libraries to install filters on all computers. Proponents of the bill claim this is an effective tool for blocking access to obscene material deemed harmful to minors. The Family Research Council suggests that:

CIPA is a necessary and constitutional remedy to a pervasive, nationwide problem in public libraries where children and adults are accessing obscenity and child pornography, adults are exposing children to pornography, and patrons are engaging in indecent exposure and sexual assaults, resulting in a hostile work environment.

This attempted legislation was clearly aimed at paedophiles transferring image files across the net. Advocates of filtering claimed that the Internet is different from other forms of broadcasting and therefore should not be regulated by the same censorship standards.

This premise reasoned that filtering would be based on classifications supplied by rating services, parents and teachers, using a variety of value systems, to shield children from explicit material and the sexual predators lurking in cyberspace. This apparently innocuous bill was to meet strong opposition. Librarians and other free-speech advocates claimed that the legislation went well beyond restricting children's access to the web and violated the First Amendment rights of all those who might use public computers; the lower courts initially agreed, although the U.S. Supreme Court ultimately upheld CIPA in United States v. American Library Association.
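The mechanics that filtering advocates had in mind are simple enough. The following is a minimal sketch in Python of a classification-based filter of the kind rating services proposed; the hostnames and rating labels are invented for illustration.

```python
# Hypothetical rating labels assigned by a rating service, a parent
# or a teacher; the hostnames and labels are invented.
RATINGS = {
    "news.example.org": "general",
    "chat.example.net": "unrated",
    "adult.example.com": "explicit",
}

# Labels that a given library terminal is configured to allow.
ALLOWED_LABELS = {"general"}

def is_blocked(host: str) -> bool:
    """Block anything rated outside the allowed set, including
    sites the rating service has never classified."""
    return RATINGS.get(host, "unrated") not in ALLOWED_LABELS

for host in RATINGS:
    print(host, "->", "blocked" if is_blocked(host) else "allowed")
```

Note that the conservative default for unrated sites in this sketch produces exactly the over-blocking that, as discussed below, opponents of CIPA objected to.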

The contentious points centred on the impact of some blocking and filtering implementations on the free flow of ideas, and particularly on the potential for censorship that these technologies may offer third parties. Challenging CIPA, the Centre for Democracy and Technology (2004, online) argued that the act raises constitutional problems in the following areas:

It imposes serious burdens on constitutionally protected speech: materials such as movies and television programs, when disseminated through popular commercial Web sites such as PlanetOut, also risk restriction under CIPA. It fails to effectively serve the government's interest in protecting children, as it will not prevent children from seeing inappropriate material that originates outside of the US or that is available through Internet resources other than the World Wide Web, such as chat rooms or email.

It does not represent the least restrictive means of regulating speech, according to the Supreme Court's own finding that blocking and filtering software might give parents the ability to screen out undesirable content more effectively without burdening speech. Congress has produced no detailed record refuting this finding or supporting the notion that CIPA provides the least restrictive means.

It seems clear that any attempt to bring about self-regulation, or a government-controlled rating system for controlling pornography on the Net, would unfortunately prove unachievable.

Countless hacking techniques could be employed to circumvent any potential threat to this unscrupulous industry. For instance, hackers employed by a pornographic website may hijack a computer by planting a programme on the user's PC and then use it to advertise the illicit material. This form of behaviour makes regulation and detection of pornographers almost impossible. Attempting to put these issues into perspective, Lawrence Lessig's book Code and Other Laws of Cyberspace breaks these considerations down into four modalities of regulation or constraint: norms, law, the market and architecture/code.

It is through the last of these modalities, architecture/code14, that Lessig addresses the latent ambiguities inherent in the overlapping claims of competing sovereigns with an interest in behaviour in cyberspace. In a thought-provoking declaration, he reminds government policy makers that the nature of cyberspace is about to flip from unregulability to regulability through the use of 'architectures of control'. Lessig contends that certain values, such as user anonymity, free speech and decentralisation, have become rooted in the structure of the Net.

However, these values are not innate to the Net; their existence is solely due to the way in which it has been designed. He sees the efficient 'architecture of control' that the Net provides as meaning that we must actually make decisions, and that these decision-making processes are, by definition, political. Lessig then proceeds, quite compellingly, to show how regulation is possible through a coupling of code with a popular political will prepared to choose the very forms that Internet regulation should take.

Moreover, Lessig argues that latent ambiguities arise in situations for which the law presents no clear guidance and a choice must be made between two conflicting answers15. The U.S. Bill of Rights was framed in 1791, and the founding fathers "did not have to decide between one way and two way confrontation; [and now] given the conflict of values at stake, it is not obvious how they would decide". Herein lies the precise meaning of the latent ambiguity inherent in contemporary law concerning cyberspace.

It is these latent ambiguities that impede people's ability to understand and act upon some of the more complex issues surrounding the regulation of cyberspace. As such, Lessig pessimistically sees the possibility of software corporations coming to control the Internet. Drawing these controversies of jurisdiction, regulation and behaviour together, there is no doubt that the global nature of the Internet is reshaping the fixed and firm boundaries between domestic and international spheres, and changing our conceptions of the proper domain of domestic and international law. Katsh (1995, p. 8) asks of international law, "Do these changes make possible new kinds of legal relationships and allow people to interact with the law in new ways?"

Clearly, this relatively new means of communicating at high speed and low cost presents an unprecedented opportunity for contact and exchange between people, with almost complete disregard for national frontiers and the attendant domestic norms and regulations. What seems clear is that the process of law, once perceived to be slow and evolutionary, is now in a constant state of flux.

The boundaries previously occupied by the various legal actors are undergoing a fundamental transformation, which is changing the very nature of the institutions and the relationships between them. It is precisely here that Lessig fails to grasp the materialist developments that have rendered the traditional forms of jurisdiction more problematic; these neo-liberal social, economic and political factors are the main arbiter of the latent ambiguities existing within cyberspace.

Secondly, Lessig's analysis has a propensity to reject the continuing importance of the nation state, and leaves no way to account for the nation state as a mediator between the structures of global finance and the transnational corporations that produce the hardware and software propelling the information society. This is nowhere more evident than in Lessig's disregard for the Telecommunications Act of 1996, through which the U.S. government handed the Internet over to the private sector.

It is this tendency to view cyberspace and technology as causal agents that predetermines the social outcomes, while obscuring the neo-liberal political forces effecting the real social change. Lessig's stance of accepting and adapting through reforms justifies any social consequences that result from the implementation of corporate market agendas. It effectively argues that we cannot decide what type of society we want to develop through cyberspace; cyberspace itself will determine the socio-political forms that arise, and societies must at best resolve the ambiguities that follow.

In short, states have been no more disabled by technological change from acting as agents of jurisdiction within cyberspace than corporate managers have been from effecting neo-liberal agendas. Finally, Lessig argues that, as a consequence of the decentralised nature of the Net, the four mechanisms of regulation (authority, law, sanctions and jurisdiction) render regulation in cyberspace impracticable. Countering this view, Klein (2002, p. 195) states:

ICANN realizes these four mechanisms through its control of the Internet's domain name system (DNS).

Although Internet communication has no central control point, Internet addressing, as realized in the DNS, is centralized. DNS provides the control point from which to regulate users. Moreover, the DNS is also an essential resource, so it provides a means of sanctioning users: denial of access to domain names is the equivalent of banishment from the Internet. The DNS also defines jurisdictions on the Internet. The logical organization of the DNS allows authority to be mapped onto distinct zones.

Finally, the contractual foundations of the DNS provide opportunities to promulgate regulations. Taken together, these features render ICANN capable of governance.

As Klein indicates, it takes very little thought to see how governance could be brought under a centralised institutional body such as ICANN. The point is that the main weakness of Lessig's techno-deterministic analysis lies in ignoring the nation state and its institutions while attempting to initiate jurisdiction through the regulation of the Net's architecture (code).
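Klein's point can be seen in miniature in the following toy model, written in Python, of DNS delegation. The zone data and addresses are invented, and real resolution involves caching, record types and many other details this sketch omits; it shows only how whoever controls a parent zone can, by deleting a single delegation, make every name beneath it unresolvable.

```python
# A toy model of DNS delegation; the zone data is invented.
# Each zone maps a label either to a child zone (a dict) or,
# at the leaves, to an address (a string).
ROOT = {
    "org": {"library-example": {"www": "192.0.2.10"}},
    "net": {"hacker-example": {"www": "192.0.2.20"}},
}

def resolve(name: str):
    """Walk the delegation chain from the root, one label at a time,
    e.g. www.hacker-example.net -> net -> hacker-example -> www."""
    node = ROOT
    for label in reversed(name.split(".")):
        if not isinstance(node, dict) or label not in node:
            return None                 # no delegation: the name does not exist
        node = node[label]
    return node if isinstance(node, str) else None

print(resolve("www.hacker-example.net"))   # 192.0.2.20
del ROOT["net"]["hacker-example"]          # the parent zone revokes one delegation...
print(resolve("www.hacker-example.net"))   # None: "banishment from the Internet"
```

The last two lines are Klein's "banishment from the Internet" expressed as a single administrative act by the authority controlling the parent zone.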