Avoiding Roadblocks on the Information Highway

Models of Content Regulation for the Internet

© David E. Swayze, 1996

The following document is protected by Copyright. It may not be reproduced in whole or in part without the express written permission of the author, except for properly credited excerpts in scholarly articles, reviews, or reports. In the event that any portion of this document is used, please notify the author by E-Mail at webmaster@davidswayze.com or in writing at 704-14th Street, Brandon, Manitoba, Canada, R7A 4V3. Please note that the Hypertext References in this document are also copyrighted by their respective authors and should be credited accordingly.



Table of Contents

1. Introduction

2. Fundamental Principles of Freedom of Expression

3. The Conflict

4. The Telecommunication Models

(i) The One Way Channel Model

(ii) The Open Access Model

5. Some Technical Information

6. Models of Internet Regulation

(i) The Source Control Model

(ii) The Customs/Border Control Model

(iii) The Provider Control Model

(iv) A Brief Summary

(v) The Self Regulatory Model

(vi) The End-User Model

(vii) The Modified End-User Model

7. Conclusion

Endnotes


If all mankind minus one, were of one opinion, and only one person were of the contrary opinion, mankind would be no more justified in silencing that one person, than he, if he had the power, would be justified in silencing mankind.

John Stuart Mill, On Liberty1

1. Introduction2

The "Internet," "Information Superhighway," "Cyberspace," are all words or phrases that have inundated our vernacular over the past few years. Anyone who pays even passing attention to the media will have at least some familiarity with the Internet. Unfortunately, along with being known as a new, efficient, means of mass communication and information gathering, the Internet has also become synonymous with pornography, paedophilia, and hate mongering. Although pornography, and hate speech make up a very small portion of the information available on the Internet, it has received the bulk of publicity in the popular media with the result that the mostly uninformed populace is now clamouring for regulation of the Internet.

The purpose of this essay will be to examine regulation of content on the Internet. I am beginning with the fundamental assumption that the Internet requires some form of regulation. I will propose various models of Internet control and examine both their practical efficacy as well as the potential impact each model has on the values of freedom of expression. I will examine how each model is either too intrusive on freedom of expression, and/or is simply impractical to implement given the current state of the technology. I will argue that the Internet regulation model which least interferes with freedom of expression is the "modified end-user model," as it leaves the ultimate determination of what information is viewed up to the user while recognizing the need for some technological controls to assist the user in making informed decisions and to protect those unable to make such determinations for themselves.

In the process I will look at how the Internet is unique in that it presents us with a method of communication different from those forms which technology has thus far presented to us and, therefore, old regulatory approaches are inadequate. In other words, the Internet cannot be treated the same as Radio or Television for the purposes of regulation.

2. Fundamental Principles of Freedom of Expression

The model upon which the liberal principles of freedom of expression are based is the view held by the English philosopher John Stuart Mill and propounded in his essay, On Liberty.3 Mill's basic premise is that in order to reach truth, there must be no limits on the expression of ideas. He argues that human beings are by nature fallible. Therefore, any attempt to make determinations, on the basis of their truthfulness, as to what ideas may be expressed and what ideas may not leaves one open to the risk that one's judgements may be wrong:

First: the opinion which it is attempted to suppress by authority may possibly be true. Those who desire to suppress it, of course, deny its truth; but they are not infallible. They have no authority to decide the question for all mankind, and exclude every other person from the means of judging. To refuse a hearing to an opinion, because they are sure that it is false, is to assume that their certainty is the same thing as absolute certainty. All silencing of discussion is an assumption of infallibility. [emphasis in original]4

Furthermore, Mill argues that truth is only attained through the constant questioning and criticism of any given idea. Only after an idea has withstood all testing can we be reasonably assured of its truthfulness.5

His view is also based on the idea that the truth will survive all attempts to suppress it:

The real advantage which truth has, consists in this, that when an opinion is true, it may be extinguished once, twice, or many times, but in the course of ages there will generally be found persons to rediscover it, until some one of its reappearances falls on a time when from favourable circumstances it escapes persecution until it has made such head as to withstand all subsequent attempt to suppress it.6

He argues that the same holds true for heretical opinions. Rather than being extinguished, attempts at suppressing undesirable or controversial opinions merely cause "...men to disguise them, or to abstain from any active effort for their diffusion."7 The result is not that they fade away but that they "smoulder in the narrow circles of thinking and studious persons among whom they originate, without ever lighting up the general affairs of mankind with either a true or deceptive light."8 Suppression of such ideas only results in pacification of those individuals who desire their suppression and in no way furthers the knowledge of mankind. Suppression or censorship only preserves the status quo. Suppression of ideas does not lead to truth and hurts the virtuous thinkers far more than the heretical.9

No one can be a great thinker who does not recognize that as a thinker it is his first duty to follow his intellect to whatever conclusions it may lead. Truth gains more even by the errors of one who, with due study and preparation, thinks for himself, than by the true opinions of those who only hold them because they do not suffer themselves to think.10

As a consequence, Mill and close followers of Mill would take the view that no ideas should be suppressed as the result is a stifling of truth. The Canadian Charter of Rights and Freedoms echoes this approach in s. 2(b) which states:

2. Everyone has the following fundamental freedoms:

. . .

(b) freedom of thought, belief, opinion and expression, including freedom of the press and other media of communication;11

In interpreting this section, the Supreme Court of Canada has held that there are primarily three core purposes served by its protection of freedom of expression. These are, in addition to the promotion of and search for truth noted above, participation in the political process and self-fulfilment of the individual.12 Freedom of expression, therefore, is seen as a tool inextricably linked to the democratic process. Without it, our society cannot strive for intellectual enlightenment or govern itself in a manner consistent with the goals and wishes of the people, and the individual cannot fully reach his or her potential within society.

3. The Conflict

The problem, however, arises where these core values of freedom of expression conflict with other core values. Most notably, the conflict is evident where the values of freedom of expression are seen as protecting the expression of ideas which are not only considered to be morally repugnant by most individuals, but are also seen as harmful. In particular, this occurs where freedom of expression is seen to protect pornographic materials which may be harmful to women or children,13 or the expression of hate propaganda harmful to certain racial, ethnic or religious groups.14 In Keegstra, freedom of expression was not found to include the freedom to promote hatred. In Butler, however, pornography was held to be protected expression under s. 2(b), although the obscenity provisions of the Criminal Code were upheld as a reasonable limit.

The Supreme Court's approach has generally been a content neutral approach. In other words, the court looks to whether the activity which seeks protection is expressive and conveys a meaning. The content of the message conveyed is irrelevant to the consideration of whether the activity receives protection.15 This is intended to cast the net of s. 2(b) protection widely to encompass all manner of expressive content with the result that only violent forms of expression are not considered subject to prima facie s. 2(b) protection.16 The content neutral approach is most apparent with respect to s. 2(b) protection of pornography where the Court has held that although the meaning conveyed may be of little redeeming social value and be negatively received by some audiences, it is an expression of a meaning by the producer of the pornographic video, photograph, or writing, which is still entitled to protection under s. 2(b).17

The Internet poses new concerns with regard to freedom of expression issues. It is undeniable that ideas conveyed on the Internet, just like ideas conveyed in a newspaper or advertisement, convey a meaning. The Internet is nothing more than a new means of communicating ideas. As a consequence, the Internet can be subject to s. 2(b) protection and scrutiny. In addition to being a forum for conveyance of ideas which traditionally fall within s. 2(b) protection such as political opinion, the Internet has become a forum for those ideas which have also been the subject of controversy when expressed in a more conventional form. Pornographers, for example, were quick to recognize the potential of the Internet for the distribution of pornographic materials. Hatemongers such as Ernst Zundel have also taken advantage of the Internet for the purpose of spreading their views.18

The problem is that the Internet, as a new communications medium, poses new challenges for freedom of expression. Traditional approaches which focus on criminal prosecution of those who produce, possess, or distribute hate propaganda or pornography do not readily lend themselves to the Internet. This is because the Internet is not confined to a given territorial jurisdiction, the producers and distributors of the material may not reside within the jurisdiction, and the sheer volume of information available makes it very difficult to monitor its flow and even to determine where the material is being stored. Consequently, it is difficult for a provider of Internet services to monitor what information is being stored on its system or being accessed through its system, and this may lead to many possessors and distributors (i.e. providers) being guilty of criminal acts without even realizing it.

Simultaneously, the need for some form of regulation is arguably greater than ever. This is due to the fact that the Internet knows no boundaries, is theoretically accessible to billions of people, and provides information from an incalculable number of sources in its rawest and most basic form. There are, as of yet, no controls on the types of material that can be made available on the Internet and few means by which anyone can control how it is accessed once it is made publicly available.19 The users of the Internet range from children to seniors and encompass all racial, religious and ethnic groups. As a consequence, each user theoretically has more or less equal access to all information. Children, in theory, can therefore access pornography just as easily as an adult. This statement is qualified because, in fact, there have been attempts made to shield children from access to pornography, and these attempts will be discussed in more detail later in this essay.

The ramifications, as will soon be shown, are significant in that the Internet makes available to the individual the means to communicate his or her point of view to the world cheaply and effectively. The result is that, in addition to the many positive uses of the Internet, there can also be negative uses. The purpose of this essay, therefore, is to look at ways in which these negative uses (e.g. the propagation of pornography and hate speech) can be controlled in a manner consistent with the new technology and consistent with the values of freedom of expression discussed above.

4. The Telecommunication Models

In order to completely understand the nature of the challenges faced by the new Internet technology, it is helpful to understand how the Internet is different from conventional forms of communication. Jerry Berman and Daniel J. Weitzner, in their December 1994 presentation to the Yale Law Journal Symposium, "Emerging Technology and the First Amendment,"20 effectively drew this distinction. They defined two communication models, the One Way Channel Model and the Open Access Model, referring respectively to most conventional forms of communication and to the Internet. Their analysis is extremely useful for understanding how the Internet, by virtue of its technological difference, poses new problems for the regulation of expression. As a consequence, for the purposes of this section, I will borrow heavily from their analysis.

(i) The One Way Channel Model

The channel model is chiefly defined by two characteristics: the scarcity of communication pathways and the presence of information gatekeepers.21 Prime examples of this model include television, radio, and newspapers. The scarcity of communication pathways and the presence of information gatekeepers go hand in hand, as one leads to and makes necessary the other. Where access to the medium is limited, there must be some means of deciding who among the many who desire access actually gets it.

The scarcity of information pathways, in more concrete terms, refers to the limited number of "channels" available to convey information. For example, the number of available television channels and satellite uplinks is finite. Anyone can print up a brochure and distribute it on a street corner. But, not everyone will have their letter printed in the Globe and Mail or will receive half an hour or even thirty seconds of television time.

As a consequence of this scarcity of information pathways, there must be information gatekeepers, for example the editors of newspapers or the producers of television programmes, to control access to the channels. In addition, the government is able to play a greater regulatory role as it is entrusted with the distribution of the limited number of frequencies and television and radio licences. It can therefore regulate, to a much greater degree, what information gets air time. The result is that some viewpoints will not be heard, as they will be denied access by the information gatekeepers.22

The channel model requires gatekeepers for another reason--the centralized mode of distribution. The channel model is characterized by its system of distribution i.e. that all information is collected at a central source in order to be redistributed. The result is that independent producers must incur great cost in getting their material to the central distribution point. They must also incur additional costs in paying the gatekeeper to broadcast their material and in negotiating carriage agreements with the operator of the network.23

Given our marketplace approach to programming, practically speaking, the views that receive the most airtime are those views which are sustainable in the marketplace. The expense of publishing a newspaper or running a television station is considerable. Therefore, it is necessary that the content of the programming be such that it attracts as large an audience as possible to ensure the advertising or sales revenue necessary in order to be self-sufficient. Even where there are many channels, the result will be a pandering to the lowest common denominator as this is what is self-sustaining. Rather than encouraging diversity, the system encourages conformity. As Berman and Weitzner state:

An increase in channels may bring a partial increase in the diversity of sources available to the public, as a practical matter, however, channels will be used up by the programming that brings the channel operator the most revenue. For example, even a 500 channel cable television system is unlikely to offer 500 different programs to viewers. More likely, some large number of channels will be used for staggered showings of the top ten or twenty movies. Under this model, even a large number of channels will be used up relatively quickly, and a diversity problem will remain.24

The channel model is also characterized by offering primarily one way communication. There are a small number of information producers or providers relative to a large number of information consumers. The information producers have access to the channels through the gatekeeper and benefit from the ability to distribute their ideas. Whereas the consumers are mere passive recipients of the programming with little input, other than their viewing habits, in determining the actual content of the material available.

In summation, therefore, the one way channel model is characterized by the presence of limited information pathways or channels, which necessitates control by information gatekeepers. The result is that there is limited diversity of information available and great costs are incurred in the distribution of such information, especially by independent producers. Under this model, the consumers greatly outnumber the producers and only the producers have access to the means of distribution, with the result that communication is one way. This model also lends itself readily to government regulation as the sources of the information are readily ascertainable, and the distribution points centralized. Government can therefore control the distribution of information at its source by imposing limits on content or by controlling, in the case of radio and television communication, the licensing of channels.

(ii) The Open Access Model

The open access model is a significant departure from the one way channel model because it is a two way medium and therefore, holds the greatest promise for the goal of information diversity. The open access model is characterized by the capacity for an unlimited number of channels or information sources, and a decentralized network with no need for information gatekeepers.25

An open access network is limited only in terms of its users. Anyone who is a user of the network is both an information consumer and an information provider or producer. Every individual on the network has more or less equal access to every other member on the network. This is because the network makes no distinction between those who provide information and those who consume information. Most users play both roles:

All who obtain access have the option of making information available to all other users on the network; thus, the sources of information available are limited only by the number of users who seek access. Cable television or satellite networks in contrast, are designed to add users relatively easily, but those users have no ability to send information to others on the network.26

Open access networks are also decentralized in the sense that information need not be collected at a central point for distribution.27 Any user can send information to any other user or group of users directly. As a consequence of this decentralization and the unlimited number of channels, there is no need for information gatekeepers to control who has access. Individual users need not make any advance negotiations with an information gatekeeper prior to distribution. An added advantage is that the information provider can place their information on the network to be available upon demand.28 The consumer need not wait until the gatekeeper decides when that information will be made available.

This is the fundamental principle upon which the World Wide Web operates. The World Wide Web, or "WWW", or simply "the Web", is often spoken of as though it were another name for the Internet, although it is more precisely one of the services carried over it. It is associated with the idea that information is available for access at any time. Users, or "surfers" as they are often called, can use their computer to search for and access information at any time. In practical terms, the information provider places their information on a computer (physically, the information is encoded as a file on a computer hard drive and security protections are set so that anyone can access that particular file). Each computer has a unique address, much like a phone number or mailing address. A consumer then instructs his or her computer to contact the provider's computer by providing it with the unique address of the provider's computer. The consumer's computer, upon contacting the provider's computer, then requests the information and displays it to the consumer. This happens within seconds or minutes and happens independently of the provider once the provider makes the information available.
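By way of illustration only, the following minimal sketch (in Python, using a hypothetical address) shows all that the consumer's computer does: it contacts the provider's machine at its address, asks for the file, and displays what comes back, with no human intervention on the provider's side.

    from urllib.request import urlopen

    # Hypothetical address of a provider's computer; the provider has already
    # placed the file there and made it publicly readable.
    address = "http://www.example.org/index.html"

    # The consumer's computer contacts the provider's computer, requests the
    # file, and receives it, typically within seconds.
    with urlopen(address) as response:
        page = response.read()

    print(page[:200])  # display the first few hundred bytes of the document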

The effect is a great potential for diversity, as the model encourages the flow of ideas between all users on the Internet. Without a centralized distribution point, it becomes much harder for a gatekeeper--be that a network operator or government--to control access. Since there is no single source or small identifiable group of sources, government will find it much harder to regulate the Internet the way it regulates television, for example, by controlling the information at its source, i.e. the individual user.29

Furthermore, the cost of access is significantly lower. For many university students, for example, access is free or its cost is built into tuition. Individuals can now purchase Internet access for a small monthly fee comparable to the cost of owning a telephone or subscribing to cable TV. With that fee comes the ability to be both an information consumer and a provider.30

In summary, the open access model is distinct in that it is not limited by the number of channels. Furthermore, its users can take on the roles of both information provider and information consumer. Each user has equal access to all other users on the network. There is no need for gatekeepers, both because the channels are unlimited and because distribution is not centralized. This has the advantage of significantly lower costs for the information provider and less control over the content of the material the provider makes available. But this aspect is a double edged sword in the sense that it becomes more difficult to regulate the content of the information available on the Internet. It is this problem which I will examine in the remainder of this essay.

5. Some Technical Information

Before really delving into an analysis of the models of Internet regulation, it may be helpful to have a little better understanding of what the Internet is and how it operates. A brief introduction to the Internet's history and its purpose is also helpful in understanding why it is that the Internet is what it is today and why it is difficult to regulate.

In a nutshell, the Internet is a network of computers connected together through telephone lines, fibre-optic cables, satellite links, and radio links. The Internet permits computers to transfer information between each other. All the computers on the network can communicate to all of the other computers on the network. Each computer has a unique address or name so that the computers can find and identify each other on the network.
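As a rough sketch of the addressing idea only (the host name below is hypothetical), a computer translates a human-readable name into the numeric address used to locate the machine on the network before any communication takes place:

    import socket

    # Each machine on the network is reachable at a unique numeric address;
    # a name such as "www.example.org" is simply looked up and translated
    # into that address so that other computers can find it.
    print(socket.gethostbyname("www.example.org"))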

The Internet, however, is unique in the way in which these computers are connected. Originally, the Internet was designed by the US Defence Department and was known as ARPAnet. The network was a research experiment. It was designed with the aim of developing a network which was impervious to nuclear attack.31 As Ed Krol writes:

In the ARPAnet model, communication always occurs between a source and a destination computer. The network itself is assumed to be unreliable; any portion of the network could disappear at any moment . . . . The philosophy was that every computer on the network could talk, as a peer, with any other computer.32

The result was that if any one portion of the network was destroyed, the entire network was not destroyed with it. Today, the Internet really consists of a multitude of smaller networks all interlinked--the Internet is a "network of networks".33 What they all share is adherence to a certain set of standards, known as the TCP/IP protocol,34 which is much like saying that all computers communicate in the same language. By adhering to this standard, different makes of computers, running different operating systems, can all communicate with each other over the Internet. Because the Internet is composed of many smaller networks, the failure of one network or one computer does not result in the failure of the whole network. Perhaps, in the worst case scenario, a particular computer or group of computers would lose the ability to communicate with the rest of the network, but the network overall would remain sound.35 In fact, with this system the network has the ability to detour around damage, finding an alternative route through a different network to ensure reliable communication. Much as it is often possible to drive from point A to point B via different streets or highways,36 it is possible to transmit information from one computer to another via different network links. As a consequence, it is very easy and relatively cheap to get connected to the Internet.37 You need only contact someone else who is connected and see if they are willing to let you hook up. You then need only pay for the cost of your connection, and perhaps a fee to the person you are connecting to in order to offset some of their costs. You then become part of the network, much like connecting your driveway to the street.

The Internet has grown at a phenomenal rate, from a few military installations and universities in the 1970s to millions of users today. The growth rate really took off in the 1980s when universities began connecting in droves.38 Today, the precise number of users is beyond calculation, simply because it is difficult to know exactly how many individuals are connected. Estimates are that there are millions of users and growing.39

These issues will be important when looking at Internet regulation because they have a significant impact on the ability to control who uses the Internet, to control the source of information, and to control the destination of information. They give rise to many practical problems, which in turn lead to significant problems for issues of freedom of expression. As will be seen, certain models of control would require substantial interference with privacy and communication in general in order to be effective.

6. Models of Internet Regulation

This paper wishes to propose and analyse six models of Internet control: the source control model, the customs/border control model, the provider control model, the self-regulatory model, the end-user model, and the modified end-user model. It should be noted that all of these models are pro-active. In other words, they attempt to block the distribution or receipt of obscene and offensive information, as opposed to more traditional reactive approaches (such as criminal sanction) which only have an effect after distribution or receipt. All models offer pros and cons; however, the primary concerns of this essay are the practical considerations of implementing each model and the degree of impact each model would have on freedom of expression.

(i) The Source Control Model

The source control model attempts to control information at its source. It is premised on the notion that the best means of preventing the dissemination of information deemed undesirable is to stop it at its source. In practical terms, this would mean preventing access to the Internet by individuals or groups which desire to spread offensive information. This could be attained by requiring the licensing of individuals or groups which desire to access the Internet40 and/or by screening materials before they are placed on the Internet.

This model, as well as the two which follow it, reflects an approach to Internet regulation which is locked into conventional attitudes toward communication. It is better suited to the one way channel model as discussed by Berman and Weitzner, as it is conducive to situations where there is a centralized system of distribution. With each individual user of the Internet being a source, it would be impractical to attempt to control and regulate the information posted by each individual. It is important to keep in mind the myriad of ways in which someone can express an opinion to a wide audience on the Internet. The most basic means is through electronic mail, or E-mail. E-mail, just like its paper-based cousin, letter mail, involves writing a letter or message and sending it to another particular individual on the network. The difference between E-mail and letter mail is that E-mail doesn't require a stamp, and is quite a bit faster. An electronic mail message can reach its destination, anywhere in the world, in a matter of minutes or seconds. Attempting to control electronic mail would require someone, or some computer, to screen every E-mail message for content deemed objectionable. The volume involved would be staggering, as E-mail is by far the most used of all Internet features.

As mentioned before in footnote 30, USENET also provides an outlet for individuals to express their opinions. Admittedly, in some ways, USENET is also more amenable to regulatory controls and there are regimes already in place which do control the posting of messages on some newsgroups. The source control model would involve stopping the posting of certain messages at their source. This again, brings in the requirement of screening at the point that the message is sent to be posted, along with concerns of impracticality. Note that there are in excess of 7,000 individual newsgroups,41 some of which receive hundreds of messages per day from individuals all over the world. The sheer volume soon becomes unmanageable.

USENET has taken some steps to control content through the creation of "moderated" newsgroups. These newsgroups have someone who has volunteered to screen all messages posted to the group. There is no way around it, as only that individual has the ability to actually authorize the distribution of a message. Individuals wishing to post must E-mail their messages to the moderator, or use software which does this automatically in lieu of posting.42 Those messages which do not conform to the charter or guidelines of the newsgroup are not posted. Ultimate discretion, however, usually resides with the moderator. It should be noted that moderated newsgroups make up only a very small minority of the groups available, with the remaining newsgroups having no means to control the content of messages.

The source control model imposes a gatekeeping function on an open access system. How does this affect freedom of expression? The source control model would require the policing of all communications on the Internet to ensure compliance with the rules--an insurmountable task. The analogy drawn is that of attempting to monitor the conversations taking place in a restaurant.43 It would concentrate a great deal of power in the hands of those charged with the monitoring of such information. The risk of arbitrary and inconsistent application of the rules would be great. The problem then becomes how the monitors respond to attempts to post forbidden information. Do they prosecute the individual or do they merely prevent distribution of the message? Would there be recourse whereby the individual affected could appeal? Likely, such a method would severely restrict the free flow of information by placing artificial barriers in its path, causing delay and confrontation as users wait for clearance and contest decisions to censor.

There are obvious issues of privacy. Section 319(2) of the Criminal Code explicitly excludes private conversations from its prohibition of hate speech. There are a myriad of ways in which a private conversation could take place on the Internet. For example, through Internet Relay Chat (IRC)44 users can engage in private conversations. The unique power of IRC is its ability to allow many users to engage in conversation privately and in real time. At what point do the conversations cease to be private and instead become public? Since IRC permits users to create their own discussion rooms and decide who will be admitted to them, what prevents a hate group, for example, from creating a discussion room to which many individuals are admitted? Will the Criminal Code apply in such an instance? What is the difference between setting up a web page which disseminates hate literature and is accessed by 100 users per week and setting up a private chat room where 100 users meet weekly? The source control model would require the monitoring of such discussions in order to ensure that such information is not being passed on. Presumably, it would be necessary to monitor all such conversations, even benign, legitimate conversations, to ensure that they do not degenerate into discussions which are hate mongering or pornographic in nature.

Electronic mail also permits easy and efficient propagation of ideas. E-mail addresses are no more secret than the street addresses found in phone books. The difference is that the average user can set up a mass mailing of materials with much greater ease and much less expense than contracting with Canada Post to distribute bulk mail and printing large quantities of materials. An average user has the capacity, with most E-mail programs, to distribute E-mail to lists of individuals.45 Such distribution has been done in the past. E-mail junk mail is becoming increasingly common. For example, I have received unsolicited E-mail from advertisers and other individuals seeking a wider audience. Again, ensuring that such material is not distributed would require constant monitoring of E-mail transmissions.

Monitoring of all traffic on the Internet, from the source, would be a clear invasion of traditional concepts of privacy and cannot escape comparisons to George Orwell's Big Brother. Individuals would feel their expression hampered, even in the discussion of the most mundane and sincere topics, with the knowledge that someone may be watching and may decide that the content of their speech is unacceptable. No doubt, there would be a great risk of capricious and arbitrary application of the rules. This risk has been demonstrated with respect to the conventional media of printed books and magazines. In the Little Sisters46 case the British Columbia Court of Appeal held that customs officials had exercised their authority to prevent the importation of obscene materials in an arbitrary manner by singling out materials destined for the Little Sisters Book store in Vancouver. Although the court upheld sections of the Customs Act and the Criminal Code as being a reasonable infringement upon s. 2(b), the court found the exercise of discretion under these acts by Canada Customs was arbitrary.47 The concern about arbitrary application of the rules with respect to the monitoring of Internet communication is, therefore, not unfounded.

The source control model also poses another significant practical problem. The Internet is an international network. Users have the ability to access information from all over the world without regard to international boundaries. In fact, it is often difficult for the average user, on the basis of an address alone, even to know where a particular site is located without making reference to an Internet directory.48 Canada, for example, would have jurisdiction to control sources originating within its territory. It could not, however, control sources outside of its territorial jurisdiction, and other countries may have different standards with respect to what materials they find unacceptable. A country with a long history of censorship, such as China or Singapore, may take a far more restrictive view of what materials are made available on its sites.49 Conversely, countries like the Netherlands and Denmark, which already take a very open and unrestrictive view of pornography, may be far more permissive in terms of what materials they make available. The result will be inconsistency on a grand scale. Although Canadians may be at risk of penal sanction or censorship if they post materials deemed offensive by Canadian authorities, they may still be able to access such information from foreign sources with impunity. This fact has not gone unnoticed by Internet regulators and leads us to the next model of regulation, the customs/border control model.

(ii) The Customs/Border Control Model

The customs/border control model would likely have to be used in conjunction with another model, such as the source control model. Much as the Customs Act50 and Customs Tariff51 grant discretion to customs officers to seize obscene material being imported into Canada, the Criminal Code52 prescribes penalties for the possession, distribution and production of obscene material and child pornography within Canada. These provisions work both to block the importation of material into the country and to stop it at its source where it originates within Canada. A similar regime would be required for the Internet. The problems of attempting such regulation domestically were examined with respect to the source control model. Here, the problems associated with controlling material flowing into the country from outside are examined.

Some of the practical problems have already been considered with respect to the international nature of the Internet. Canada would have to monitor the flow of all information entering the country. The sheer volume of such material, coupled with its real time nature, would require massive, continuous and instantaneous monitoring of all communication. Unlike books or magazines, which can be stopped at the border and examined for content, bits and bytes cannot be so easily stopped midstream and analyzed before being permitted to go on their way. When a user decides to access a particular web site, the expectation is immediate access, not access a few days later when customs has had a chance to clear it. For every Internet user in Canada online at any one time, there would have to be someone monitoring the content of the materials being accessed.

Computers could be employed to undertake this role. This has been the tactic taken by Germany, but with little success.53 Every Internet site has a unique address. Computers can be programmed to block certain addresses from access. Therefore, computers known to contain objectionable materials can be blocked. However, there is a very simple work-around, which was demonstrated in the German example noted above. Germany wished to block access to materials placed on the Internet by Ernst Zundel. He had anti-Semitic materials located on a computer in Santa Cruz, California (www.webcom.com), to which Deutsche Telekom blocked access. Therefore, any users in Germany using the Deutsche Telekom network were unable to access Zundel's materials. It did not take long, however, for mirror sites54 to be created. The German authorities are left playing a game of catch-up--blocking new mirror sites as they crop up around the world. Inevitably, the promoters of such ideas will always be one step ahead of the authorities, as the authorities can only find a mirror site after its information has been made available.
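Both the block and the work-around can be illustrated with a minimal, hypothetical sketch: the block amounts to nothing more than a list of forbidden addresses, so a copy of the same material placed under a new address passes through untouched.

    # A hypothetical provider-side block list of the kind described above.
    BLOCKED_HOSTS = {"www.webcom.com"}

    def is_blocked(host):
        """Refuse to carry traffic destined for a listed host."""
        return host in BLOCKED_HOSTS

    print(is_blocked("www.webcom.com"))       # True: the original site is cut off
    print(is_blocked("mirror.example.org"))   # False: a mirror under a new name slips through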

Knowledgeable computer users may also be able to circumvent attempts at blocking particular sites. These methods include manually routing through other networks to reach the site, or making a simple long distance telephone call to an ISP outside of the country.55 Another method is the use of relay sites such as the "Canadianizer."56 These sites operate by fetching and displaying the material at another site, but they do not report the address of that site to the user's computer. Instead, only the address of the relay site is reported. Users in Germany, therefore, can connect to the "Canadianizer" and view Zundel's material without the German network knowing that information from Zundel's site is being viewed.57
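A relay of this kind is conceptually simple. The sketch below (Python, with hypothetical addresses; not a description of how the "Canadianizer" itself was built) fetches the requested material on the user's behalf, so the user's own network only ever sees a connection to the relay.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    TARGET = "http://www.example.org"  # hypothetical site the local network has blocked

    class RelayHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # The relay, not the user, contacts the target site; the user's
            # network sees only a connection to the relay's own address.
            with urlopen(TARGET + self.path) as remote:
                body = remote.read()
            self.send_response(200)
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8080), RelayHandler).serve_forever()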

Beyond the obvious concerns about freedom of expression which this type of censorship raises, i.e. who determines which material is blocked and based on what standard, there are significant concerns about the breadth of material which may become unavailable to users where German-style blocking is employed. When a particular site is blocked, all material on that site becomes unavailable, without regard to content. Therefore, if Zundel and the Baha'i Faith both used the same web server, information made available by both would be unavailable. This is because, in order to effectively block access to information on that site, the whole site must be blocked. Merely blocking access to specific files would make it easy for someone like Zundel to give new names to the files containing his information, subverting the block. This approach, therefore, gives rise to the risk of over-breadth in its application. The infringement of freedom of expression extends not only to obscene and offensive material, but to material which is not obscene or offensive, and even potentially valuable.

There are other means which can be used by a state to control access to information, but the risk of over-inclusive application still exists. For example, a state could use computers to monitor the flow of information, looking for the use of certain words such as "sex", "sado-masochism", and "Nazi". But, unlike human beings, computers are incapable of distinguishing between the use of the word "sex" in a discussion of the "asexual reproduction" of certain plants and the use of the word "sex" to describe activities between people. The result would be the wholesale blocking of references to anything having to do with "sex", or anything having to do with the history of Nazi Germany and the Second World War.
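The over-breadth is easy to demonstrate with a hypothetical word filter of the kind described: a naive substring match cannot tell the botanist's "asexual reproduction" or the historian's survey from the material it is meant to catch.

    BANNED_WORDS = ("sex", "nazi")  # hypothetical list, for illustration only

    def is_objectionable(text):
        """Naive screen: flag any text containing a banned word as a substring."""
        lowered = text.lower()
        return any(word in lowered for word in BANNED_WORDS)

    print(is_objectionable("Mosses often rely on asexual reproduction."))  # True: a false positive
    print(is_objectionable("A history of Nazi Germany, 1933-1945."))       # True: legitimate history blocked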

With electronic images this problem is even more acute. Electronic images can be identified by their file formats. It is possible for a computer to recognize a certain file as being an image. However, there is no way for a computer to identify an image as being pornographic or obscene. Again, states would have to use people to monitor all such images, in real time, to determine whether they are acceptable for viewing. With such monitoring comes the risk of the capricious and arbitrary exercise of discretion by those monitoring the materials, as noted in the Little Sisters58 case.
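A sketch of the point: from the first few bytes of a file a program can tell that it is an image in a known format, but nothing in those bytes reveals what the image depicts.

    def looks_like_an_image(path):
        """Recognize common image formats by their leading 'magic' bytes."""
        with open(path, "rb") as f:
            header = f.read(8)
        return (header.startswith(b"\xff\xd8\xff")           # JPEG
                or header.startswith(b"\x89PNG\r\n\x1a\n")   # PNG
                or header.startswith(b"GIF8"))               # GIF
    # The format is identifiable; whether the picture is a landscape or is
    # obscene is not something the bytes themselves can answer.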

As has been noted previously, the Internet was designed so that the failure of any one part of the network does not disable the entire network and, as with a network of highways, there is often more than one pathway from point A to point B. This is true in most cases. However, certain states, such as Singapore, have only one link or line connecting their country's network with the Internet.59 In Singapore, all computer data reaches the local network through Singapore Telecom; it is therefore somewhat easier for that state to institute measures aimed at curbing the availability of offensive material. In a country such as Canada, where there are multiple links to the Internet and a phone call to the US is relatively inexpensive, massive governmental control and monitoring of all telecommunications would be necessary in order to monitor the flow of electronic information across our borders. This would involve the monitoring of communications not now subject to such control, such as telephone communications, in order to catch users attempting to circumvent Canadian restrictions by using foreign ISPs.

The concern for proponents of freedom of expression is how blocking technology is used. Canadians would primarily be concerned with their access to pornography or hate speech being limited. Citizens of less democratic countries, however, have to be more concerned with limitations on access to information, limitations which far more deeply impinge upon the core values of freedom of expression--political expression and personal self-fulfilment. Vietnam and Burma are two countries concerned about the campaigning activities of exiled dissidents on the Internet. Vietnam has addressed this by limiting the state-owned Internet provider to granting subscribers access to E-mail only, and not to the Web.60 Singapore is concerned that the "influx of objectionable materials via the new electronic media, if left unchecked, will undermine our values and traditions."61 These examples of attempts to limit political expression and discussion demonstrate how government regulation of the Internet can be used for the purpose of suppressing political opposition.

Because the Internet is a network, information must pass through other computers en route from point A to point B. Much as driving from Nova Scotia to Manitoba would require passing through a number of states or provinces, information may pass through a number of countries en route to its destination.62 States would either have to explicitly refrain from monitoring messages in transit through their jurisdictions, or the result would be the imposition of the values of the intermediate state on information destined for consumers not within that state. The effect on the individuals concerned is that they become powerless, because they are unable to politically influence the values of the intermediate state. Their freedom of expression is infringed upon without any legal or political recourse.

A consequence of these models, as has already become evident, is that they cannot be used in isolation. Use of the customs/border control model is aimed at preventing offensive material from entering the country, but must be used in conjunction with another means to control the dissemination of information that originates within a country. The source control model has already been examined. Another possible method is the provider control model.

(iii) The Provider Control Model

The provider control model operates on the principle that information which originates both inside and outside the country can best be controlled by placing the onus on those companies and individuals which provide Internet services, i.e. the ISPs, to control the information obtained by their customers. For example, Dalhousie University would be held responsible for all materials which it makes available to its staff and students through the Internet. This approach is similar to holding video store operators responsible for the movies that they rent or sell to customers, as was the case in Butler.63 Theoretically, there is nothing preventing the use of the same Criminal Code provisions against ISPs and, in fact, they have been applied in Canada to operators of Bulletin Board Systems64 or BBSes.65 In Winnipeg in May of 1993, police raided eight BBSes and laid charges of distribution and possession of obscene materials. Similar raids occurred in Toronto in October of 1993.66 One of the most widely publicized crackdowns on BBSes took place in the United States. In the Amateur Action case, the operator of a BBS who made pornographic materials available on a subscription basis was charged with the distribution of obscene materials. What is interesting about this case is that the materials were stored on a computer in California and downloaded by users in Tennessee.67

One of the characteristics common to the Internet and BBSes is the ability of users to both retrieve and store information. BBSes get their files from other users, just as the Internet relies on its users to make information available. The first problem that any provider must consider is where the information is located. People have many misconceptions about the Internet and where the information is actually stored. What does the phrase "on the Internet" mean? People often use this phrase when they are going to read their E-mail. They say, "I will read my E-mail on the Internet."68 In fact, the user is not "on the Internet". The user is actually reading their E-mail on a computer which is connected to the Internet. For example, when someone leaves a message on a telephone answering machine we do not say that the message is "on the telephone", as it is stored on a tape in the answering machine. The computer connected to the Internet is really no different from the answering machine; it merely communicates with a network and provides a means of storage. The E-mail itself is sent to that computer and placed on its hard drive in space set aside for that particular user. The user can then retrieve it at his or her convenience. This is the case with most forms of Internet interaction. USENET news works on a similar principle. The messages, although distributed via the Internet, are in fact stored on the ISP's computer, from which its users retrieve them. The Web, however, works somewhat differently. The information is stored on the remote computer and the user retrieves that information. The user's computer may store it temporarily on its own hard disk, or keep it in memory for a short period of time; however, the source of the information, in its truest sense, is the remote computer. The Web is the closest thing to being "on the Internet", as one is constantly using the Internet, in real time, to access information.69

By placing the onus on providers to control access to information, providers are being held responsible for all data that flows through their systems to their users. The most significant problem is the sheer enormity of the information available. Take, for example, the 7,000+ newsgroups available through Dalhousie University. It is inconceivable for Dalhousie to monitor each and every message on each and every newsgroup for content. This has not stopped some providers, including a number of universities, from at least making attempts to control the content of USENET. As noted above, the actual messages are transmitted through the Internet to the provider, in this case Dalhousie University, where they are stored locally on the hard disk space of the university's news server computer.70 Therefore, providers can easily control which newsgroups they carry.

The German government convinced CompuServe, one of the larger ISPs with an international presence, to take this very route in late December of 1995.71 German officials contacted CompuServe, concerned that certain sexual materials were being distributed by CompuServe over its networks. In response, CompuServe blocked access by its users to any USENET groups containing the words "sex," "erotic," or "gay" in their names. The impact of this, however, was felt worldwide by CompuServe customers, as CompuServe was unable to distinguish between its German users and its users in the rest of the world. In addition, it affected some relatively benign, and even positive, newsgroups, such as alt.sex.safe, which discusses issues surrounding safe sex, and soc.support.youth.gay-lesbian-bi, a group dedicated to supporting the emotional concerns of gay and lesbian youth.72 CompuServe eventually reversed its decision, preferring to leave it up to individual users to decide what materials they wished to access.73
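The CompuServe episode amounts to a filter on newsgroup names alone. A hypothetical sketch of such a filter shows how support groups are swept up along with the intended targets.

    BANNED_TERMS = ("sex", "erotic", "gay")  # the terms reportedly used in the block

    def group_is_blocked(group_name):
        """Block any newsgroup whose name contains one of the banned terms."""
        name = group_name.lower()
        return any(term in name for term in BANNED_TERMS)

    for group in ("alt.sex.safe", "soc.support.youth.gay-lesbian-bi", "rec.arts.books"):
        print(group, "-> blocked" if group_is_blocked(group) else "-> carried")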

A number of Canadian universities decided to take the same approach after complaints from students, the media, and interest groups, with varying degrees of success. In Canada, universities responded in three ways. Some decided to prevent access to certain USENET newsgroups, the alt.sex groups in particular, while others refused to cut the newsfeed and resisted preventing access. Many of those who initially cut access subsequently restored it following study and the drafting of policies to deal with these issues.74 The University of British Columbia took this third approach, coming up with an acceptable use policy. The university concluded that it "should not ban the electronic communication between willing participants of messages and images which others might find offensive, since no such ban applies to other forms of communication."75 The University of Manitoba also banned all alt. newsgroups in May 1992 after complaints were received.76 One of the incidental effects was that Brandon University, which received its newsfeed through the University of Manitoba, was also unable to receive those newsgroups as a consequence of the ban.77 It has only been in the past year or two that alt. newsgroups have again become available through Brandon University on a selective basis.

What these examples demonstrate is an attempt by providers to take responsibility for the content of the material available on their systems. However, these attempts have been perceived more as attacks on freedom of expression. This is because the bans are not very effective and are arbitrary in application, with little regard to the actual content of the materials banned. The University of Manitoba ban on all alt. groups is a prime example. What the bans tend to ignore is the simple fact that the majority of the images and materials available on the newsgroups, although offensive to some individuals, are not offensive in the legal sense. Many of the images are electronically scanned from magazines available at the local newsstand.78 Therefore, much of the material being censored is otherwise legal in nature. This is not to deny that illegal material is available on the Internet. The problem again becomes one of assessing whether or not an image is, or is not, legally obscene.79 Computers are incapable of making this assessment and it must be left to human beings. Blanket prohibitions against carrying certain newsgroups, it can be argued, do more to inhibit otherwise legal expression than they do to prevent the distribution of illegal expression.

From a technical standpoint, such bans are also somewhat superficial. All providers which provide access to USENET are themselves recipients of a "newsfeed" from elsewhere. The determination to ban a particular set of groups is made by that particular provider or by the provider of its newsfeed. Therefore, users can simply circumvent the ban by going further down the line to find another source of USENET news. In other words, rather than relying on Dalhousie University's news server, a Dalhousie user need only look to another server connected somewhere on the Internet which is willing to provide the USENET service. In fact, there are servers on the Internet designed for the purpose of circumventing such bans, which make their newsfeeds available for a small fee.80
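Pointing newsreading software at a different server is all the circumvention requires. A minimal sketch, using Python's nntplib module (part of the standard library in older releases) and a hypothetical server name, is as follows.

    import nntplib

    # Connect to a hypothetical alternative news server willing to supply a feed.
    server = nntplib.NNTP("news.example.net")

    # Select a group the local provider refuses to carry and see what is there.
    resp, count, first, last, name = server.group("alt.test")
    print(name, "-", count, "articles available")

    server.quit()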

Where the provider control model faces its biggest challenge is in the control of access to the World Wide Web. Vietnam has dealt with this by instructing the state-owned ISP not to offer World Wide Web access.81 However, in North America and most western countries an ISP would simply not survive in the marketplace if it did not offer Web access. Therefore, the only way in which an ISP could control access to offensive information over the Web would be to block out certain sites, or to undertake monitoring on a massive scale. The problems of blocking out sites have already been explored in the discussion of the customs/border control model, and monitoring would be no easier for an ISP, which is usually attempting to provide a low cost service and has limited resources at hand; not to mention that the volume of information on the Internet makes such an approach prohibitive.82

Probably the single largest problem with the provider model is the inability of a provider, without monitoring on a massive scale, to know precisely what materials are made available. The provider model places on the provider responsibility for material it had no hand in creating. The quantity of information available on the Internet is vast. It is true that providers could pool their resources, and certain sites would become well known and therefore prime targets for censorship (Ernst Zundel's being but one example). However, few can really comprehend the volume of information that would have to be monitored. Each and every E-mail message would have to be scrutinized, as would each and every file transferred by a user. Providers would find themselves at risk of being held responsible for the distribution of material whose origin they are not responsible for. Why should the provider be held responsible when it lets a pornographic image which originated in the Netherlands slip through its net? What differentiates an ISP from conventional service providers is its inability, inherent in the technology, to know what materials it is distributing and what materials are being stored on its system by its users. A video store owner, for example, knows which videos are stocked on his shelves because he orders them. The ISP, by contrast, can have no control over what materials its service ultimately makes available to its customers, because it is its customers, and the users of the Internet, who determine the nature of the content available on it.

The impact of the provider control model on freedom of expression is interesting. The most significant impact would be that it would place a huge amount of discretion in the hands of the provider. Each provider would ultimately be held responsible for the content of the materials it makes available. A number of things could happen. Some providers would undoubtedly take an overly cautious approach, erring on the side of censorship to avoid the risk of prosecution. Other providers would then capitalize on the cautious approach of their competitors and take a far more liberal approach. Users would then "provider shop," taking into account the degree of censorship practised by each provider. Those providers who take a more pro-censorship approach are also likely to be the more expensive providers due to the cost and resources required to carefully scrutinize E-mail, web access, etc., whereas the more liberal providers will likely be able to offer a cheaper service, although at greater risk of penal sanction to themselves. This model may also give rise to a market for clandestine providers--individuals who provide Internet access on the black market with the promise of no censorship.

This is ironic, as what would be created would be a marketplace in freedom of expression. From a liberal, laissez-faire analysis this would be a reasonable and rational outcome of such a system. However, from the standpoint of freedom of expression, it would lead to inconsistency and the risk of arbitrary and capricious limitations being placed upon expression by providers. As has already been seen in the reaction by some Canadian universities, the effect of censorship has been not only to censor the illegal material on the Internet, but to extend that censorship to materials which some individuals find merely offensive.

(iv) A Brief Summary

At this point it may be helpful to briefly summarize the three models, their problems, and the impact each has upon freedom of expression. The source control model attempts to stop the distribution of offensive information at its source. It operates by limiting what information can be placed on the Internet by the users of the Internet. It would do this by monitoring what materials are made available and by penalizing those individuals or groups which place offensive material on the Web, post it to USENET, or distribute it through electronic mail. This system is, from a practical standpoint, unwieldy in the sense that it would require massive monitoring of the Internet by government. Furthermore, the source control model alone does not address the problem of information made available at sources outside of the country which is accessible domestically. The source control model is essentially based on traditional, territorial concepts of regulation. It looks to control what happens within its territorial jurisdiction and ignores the fact that the Internet is an international network oblivious to political boundaries.

From a freedom of expression standpoint, the source control model would greatly infringe upon one's expression by leaving the content of the ideas expressed by users of the Internet open to state scrutiny. It would permit the state a broad range of control over what the user can, and cannot, say on the Internet. This would give rise to a risk of the arbitrary and capricious exercise of discretion by those charged with monitoring the content.

The second, the customs/border control model, could work in conjunction with the source control model. It is premised on the idea of stopping offensive material at the border, before it enters the country. This, again, would be accomplished through the blocking of particular sites and the monitoring of electronic mail, USENET postings, and also IRC chat conversations. The practical considerations include the impossible task of blocking sites in such a manner as to stay one step ahead of the propagators of the offensive material, as the German experience with Ernst Zundel attests. Again, massive monitoring would be required, which could lead to the arbitrary and capricious exercise of discretion, much like the Little Sisters situation. The state, by deciding which sites were permissible for viewing and which sites were not, would be in a position to influence expression and silence the spreading of ideas that it deemed unacceptable, Singapore and Vietnam being examples of this possibility. Furthermore, because the flow of communication passes through other states en route, there is the risk that the attitudes of one state may be imposed upon the users of another by censoring the data in transit.

Placing control in the hands of the provider is premised on the concept of making the Internet provider responsible for the materials it distributes. As pointed out, absent monitoring on a massive scale, the provider is being held responsible for the content of materials it had no hand in making. Unlike the video store manager who knows the contents of his store shelves, the ISP has no control over what materials are available on the Internet, but would have ultimate control over what materials its customers have access to. The result would be the placing of enormous amounts of discretion in the hands of the ISP, with the effect that disparity would result in the application of that discretion. Some ISPs would be more diligent and cautious, censoring more materials, whereas other ISPs would take a more liberal approach. Users would begin to judge their ISPs according to the degree to which they censor materials. There may also be the emergence of a black market in Internet services, fuelled by the desire of some users for censorship-free Internet access.

What is significant about all three models is their attempt to impose a gatekeeper on an open access model of communication. What the above criticisms demonstrate are the practical problems which are encountered in imposing an artificial element, the gatekeeper, into a system where the absence of a gatekeeper is a fundamental characteristic of its design. The imposition of a gatekeeper in an open access model of communication requires not one single gatekeeper, but multiple gatekeepers, one for each user, to monitor and censor the content of each user's expression. This leads us to what, I will argue, are the only reasonable means of controlling content on the Internet. My view looks to the open access structure and recognizes that the best gatekeepers are the individual users themselves, who decide, as individuals or as a collective, which information should be available on the Internet. Only by accommodating the fundamental characteristics of the open access model can we utilize it to its full advantage.

(v) The Self Regulatory Model

The self regulatory model has actually demonstrated its effectiveness on the Internet. It is based on mutual respect between users and on the principle that, as a community in its own right, it is the users of the Internet who should define the rules.

One of the most interesting phenomena on the Internet is the evolution of what has become known as "netiquette". Netiquette is based on the principle that just as there are rules of behaviour for physical interaction, there should be rules of behaviour for virtual or cyber interaction. These rules have evolved over time, and range from basic rules such as do not use ALL CAPS IN AN E-MAIL MESSAGE, as it is hard to read and is interpreted as shouting,83 to do not post pornographic pictures to the K12 newsgroups (the K12 newsgroups are newsgroups aimed at school age children).84 Along with these rules have come sanctions for breaching them. These include flaming--criticism of comments made by another user, both through E-mail and postings to USENET. Some people see flaming as more of an art and as a creative yet constructive means of criticizing comments made by another user. Unfortunately, more often than not, flaming has become synonymous with vicious written assaults on the character of other users.85 Another offence seen on the Internet is that of spamming--the posting of the same message (usually an advertisement) to multiple newsgroups, otherwise known as cross-posting.86 The most well known example of spamming took place when two lawyers from Phoenix posted an advertisement for their legal services to thousands of USENET newsgroups. Not only did they violate the rule against multiple postings and against posting to groups where the topic of the message was irrelevant, they violated the rule (which is quickly crumbling) against commercial use of the Internet. The response was an example of vigilantism at its worst. Thousands of users E-mailed the two lawyers, filling their mailbox with often lengthy messages filled with gibberish. The result was an overloading of their system and a decision by their ISP to terminate their access.87 This response is now typical and demonstrates that even on the Internet, there are means of punishing those who offend against the rules.

Realistically, the greatest punishment that users can mete out is ostracization and complaints to the ISP's system administrator. When complaints are numerous enough, the offending user may have his or her access terminated. Nothing prevents that user from seeking access elsewhere, however.

On the surface, this seems very benign. The rules are established through evolution, much like the common law is developed. Punishment, however, is not administered by a judicial structure, and this does pose problems for this model, especially with respect to freedom of expression issues. The Internet, as a method of communication, is by definition all about expression, and many of the rules of the Internet reflect that fact. They govern how users are to express themselves and where they are to express themselves. By and large, the system tends to operate quite well. Provided that individuals stay within the rules, all ideas are fair game on the Internet. There are no rules regulating what can be said, provided it is said in the appropriate forum. For example, discussion of holocaust denial is quite acceptable in alt.revisionism but would be completely unacceptable in rec.photo.techniques. This is because there is an appropriate place for all subjects.

The bonus for freedom of expression is that this model encourages discussion and debate, one of the cornerstones of Mill's approach to freedom of expression.88 USENET news, for example, provides a home for both the Zundels and the Martin Luther Kings of the world, often in the same newsgroup. A prime example is that of Ken McVay, a resident of British Columbia who spends his time on the Internet confronting and debunking racist and anti-Semitic posters and postings. Through the use of well researched and clearly expressed arguments, McVay takes on the hatemongers and exposes the falsity of their arguments.89 As Gareth Sansom states in his background paper, "Illegal and Offensive Content on the Information Highway":

[There is] a crucial difference between hate-promoting pamphlets or telephone answering machines with hate message and USENET newsgroups. If a white supremacist group leaves pamphlets on car windshields or on benches in a public place, an unsuspecting individual who reads the pamphlet is presented with a one-sided diatribe. In USENET groups such as alt.revisionism or alt.skinhead, every time an anti-Semitic or racist message is posted, people like McVay. . . post rational and well-researched counter-arguments. The presentation of multiple viewpoints ensures that a discussion group can never degenerate into a hotbed of hate propaganda.90

The result is a healthier environment where no statements go unanswered.

The same is not entirely true when it comes to the Web. Any organization can put up a web page which presents a one-sided view of an issue, and there is no obligation upon the creator of the page to present opposing points of view. However, all users of the Internet possess this ability, so both Zundel and McVay are equally able to make available on the Internet a web page dedicated to their own views.

There is a dark side to the self-regulatory model. Since the means of punishment lie in the hands of every user, there is nothing preventing individual or group acts of "Internet Terrorism" aimed at particular individuals or groups. Just as the two Phoenix lawyers were chased off the Internet through indiscriminate E-mailing, so too can people like McVay be attacked by those who oppose them. This is an unfortunate, but important, by-product of the self-regulatory model, as it too can be used to stifle rather than promote the values of freedom of expression.

New technology has only served to increase the risk of Internet terrorism. This technology takes the form of "bots," which are essentially software incarnations of mechanical robots, designed to do automatically what humans would otherwise have to do for themselves.91 Bots are essentially programs which perform certain functions automatically in response to changes in their environment. Examples of bots are listservs, which redistribute to subscribers the E-mail messages sent to them. Bots have been used on IRC channels to emulate human beings; they are programmed to respond to certain words or phrases addressed to them with pre-programmed responses. A special type of bot, a "know-bot," is used to filter USENET postings according to subject or author so that a user need not wade through all messages and can concentrate on his or her areas of interest. Know-bots can also be used by a user to self-censor messages, leaving for the user only those messages not likely to be offensive.92
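The following is a minimal sketch of the filtering idea just described, composed for this essay and not taken from any actual bot or product; the authors, subjects, and word lists are invented for illustration. It keeps only postings from authors, or on subjects, the user cares about, and drops postings whose subjects contain words the user would rather not see.

INTERESTING_AUTHORS = {"kmcvay@example.org"}
INTERESTING_SUBJECT_WORDS = {"photography", "lenses"}
BLOCKED_SUBJECT_WORDS = {"xxx", "make.money.fast"}

articles = [
    {"from": "kmcvay@example.org", "subject": "Re: Holocaust revisionism debunked"},
    {"from": "spammer@example.com", "subject": "MAKE.MONEY.FAST guaranteed!!!"},
    {"from": "hobbyist@example.net", "subject": "Choosing lenses for portrait photography"},
]

def wanted(article):
    subject = article["subject"].lower()
    if any(word in subject for word in BLOCKED_SUBJECT_WORDS):
        return False  # self-censoring: drop postings with objectionable subjects
    return (article["from"] in INTERESTING_AUTHORS
            or any(word in subject for word in INTERESTING_SUBJECT_WORDS))

for article in filter(wanted, articles):
    print(article["from"], "-", article["subject"])

Run against the three sample postings, the sketch keeps the first and third and silently drops the advertisement.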

However, this censoring ability of bots can, and has, been used by users to limit the expression of other users.93 Bots can be programmed to send E-mail to users who express views which the programmer finds offensive. On the one hand, they could be used by individuals like McVay to target hate mongers. On the other hand, they could be used by racists and bigots themselves to attack and harass users wishing to discuss issues surrounding homosexuality, for example. In addition to sending E-mail, bots can be programmed to cancel USENET postings which the bot's creator finds offensive, essentially denying the poster a forum.94 This ability is very powerful and could be used by those favouring limitations on expression to achieve their ends. In the summer of 1995, for example, the bot "CancelBunny" began cancelling postings in the alt.religion.scientology newsgroup which contained materials that the Church of Scientology deemed to be copyrighted. Another user, recognizing that the bot was being used in this manner and acting in the true spirit of the self-regulatory model, created the Lazarus bot, which had the ability to detect when a message had been cancelled and notify the writer automatically so the message could be re-posted.95 Since bots are a relatively new phenomenon, it will take some time for technology to catch up with them and find a way to neutralize them. In the meantime, they pose a risk to freedom of expression and demonstrate how the self-regulatory model can be, in the short term anyway, ineffective in dealing with what amounts to Internet terrorism and may even be conducive to this type of behaviour.
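The idea behind a Lazarus-style bot can be sketched in a few lines; what follows is an illustration prepared for this essay only, and assumes nothing about how the real program worked. It compares the message identifiers a user has posted against those still present on the news server and flags any that have disappeared, so that the author can be told to re-post.

# Message-IDs and subjects below are invented for illustration.
posted_by_user = {
    "<post1@example.org>": "FAQ on alt.religion.scientology, part 1",
    "<post2@example.org>": "Reply re: disputed copyright claim",
    "<post3@example.org>": "Weekly meeting announcement",
}

still_on_server = {"<post1@example.org>", "<post3@example.org>"}

# Anything the user posted that the server no longer carries has presumably
# been cancelled and should be reported so it can be re-posted.
cancelled = set(posted_by_user) - still_on_server
for message_id in sorted(cancelled):
    print(f"{message_id} ({posted_by_user[message_id]}) appears to have been "
          f"cancelled; notify the author so it can be re-posted.")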

(vi) The End-User Model

The lowest common denominator in an open access model of communication is the individual user. The end-user model focuses on the user as being best placed for controlling the information he or she accesses. In other words, the user determines what materials he or she will view or consume. This view is considered most compatible with freedom of expression as there is no gatekeeper to decide which ideas will be distributed and which will be suppressed.

Regulation of the Internet, as it currently stands, is essentially a reflection of the end-user model, and this has given rise to much criticism. Critics argue that the end-user model does little to ensure that obscene and offensive information is not viewed by those members of society who are vulnerable to it. Given the current state of technology, this concern is not unfounded. The end-user model, in its purest form, relies completely on the individual user. It is founded on the assumption that all users are equally capable of determining, for themselves, which information they shall consume. Unfortunately, this assumption does not reflect reality.

The greatest concerns have been expressed with respect to what materials children may be exposed to on the Internet. On most Internet services, USENET newsgroups containing stories describing sexual fantasies are easily and readily available to anyone with access, including children. Images which are also available on USENET are not in a format which is immediately viewable by the user.96 The Web poses one of the biggest concerns for parents, as images are readily made viewable by web surfing software. Therefore, there is a legitimate risk that a child surfing the Web may stumble across pornographic images. In reality, though, the chances of this happening accidentally are small. One of the benefits of the capitalist marketplace is that pornographers want their cut. Increasingly, individuals who make pornographic materials available on the Internet are doing so for paying customers only. They are requiring that users provide a credit card number and, at a minimum, register with them, providing a name, address, and often some proof of age (such as mailing in a photocopy of a driver's license).

But these methods are far from foolproof. Inquiring young minds can quickly figure out how to decode the images on USENET, and some youngsters even have access to credit card numbers or phony identification. As of yet, not all individuals making pornographic materials available are in it for the money and, in some cases, the security measures on some pornographic sites consist of nothing more than the requirement that the user click on the "Yes" button when asked the question "Are you 18 years of age or older?"

There is a more sinister problem on the Internet--Internet Relay Chat. IRC has been used by some paedophiles as a meeting ground. There have been cases of paedophiles meeting children through the Internet and then arranging to meet them in person with the purpose of sexually abusing them.97 The problem then becomes: how do we protect children from other users of the Internet? The end-user model, unlike other more interventionist gatekeeper models, cannot, by itself, prevent the communications that lead to this situation.

Children are not the only victims. Hate speech still exists on the Internet and can harm children and adults alike. However, as with the self-regulatory model, the end-user model at least fosters an environment of open discussion on the Internet where the promoters of hate speech can be confronted and rebutted. It levels the playing field and ensures equal access to the resources the Internet provides.

This is an aspect of the Internet that subscribers to the end-user model jealously guard. As a consequence, they have been very resistant to any attempts at government regulation of cyberspace. The recent enactment by the United States Congress of the Communications Decency Act of 1996, incorporated in the omnibus bill, the Telecommunications Act of 1996,98 has been strenuously opposed by many Internet users and civil rights groups in the United States. They are particularly concerned with s. 502, which makes it a crime for anyone (including an Internet user) to knowingly display or distribute to anyone under the age of 18 indecent or obscene material. Interest groups such as the ACLU and the Citizens Internet Empowerment Coalition (representing the ISPs America Online and CompuServe as well as HotWired Magazine and Apple Computer, among others)99 have joined in a lawsuit aimed at challenging the constitutionality of this legislation.100 Supporters of the suit are concerned that:

. . . the Act is unconstitutional on its face and as applied because it criminalizes expression that is protected by the First Amendment; it is also impermissibly overbroad and vague; and it is not the least restrictive means of accomplishing any compelling governmental purpose.101

They are also concerned that the Act will inhibit development of the Internet as a forum for the free exchange of ideas and information.102

In addition to lawsuits, Internet users have banded together to create the Blue Ribbon Campaign, with the common goal of promoting freedom of expression on the Internet. Those users who believe that the Internet should be free of constraints on freedom of expression are asked to place an image of a blue ribbon on their web site and to wear a blue ribbon as a sign of support.103 The supporters are adherents to the end-user model in the sense that they believe it is the responsibility of individuals and parents to assist themselves and their children in accessing the vast resources the Internet offers, free from government censorship. It is difficult, however, to label supporters of the Blue Ribbon Campaign as libertarians, as strict adherence to the end-user model may suggest. Many supporters (such as those who support the suit against the Telecommunications Act)104 acknowledge the responsibility of parents towards their children in educating them about the Internet. In that respect they may be better defined as adherents to the modified end-user model.

(vii) The Modified End-User Model

The modified end-user model attempts to address the risk of victimization that the end-user model does not protect against. It recognizes the need for some means of control so that those who are unable to protect themselves are protected. At the same time, it attempts to preserve the freedoms that adults presently enjoy under the end-user model by placing responsibility for content on the end-user.

First and foremost, the modified end-user model recognizes the need for parental involvement and parental responsibility in monitoring what their children access on the Internet. Parents, therefore, have a responsibility to educate their children about the Internet, about the material available on it, and about who uses it. Just as parents must "street-proof" their children, they must also "Internet-proof" their children.105 The modified end-user model is premised on the notion that parents must guide and educate their children about pornography, violence, racism, and prejudice so they are capable of dealing with these issues both as children and as adults.106 The Internet, after all, is a reflection of the society in which its users exist. As more and more people become users of the Internet, it will increasingly reflect the diversity of those users. Freedom of expression is important to promoting that diversity. However, with that diversity comes the price of being a member of a society which contains not only positive ideas but also negative ones. Parents must not only prepare their children to be knowledgeable, active, productive citizens, but knowledgeable, active, and productive "netizens."

The modified end-user model recognizes, however, that mere parental participation in their child's Internet experience is not enough. Therefore, this model looks to ways in which the technology can be used by parents in the education and protection of their children. Specifically, this model advocates the use of monitoring software such as Net Nanny107 and SurfWatch,108 which screens the content of Internet materials flowing into, and out of, the home or school computer. This software is designed to block sites containing certain words such as "sex" and to block specific sites known to contain pornographic or violent material. However, this software is subject to the same limitations as any gatekeeping regime, i.e. it is context neutral and is incapable of keeping up with the creation of new sites without regular updating. The distinction is that, combined with parental education and technological control, children learn to develop the skills needed to decide for themselves what information is objectionable, as opposed to the state making these decisions for them.

These programs also have the ability to log what sites the user has visited and to block newsgroups. Some even monitor E-mail and keep track of with whom the user is communicating. This is an especially useful feature where parents are concerned that their children are being stalked through E-mail by a paedophile. The other advantage of these programs is that when an adult wishes to access the Internet, he or she can do so by disabling the program. As a consequence, freedom of expression for adults is preserved.
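A simplified sketch of the kind of screening these programs perform follows; it is an illustration composed for this essay, not the actual Net Nanny, SurfWatch, or Cyber Patrol code, and the site list and keywords are invented. A request is refused when the address matches a known site or the page text contains a blocked word, every decision is written to a log a parent can review, and an adult can switch the screening off entirely.

import datetime

BLOCKED_SITES = {"www.badsite.example"}  # hypothetical blocklist; needs regular updating
BLOCKED_WORDS = {"sex", "porn"}          # crude keyword screen; context neutral
LOG_FILE = "access.log"

def allow(url, page_text, screening_enabled=True):
    """Return True if the page may be shown; log the decision either way."""
    if not screening_enabled:            # an adult has disabled the program
        decision = True
    elif url in BLOCKED_SITES:
        decision = False
    elif any(word in page_text.lower() for word in BLOCKED_WORDS):
        decision = False
    else:
        decision = True
    with open(LOG_FILE, "a") as log:
        log.write(f"{datetime.datetime.now().isoformat()} {url} "
                  f"{'allowed' if decision else 'blocked'}\n")
    return decision

# A harmless page about the county of Essex is blocked because "Essex" happens
# to contain the letters "sex", which is precisely the context-neutral
# over-blocking described above, while the photography page is allowed.
print(allow("www.essex-history.example", "The county of Essex in England"))   # False
print(allow("www.photography.example", "Choosing lenses for portraits"))      # True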

This is the approach that CompuServe eventually took in response to German concerns about the newsgroups it offered.109 CompuServe has made the screening software Cyber Patrol available to its users, free of charge. Cyber Patrol, like Net Nanny, permits users to selectively block certain sites and screen incoming material.110 In doing so, CompuServe is demonstrating how the modified end-user model is gaining acceptance and is, for most users, preferable to more drastic methods such as the provider control model.

Ideally, the modified end-user model would also incorporate a voluntary, or even compulsory, rating regime for web sites, much like the rating of movies according to the degree of sex, violence, and profanity they contain. Through the use of a rating system, Internet software, or software such as Net Nanny and SurfWatch, would be able to automatically inform a user that a particular site contains material suitable only for a particular audience. The software could be designed to automatically block objectionable sites. The problem, however, is one of compliance. A compulsory system would require some sort of additional gatekeeper function and someone in a position of determining the rating each site was to receive. With that comes a risk of arbitrariness. However, the difference is that, at worst, a site would get a restricted or X rating, as opposed to being banned altogether. There would be significant enforcement problems as, again, the number of such web sites is enormous and the resources required would probably be prohibitive. There would also be the problem of getting international adherence to the same set of standards.

Most Internet users would likely prefer to see a voluntary system. Software could be designed to immediately block sites which are left unrated by their maintainers. However, there would be the problem of accurate and consistent rating. Everyone's concept of what is offensive is different. Ideally, all web sites containing pornography would be given the most restrictive rating. It is possible many pornographers would even subscribe to such a voluntary code, as their target audience is adults, who have the money. However, it is unlikely that the spreaders of racist and anti-Semitic materials would be so willing to rate their sites in a manner which would restrict access. As a consequence, other means of preventing access would have to be utilized.
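How rating-aware software might behave under such a scheme can be sketched briefly; the following is an illustration composed for this essay rather than a description of any real product or rating standard, and the ratings, categories, and sites are invented. Each site publishes a self-assigned rating, the software compares it against the threshold a parent has chosen, and, as suggested above for a voluntary system, unrated sites are blocked by default.

SITE_RATINGS = {
    "www.familyfun.example": "general",
    "www.adultsonly.example": "restricted",
    # an unrated site simply has no entry
}

RATING_ORDER = ["general", "parental-guidance", "restricted"]

def permitted(url, maximum_allowed="parental-guidance"):
    rating = SITE_RATINGS.get(url)
    if rating is None:
        return False  # unrated sites are blocked by default
    return RATING_ORDER.index(rating) <= RATING_ORDER.index(maximum_allowed)

for site in ("www.familyfun.example", "www.adultsonly.example", "www.unrated.example"):
    print(site, "->", "allowed" if permitted(site) else "blocked")

With the threshold set as above, the general-audience site is allowed while the restricted and unrated sites are blocked; a parent could loosen or tighten the threshold as the child matures.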

The clear advantage to the modified end-user model is that it minimally infringes freedom of expression by still leaving the ultimate decision as to what information is consumed by the user, with the user. It does not suffer from the same concerns about monitoring vast quantities of information in a consistent, not overly-inclusive, and fair manner. Rather than introducing a gatekeeper for each user, each user is their own gatekeeper. The user is permitted to make informed choices and to utilize technology to decide for herself, or for her children, what materials will be accessed. At the same time, it preserves for all groups and individuals the ability to utilize the Internet as an expressive outlet without regard to the content of their message. It preserves the ability of the Internet to act as a forum of debate for all users, in the spirit of John Stuart Mill. The position of Feminists for Freedom of Expression is that:

The best protection for women's ideas and voices is complete constitutional protection of free speech. Historically, censorship in the name of 'decency' has hurt women by restricting access to information about reproduction and sexuality. It has never reduced sexism and violence. Previous centuries have seen much more censorship than we have today and yet much more discrimination against women. The best counter to speech some women may find offensive is not restriction, but adding more women's voices to the mix.111

In support of this position, the FFE notes that in 1994, one ISP closed a women's discussion group because of ideas being expressed in that group.112 And, as Sansom notes:

If an entire newsgroup were to be censored, it would stifle the marshalling of opinions, evidence and arguments which counter inflammatory material. Messages from people such as . . . McVay . . . may sway some individuals from racist beliefs. More importantly, their public availability in newsgroups such as alt.skinhead provides others with the tools to fight prejudice. The very appearance of such postings demonstrates that we are living in a tolerant, democratic society and thereby repudiates the lies of bigotry.113

How does the modified end-user model mesh with existing criminal regimes aimed at stopping obscenity and the promotion of hate? It would be unrealistic, and untenable, to attempt to elevate the Internet beyond the jurisdictional control of a state's criminal system. The result would be a double standard, with materials illegal when in written or photographic form but legal when placed on the Internet. It is important to keep in mind that before a state can prosecute for a given crime, it must have jurisdiction over a person located within its territory. Internet users are still present within a territory and therefore are still subject to the jurisdiction of the state in which they live. Essentially, the end-user model, by making users responsible not only for determining what it is they view, but for what they distribute, recognizes the jurisdiction of a state to impose penal sanction on users who possess, and distribute, such materials from a point within its jurisdiction. This seems like the source control model, but it is distinguished on the basis that the criminal law is reactive. It operates only after a complaint has been filed and a crime has been committed. The models of regulatory control analyzed thus far are all pro-active in the sense that their goal is to actively prevent the distribution of such material. The criminal law, simultaneously, has to recognize that it cannot hold providers responsible for materials which, although technically available to their users, did not originate with either their users or with that provider. Individual users found in possession of such materials within the jurisdiction would be fair game under this regime. The criminal law also has to recognize that material from foreign sources which is not in the possession of domestic users but is accessible to them is a problem for the state with jurisdiction, i.e. the foreign state where the material originates. Furthermore, attacking users for the distribution of obscene materials to minors, as the American Communications Decency Act does, places at risk ISPs who may not be aware that they are distributing obscene materials. It is preferable to prosecute the end-user found in actual possession of obscene materials, as this is consistent with the concept of making users responsible for their actions.

It should also be noted that the end-user model and the self-regulation model are entirely compatible. By establishing norms for the Internet, the self-regulation model provides a framework of behaviour for the end user which assists the user in taking full responsibility for his or her actions in cyberspace.

7. Conclusion

The source control, provider control, and customs/border control models of Internet control only serve to put up roadblocks on the information highway. They impose value judgements on the content of expression and divide and isolate people on the basis of how they think. These models increase the risk of the arbitrary and capricious exercise of discretion. They apply standards which are, in and of themselves, vague. All three models are subject to the common criticism that suppression of expression and ideas, in the long run, does little to strike offensive ideas from the mind of humanity. They impose gatekeepers, a characteristic of the one way channel model of communication, on a model--the open access model--where gatekeepers are unnecessary and only serve to hamper the effective operation of the communication network. Furthermore, they are unwieldy and would require massive resources to administer effectively.

Only the modified end-user model capitalizes upon the open access nature of the system and takes advantage of its ability to create a tolerant, diverse, international community in cyberspace. Only the modified end-user model marries the concerns of freedom of expression with recognition of the need to protect certain members of society from the harm that can exist in cyberspace. It does this by encouraging individual responsibility for information gatekeeping, and by creating environments where dialogue and the uninhibited expression of ideas are the norm.

The key to attaining the joint goals of controlling information on the Internet and preserving freedom of expression is equipping individuals with the tools and knowledge necessary to make informed decisions. Censorship, as John Stuart Mill stated, does not kill heretical ideas. It merely suppresses them.114 They live on in the minds of those individuals who believe in them and they are passed on among their fellow believers. The Internet is the World's Agora,115 a place where all peoples of the world can meet and interact on an equal level. It is a forum for dialogue where all ideas can be questioned and either strengthened or weakened based on their ability to meet the criticisms aimed at them. The Internet is the perfect medium for the individual to fulfil the three core values of freedom of expression, the search for truth, participation in democracy, and individual self-fulfilment, as it places into the hands of every individual the power to communicate with all other users quickly, effectively, and inexpensively. It has created virtual communities where individuals who have never met face to face have developed friendships. The Internet is blind to issues of race, colour, creed, gender, social status, and religion. All users, therefore, can interact with each other as true equals.


ENDNOTES

1. Infra note 3 at 21.

2. A note about the citation system which will be used in this essay. As a legal essay all attempts have been made to conform to the citation rules found in the Canadian Guide to Uniform Legal Citation, 3rd ed. (Scarborough: Carswell, 1992). However, as many of the sources were found on the Internet it has been necessary to modify the citation to some degree. Where an Internet source is used it will be cited in accordance with the Guide where possible, but the Internet reference will also be included. The Universal Resource Locator (URL) format will be used to indicate all addresses. A World Wide Web address, for example, takes the form "http://comp.location.org/dir/file.html" (minus the quotation marks). All efforts have been made to be as precise as possible in pinpointing the location of information. However, it is not possible to use precise page references when citing online documents as the page numbering may be different depending on the computer used to view the information. The Internet is always fluid and although all efforts have been taken to ensure addresses are current, the location of particular documents may change with time and may also be found elsewhere.

3. J.S. Mill, "On Liberty" in John Gray, ed., On Liberty and Other Essays (Oxford: Oxford University Press, 1991).

4. Ibid at 22.

5. Ibid at 25.

6. Ibid at 34.

7. Ibid at 37.

8. Ibid at 37-8.

9. Ibid at 38.

10. Ibid at 39.

11. Canadian Charter of Rights and Freedoms, Part I of the Constitution Act, 1982 being Schedule B to the Canada Act 1982 (U.K.), 1982, c. 11.

12. See Ford v. Quebec (A.G.), [1988] 2 S.C.R. 712 at 765-7 and Irwin Toy Ltd. v. Quebec (A.G.), [1989] 1 S.C.R. 927 at 976 [Hereinafter Irwin Toy].

13. See especially, R. v. Butler, [1992] 1 S.C.R. 452 [hereinafter Butler], where the court analysed arguments against protection of pornography under s. 2(b) of the Charter on the basis that it caused harm to women and children. The Court considered the argument that pornography caused harm. Harm was interpreted as a predisposition of persons to act in an anti-social manner. This would include, for example, the physical or mental mistreatment of women by men. Anti-social conduct was defined as conduct which is recognized by society as being incompatible with its proper functioning. (Butler at 485) The court, however, was unable to find a concrete causal relationship between pornography and harm to women, but did recognize that there was a substantial body of evidence which made it a reasonable conclusion. (Butler at 479) The court did rule that depictions of children engaged in sexual behaviour were not protected by s. 2(b) of the Charter. (Butler at 485)

14. See especially, R. v. Keegstra, [1990] 3 S.C.R. 967 [hereinafter Keegstra]. In Keegstra the court upheld s. 319(2) of the Criminal Code, R.S.C. 1985, c. C-46 [hereinafter Criminal Code], which prohibits the wilful promotion of hatred, on the basis that, although a prima facie violation of s. 2(b) of the Charter, the infringement can be reasonably and demonstrably justified under s. 1 of the Charter. Chief Justice Dickson, writing for the majority, held that the type of expression prohibited by s. 319(2) was a special category of expression that strayed from the spirit of s. 2(b) by undermining democratic values and the principle of equality of all citizens (Keegstra at 766). Therefore, infringing upon freedom of expression by prohibiting such expression was justifiable under s. 1.

15. Irwin Toy, supra note 12 at 969.

16. Irwin Toy, supra note 12 at 970.

17. Butler, supra note 13 at 489.

18. See Electronic Frontier Canada (EFC), Press Release, "Net Censorship Backfires" (1 February 1996) (http://insight.mcmaster.ca/org/efc/pages/pr/efc-pr.01feb96.html).

19. It is important to qualify this statement. Information is only available when the provider of it decides to make it available publicly. All computer systems contain at least a minimum degree of security protection which permits users to decide which information is to be kept private, and which information is to be made public. For example, a bank could use the same computer for keeping its account records as it uses for its publicity on the Internet; the difference being that the account information would be made accessible only to authorized users, whereas the publicity information would be available to all users. However, once made available publicly, there is, by definition, little control over who accesses it.

20. J. Berman and D.J. Weitzner, "Abundance and User Control: Renewing the Democratic Heart of the First Amendment in the Age of Interactive Media." (1995) 104 Yale. L.J. 1619.

21. Ibid at 1622.

22. Ibid at 1622.

23. Ibid at 1623.

24. Ibid at 1622-3.

25. Ibid at 1623.

26. Ibid at 1624.

27. Ibid at 1624.

28. Ibid at 1624.

29. Ibid at 1624.

30. Some clarification is useful at this point. With basic Internet access usually comes the ability to send and receive electronic mail, read and post to the USENET newsgroups, the ability to telnet or log on to other computers as if a local user, and the ability to surf the Web. The ability to create a Web page, or to place information on the Internet in a manner immediately accessible to all other users on demand, may involve an additional surcharge and is not automatic with all services. This is because valuable hard disk space is required on the Internet Service Provider's (or ISP--the term used to denote an individual or company in the business of providing Internet access) computer for storage of the information and this computer must be connected to the Internet at all times. Dalhousie University students, for example, automatically receive a certain amount of hard disk space which can be used for the purpose of designing a web page. My web page can be found at http://is2.dal.ca/~dswayze.

Absence of the ability to create Web pages, does not relegate an Internet user to the status of consumer only, however. Through E-mail the user can provide information to any other user on the Internet. But, more importantly through the use of USENET newsgroups, a user can post messages to virtual bulletin boards. An analogy is a bulletin board in a public space, such as a shopping mall, used for the purpose of posting for sale messages. The message is posted where anyone with access to the public area can see it. The difference with the Internet is that the messages take the form of conversations with everyone posting their opinion on a bulletin board or "newsgroup" or "group" in Internet speak, publicly available to all users of the Internet. A regulatory structure has been developed with literally thousands of newsgroups available divided up by topic (as of April 28, 1996 Dalhousie University made 7,103 such newsgroups available to its users) so that individual users can be selective as to which boards or "newsgroups" they participate in. Through USENET a user can take on this dual role of consumer and provider although the message is only on the board for a relatively short period of time, unlike the Web where the message is available for as long as the provider permits.

31. E. Krol, The Whole Internet User's Guide and Catalog (Sebastopol, CA: O'Reilly & Associates, 1992) at 11.

32. Ibid at 11.

33. B.P. Kehoe, Zen and the Art of the Internet, Rev. 1.0, (1 February 1992) (http://www.cs.indiana.edu/docproject/zen/zen-1.0_toc.html).

34. Ibid.

35. Electronic Frontier Foundation, EFF's (Extended) Guide to the Internet..., v. 2.3 (September 1994) (http://www.eff.org/papers/bdgtti/eegtti.html) at (http://www.eff.org/papers/eegtti/eeg_45.html).

36. Ibid.

37. Ibid.

38. Ibid at (http://www.eff.org/papers/eegtti/eeg_44.html#SEC45)

39. Supra note 35.

40. Infra note 49 at A9. China has taken this route, requiring all Internet service providers (ISPs) and users to register with the government. It is attempting to find ways of cordoning off the Chinese network in order to control the flow of information between China and the outside world.

41. Supra note 13.

42. See M. Horton & M. Moraes, "Rules for Posting to Usenet" (http://www.lib.ox.ac.uk/internet/news/faq/archive/usenet.posting-rules.part1.html) and periodically posted to the newsgroups, news.announce.newusers and news.answers.

43. Industry Canada, Illegal and Offensive Content on the Information Highway: A Background Paper by Gareth Sansom (19 June 1995) (http://insight.mcmaster.ca/org/efc/pages/doc/offensive.html).

44. Internet Relay Chat is analogous to virtual meeting rooms. Users can set aside a common portion of cyberspace for the purpose of having written conversations. Each user is identified by a name (sometimes pseudonyms are permitted) and they engage in typed conversations with each other. The comments of each user are displayed on the computer monitors of all participating users. Some chat "rooms" are private where a small number of users can engage in private conversation, excluding all others, whereas other rooms are public and open to all users. Through IRC users worldwide can converse with each other in real time.

45. For example, the Pegasus Mail E-mail program permits the creation of large mailing lists. Users create a list of individuals who they wish to E-mail and store that list as a file. They can then draft their message and instruct the program to mail the message to every address on the list. There is no limit to the length of the list. "Listservs" are another means of mass mailing. In a listserv a central E-mail distribution point is created. Subscribers to the list E-mail their message to the central distribution point which then automatically redistributes the message to all other subscribers to the list.

46. Infra note 47.

47. Little Sisters Book and Art Emporium v. Canada (Minister of Justice) (1996), 131 D.L.R. (4th) 486 at 556 [hereinafter Little Sisters].

48. For example, the address "docker.com" does not reveal the fact that the particular site is located in Brandon, Manitoba. This is not always the case, as the site "is.dal.ca" is clearly Canadian because it is part of the domain "ca." The Internet is divided into domains, which act something like telephone area codes, helping to divide up the Internet so that common names can be duplicated, while still preserving their uniqueness, much like the same seven digit phone number may be used in different area codes. Other domains such as "org", "net" and "com" do not reveal their location with such ease, as they are used internationally. Each country does have a domain name. Canada's domain name is "ca". However, not all Canadian Internet sites use that domain name. Simultaneously, very few sites in the United States use its domain name, "us." There are Internet directories and nameservers which permit one to find out the geographic location of any given site. The Internet, however, makes no distinction on the basis of geography and treats all sites equally. (See supra note 31 at 28 and 350-2.)

49. The Economist, "Urge to be 'modern' conflicts with fear of Net's unrestricted flow of ideas, passion" reprinted in The [Toronto] Globe and Mail (16 March 1996) A1 at A9.

50. See Customs Act, S.C. 1986, c. 1, ss. 58-71. These sections deal with the process of examination, assignment of a tariff, and appeals, whereby customs officials can make a determination as to whether the goods will be permitted to enter Canada and under what tariff. They also provide for an appeals process where the importer disagrees with the decision of Canada Customs.

51. See Customs Tariff, S.C. 1987, c. 41, s. 114. This section provides for a scheme whereby goods are classified according to Schedule VII of that Act, which assigns to goods a particular number. Number 9956(a) is assigned to obscene materials. The obscenity definition for the purposes of Number 9956(a) is that used in s. 163(b) of the Criminal Code. Materials defined as obscene may be barred from entry into Canada. For more discussion of the customs regime see Little Sisters, supra note 47 at 496-500.

52. Criminal Code, R.S.C. 1985, c. C-46, ss. 163 - 163.1, as am. 1993, c. 46, s. 1. These sections deal with prohibitions against the possession, distribution, and production of obscene material and child pornography.

53. Supra note 18.

54. Mirror sites are computers which contain the same information, but at a different location and at a different, unique Internet address.

55. Supra note 49 at A9.

56. Supra note 18. The "Canadianizer" (http://www.io.org/~themaxx/canada/can.html) was a web site created as a joke by a user in Toronto in an attempt to prevent the Americanization of the Internet. The site permits users to type in the address of other sites, which it then Canadianizes with certain Canadianisms such as "eh", "G'Day", and "hoser" and displays the Canadianized version to the user. For example, the phrase "Supreme Court of Canada" becomes "Supreme Court of the Great White North."

57. Supra note 18.

58. Supra note 47.

59. Supra note 49 at A9.

60. Supra note 49 at A9.

61. Supra note 49 at A9. Comments by Mr. George Yeo, Minister of Information of Singapore.

62. This can be easily demonstrated with some simple Internet utilities such as TrumpHop, put out by the makers of the Trumpet Winsock Internet connection program. TrumpHop allows a user to trace the path that information takes from the user's computer to the remote computer. It displays the names of those computers on the network that the message passes through. I attempted this on April 30, 1996 and traced the path between Dalhousie University and the site pikuolis.omnitel.net in Lithuania, where a friend of mine is located. The message passed through computers to the CA*Net backbone, the main Internet line in Canada, down to the MCI backbone in Boston, then along the MCI backbone to New York, over to Sprintnet, before crossing the Atlantic to Lithuania. Usually the most direct path is taken, but it is not unusual for Canadian messages with international destinations to pass through the US en route.

63. Butler, supra note 13. In Butler, a video store owner in Winnipeg, was charged under s. 163 of the Criminal Code for possessing, selling, and exposing to public view, obscene material. Butler was a "provider" of a service, i.e. the sale and rental of pornographic video tapes.

64. See Zachary Margulis, "Canada's Thought Police" Wired 3, no. 3 (March 1995), 92 at 94. Also at http://insight.mcmaster.ca/org/efc/pages/wired-3.03.html.

65. A BBS is often set up by an amateur computer enthusiast in their home or office and consists of a computer connected to a telephone line which can be accessed by other computer users through a modem. Through the BBS, users can usually send E-mail to other users of the BBS, engage in discussion through bulletin boards much like newsgroups, and swap files including programs and, not unusually, pornographic images. BBSes were the precursors to the Internet for the home enthusiast in that they were often set up by amateurs to link together enthusiasts in the immediate area. They were designed to be inexpensive and to act as an electronic meeting place. With time they developed their own network, Fidonet, and some are now connected to the Internet. A few have made a business out of their BBS, charging subscription fees for access to their files.

66. Supra note 64 at 94.

67. See A.W. Branscombe, "Anonymity, Autonomy, and Accountability: Challenges to the First Amendment in Cyberspaces" (1995) 104 Yale L.J. 1639 at 1652-3. Anne Branscombe discusses the Amateur Action case with particular reference to the US "community standards" analysis for judging obscenity. She notes that the case against Amateur Action was initiated in Tennessee where the community standard was more conservative.

68. Personal communication. As an employee at the Dalhousie Law School computer lab and a frequent advisor on computer usage to friends and acquaintances, this author has heard the use of this phrase frequently.

69. See M. Ethan Katsh, "Rights, Camera, Action: Cyberspatial Settings and the First Amendment" (1995) 104 Yale L.J. 1681 at 1695, note 43. Katsh describes events at an American college where students complained about the availability of gay and lesbian material through computers in the undergraduate computer lab. They were upset that such material was available on the university computer network and being funded by the college through university tuition. In fact, the materials were being accessed through the Internet from the site where they were being stored, some 300 miles away. The college itself had nothing to do with the availability or funding of that material. This demonstrates a common misconception as to where information is physically stored.

70. Due to the sheer enormity of information being transmitted on USENET, many providers are dedicating computers solely to the task of handling the messages. Dalhousie did this in February of 1996. The computer, news.dal.ca, retrieves, stores, and distributes newsgroup messages to the university community. It also handles the posting of messages by local users to USENET and distributes them around the world through the Internet. See "The Premium Dial-up Service: Update 96/02/01" at http://spike.ucis.dal.ca/CommServ/PDS.html.

71. Electronic Frontier Canada, "Electronic Frontier Canada Opposes...Compu$erve (sic) Censorship" (3 January 1996) (http://insight.dcss.mcmaster.ca/org/efc/pages/cis/).

72. Ibid.

73. CompuServe, Press Release, "CompuServe® Introduces New Parental Controls; Lifts Global Suspension of Newsgroups" (13 February 1996) (gopher://insight.mcmaster.ca/00/org/efc/events/cis/cis.pr.13feb96).

74. Supra note 43.

75. Supra note 43.

76. Canadian Press wire story (27 May 1992) (QL).

77. Personal experience. At the time, I was just completing my degree at Brandon University. A number of alt. newsgroups were received prior to the ban. Following the ban, all newsgroups in the alt. hierarchy were blocked and unavailable to Brandon University users.

78. Supra note 20.

79. See Butler, supra note 13 at 485. In Butler, Justice Sopinka defined as obscene the undue exploitation of sex:

...the portrayal of sex coupled with violence will almost always constitute the undue exploitation of sex. Explicit sex which is degrading or dehumanizing may be undue if the risk of harm is substantial. Finally, explicit sex that is not violent and neither degrading nor dehumanizing is generally tolerated in our society and will not qualify as the undue exploitation of sex unless it employs children in its production.

80. Net-link is one such company. They promise 15,000+ newsgroups free from censorship for the fee of $10.00 US per month to any Internet user with the proper connection and software. (http://www.net-link.com).

81. Supra note 49 at A9.

82. Supra note 43.

83. A. Rinaldi, "Electronic Communications (E-mail, LISTSERV groups, Mailing lists, and Usenet)" at the site The Net: User Guidelines and Netiquette (http://rs6000.adm.fau.edu/rinaldi/net/elec.html).

84. Supra note 42.

85. See Internet Learning Consultants, "ILC Glossary of Internet Terms" (http://www.matisse.net/files/glossary.html).

86. Ibid.

87. Supra note 67 at 1657-8.

88. Supra note 3 at 25.

89. Supra note 43.

90. Supra note 43.

91. A. Leonard, "Bots Are Hot" Wired 4 no. 4 (May 1996) 114.

92. Supra note 43.

93. Supra note 91 at 116.

94. Supra note 91 at 166.

95. Supra note 91 at 171.

96. Supra note 43. USENET is only capable of dealing with text. It cannot transmit images, at least not in a form where a user merely has to click on a posting and the image will appear. Images are binary files, which means that they consist of streams of ones and zeros. Text is also binary, but the ones and zeros are grouped in such a way that the computer automatically recognizes each grouping for what it is, a particular character. In order for images to be posted and made available on USENET, they are converted to text using a program called uuencode. The text, by itself, is meaningless and anyone who viewed it would see nothing but gibberish. However, when a user wishes to view the image, another program, uudecode, converts it back to a binary file which the user can then view on his or her computer monitor. Although it is not impossible, most news reading programs do not automatically convert the text to a binary file so as to make it immediately viewable. Usually, the user must instruct the newsreader to download and convert the text, then launch another program to actually view the image.
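For the curious reader, the encode/decode cycle described in this note can be reproduced in a few lines; this is an illustration only, assuming a version of Python that still includes the standard library uu module (recent releases have removed it) and that a file named picture.gif exists in the working directory.

import uu

# Convert a binary image into printable text of the sort that can be posted
# to USENET...
uu.encode("picture.gif", "picture.uue")

# ...and convert that text back into a binary image, as a recipient would do
# before viewing it.
uu.decode("picture.uue", "picture_copy.gif")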

97. Supra note 43.

98. Telecommunications Act of 1996, Pub. L. No. 104-104, 110 Stat 56 (1996).

99. See List of Plaintiffs and Defendants at http://www.cdt.org/ciec/plant_def.html.

100. See American Civil Liberties Union, Press Release, "ACLU v. Reno: A Chronology" (20 March 1996) (http://www.aclu.org/news/n032096.html).

101. See paragraph 2 of the ACLU's Complaint at http://www.aclu.org/court/cdacom.html.

102. See Hotwired, "Hotwired Special -- Free Speech First -- Join the Suit" (http://www.hotwired.com/special/lawsuit/join.ceic.html).

103. See Electronic Frontier Foundation, "The Blue Ribbon Campaign for Online Freedom of Speech, Press and Association," (http://www.eff.org/pub/Graphics/Icons/BlueRibbon/README.blueribbon).

104. Supra note 102.

105. Supra note 43.

106. Feminists for Freedom of Expression, Feminism and Free Speech: The Internet by J. Kennedy (1996) (http://www.well.com/user/freedom/internet.html).

107. Supra note 20.

108. Supra note 106.

109. Supra note 73.

110. Supra note 73.

111. Supra note 106.

112. Supra note 106.

113. Supra note 43.

114. Supra note 3 at 37.

115. Supra note 67 at 1670.
