This chapter discusses tools for the end user, here the party responsible for making decisions on behalf of the child or youth in question. Thus, the "end user" may be a parent in the case of a home or family, a maker of school policy (e.g., a school principal or a school district), a maker of policy for a public library (or library system), or some other similar individual or body. The focus on tools for end users is important because such tools are intended to empower end users by providing a wide range of options for the children in their care, more or less regardless of what other Internet stakeholders do or do not do. (The only exception concerns instant help, in which the user is the child seeking help.) provides a preview of this chapter.
Filters are at the center of the debate over protecting children and youth from inappropriate sexually explicit material on the Internet. A good filter allows a specific filtering policy to be implemented with accuracy, has enough flexibility, and can be implemented with a minimum of undesirable side effects. describes the dimensions of choice that GetNetWise identifies for filters.
Today, Internet access is largely unrestricted. That is, a user who does not take some explicit action to limit the content to which he or she is exposed has access to any content that the Internet provides through Web pages, e-mail, chat rooms, and the like. This report uses the term "filter" to refer to a system or service that limits in some way the content to which users may be exposed. The vast majority of filters block access to content on specific Web sites (though these sites may be specified as a class). Other filters also block access on the basis of keywords appearing either in a user's query to a search engine or in the about-to-be-displayed Web site.1 Some filters provide the capability to block more broadly, so that an individual may be denied access to other common Internet services, such as interactive services (e.g., e-mail, chat rooms), Usenet newsgroups, file downloading, peer-to-peer connections, or even e-commerce with credit card usage. Users who wish to use a filter have a number of technical options:
Some services allow multiple "login names" or "screen names." A screen name is an online identity, similar to a CB radio "handle." Each online session uses a single screen name, and families can choose not to give the adult or "administrative" screen name password to youth. An online account may have multiple screen names, and a user with appropriate privileges (usually associated with paying for the master account) can create arbitrary screen names at will for himself or someone else on his account as long as those names are not already in use. With each name can be associated unrestricted access or more limited access to online content (which may include both Internet and proprietary content). In the case of America Online (AOL), a parent can identify the age of the child for whom he or she is setting up a screen name. AOL then puts into place default limitations based on the age of the child, which the parent can then adjust if necessary. Such limitations might include, for example, Web access only to age-appropriate content or to everything except explicitly mature themes, receipt of e-mail only without file attachments or embedded pictures, and access only to chat rooms intended for children (or no chat room access at all).
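To make the idea of per-screen-name restrictions concrete, here is a minimal Python sketch of age-based default profiles that a master-account holder can adjust when creating a screen name. The category names, age brackets, and settings are invented for illustration and are not those of AOL or any actual service.

```python
# Hypothetical sketch of per-screen-name access profiles with age-based defaults.
# Category names and age cutoffs are illustrative only, not those of any real service.
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class AccessProfile:
    web_access: str          # "kids_only", "young_teen", "mature_teen", or "unrestricted"
    email_attachments: bool  # whether file attachments and embedded pictures are allowed
    chat_rooms: str          # "none", "kids_only", or "all"

def default_profile(age: int) -> AccessProfile:
    """Default limitations keyed to the child's age, as supplied by the parent."""
    if age <= 12:
        return AccessProfile("kids_only", email_attachments=False, chat_rooms="kids_only")
    if age <= 15:
        return AccessProfile("young_teen", email_attachments=False, chat_rooms="kids_only")
    if age <= 17:
        return AccessProfile("mature_teen", email_attachments=True, chat_rooms="all")
    return AccessProfile("unrestricted", email_attachments=True, chat_rooms="all")

screen_names: dict[str, AccessProfile] = {}

def create_screen_name(name: str, age: int, **overrides) -> AccessProfile:
    """Create a screen name with age-based defaults, optionally adjusted by the parent."""
    if name in screen_names:
        raise ValueError(f"screen name {name!r} is already in use")
    profile = replace(default_profile(age), **overrides)
    screen_names[name] = profile
    return profile

# Defaults for a 10-year-old; a tightened profile for a 14-year-old (no chat at all).
create_screen_name("sunny_kid", age=10)
create_screen_name("teen_reader", age=14, chat_rooms="none")
```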
Filters can be used to block certain incoming inappropriate information (an application that is the most common use of filters), to block access to certain Internet services (e.g., file downloads), or to block selected outgoing information. All technology-enforced methods for blocking access to inappropriate information require a determination that certain pieces of content are inappropriate.6 Content can be deemed inappropriate on the basis of the methods discussed in Section 2.3.1. Many filter vendors establish lists of "suspect" Web sites (compiled as a list of specific URLs and/or IP addresses) deemed sources of inappropriate content.7 The number of such sites may range from several hundred thousand to 2 million. In addition, many vendors establish lists of keywords (typically hundreds of words) that represent inappropriate content. Far fewer employ image analysis or statistical techniques to analyze text. In addition, techniques for textual and image analysis can be used to identify and block e-mail containing inappropriate content and for blocking outgoing content as well. For example, the technology that identifies inappropriate content by searching for keywords can also prevent those words (or some set of them) from being used in e-mail messages or in chat rooms. (In this case, the adult supervisor can augment the keyword list to include certain phrases that should not appear, such as specific addresses or phone numbers.)

Filters are perhaps the most widely deployed of all technological tools intended to protect children from exposure to inappropriate material. The majority of schools have deployed filters,8 while around 25 percent of libraries filter at least some workstations.9 Through AOL's parental controls, a substantial number of Internet-using children enjoy the benefits and endure the costs of filtering. However, as a percentage of all children using the Internet, the fraction whose Internet access is filtered apart from school usage is small.10 It is noteworthy that filters are increasingly common in corporate and business settings and thus affect the Internet use of adults.11 Many companies, driven primarily by concerns about productivity and time wasted on non-business Internet activities and about the possible creation of hostile work environments and the consequent liability, use filters to prevent inappropriate use of company IT facilities.12
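The list- and keyword-based approach described above can be illustrated with a short sketch. This is a simplified Python example, not any vendor's implementation; the blocklist entries, keywords, categories, and blocked phrases are invented for illustration (real products use lists of hundreds of thousands of sites and hundreds of keywords).

```python
# Minimal sketch of list- and keyword-based filtering.  All entries are illustrative.
from urllib.parse import urlparse

BLOCKED_SITES = {                      # "suspect" sites, by host name or IP address
    "adult-example.test": "sexually explicit",
    "203.0.113.7": "sexually explicit",
}
SUSPECT_KEYWORDS = {"keyword1", "keyword2"}          # stand-ins for a real keyword list

def check_request(url: str) -> tuple[bool, str]:
    """Return (allowed, reason) for an outgoing request, based on the site list."""
    host = urlparse(url).hostname or ""
    if host in BLOCKED_SITES:
        return False, f"site blocked: category '{BLOCKED_SITES[host]}'"
    return True, "site not on list"

def check_content(text: str) -> tuple[bool, str]:
    """Return (allowed, reason) for about-to-be-displayed text, based on keywords."""
    hits = set(text.lower().split()) & SUSPECT_KEYWORDS
    return (False, f"keyword match: {sorted(hits)}") if hits else (True, "no keyword match")

# The same mechanism can screen *outgoing* text (e-mail, chat), for example to stop
# a child from sending a home address or phone number supplied by the supervising adult.
OUTGOING_BLOCKED_PHRASES = {"123 main street", "555-0100"}

def check_outgoing(message: str) -> bool:
    msg = message.lower()
    return not any(phrase in msg for phrase in OUTGOING_BLOCKED_PHRASES)
```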
Denying access to inappropriate material through technological means, filters are intended to protect against both inadvertent and deliberate access. However, as discussed in Section 2.3.1, all filters are subject to overblocking (false positives, in which filters block some appropriate material from the user) and underblocking (false negatives, in which filters pass some inappropriate material to the user). While the issue of underblocking and overblocking should not, in and of itself, rule out filters as a useful tool, the extent of underblocking and overblocking is a significant factor in understanding and deciding about the use of filters.13 There is no agreed-upon methodology for measuring a filter's effectiveness, as might be indicated by an overblocking rate and an underblocking rate (discussed in Section 2.3.1).14 Filter vendors sometimes provide estimates of overblock and underblock rates, but without knowing the methodology underlying these estimates, the cautious user must be concerned that the methodology is selected to minimize these rates. (The discussion in Box 2.8 illustrates some of the problems in estimating these rates. Note further that the lists of blocked Web pages change constantly, with both additions and subtractions made regularly.) Underblocking results from several factors:
Overblocking arises from three factors:
The above three factors are basic to the fundamental imperfection of the filtering process. A fourth factor that can lead to overblocking results from the ways in which some filtering systems are implemented. If a filter blocks sites on the basis of the IP addresses of adult-oriented, sexually explicit sites, and those sites are hosted on a server that makes use of IP-based virtual hosting (described in Chapter 2), other non-adult sites hosted on that server (and sharing those IP addresses) will be blocked.16

Note an important distinction between overblocking and an overly broad scope of blocking (i.e., an overly broad blocking policy). Overblocking is inadvertent and results from the inability of the automated systems for blocking to perfectly track human decision making. The model human decision maker, examining overblocked material, would conclude that the material should in fact have been free to pass. An example would be a search for "beaver dams" that results in pages being blocked because the word "beaver" is present on the page. An overly broad policy is more subjective, and results from a disagreement between the end user and the human decision maker about what information the end user should be allowed to receive. From the perspective of the end user, a certain piece of material is blocked inappropriately. However, upon examination of that blocked material, the human decision maker concludes that the blocking decision was proper. For example, a student may wish to search for information on marijuana. But Web sites containing the word marijuana may be blocked because of a policy decision to block information about drugs.17

The effectiveness of a filter also depends on whether its use is enforced at all sites available to a child. In a specific venue, filters will block some material that some parties deem inappropriate--and there is a reasonable argument to be had over whether the blocking that occurs is worth the cost of overblocking. But it is impossible for a filter deployed in a school to block material sought in a cyber-café or at home, and filtering limited to schools and libraries will not prevent the access of children to inappropriate sexually explicit material if they are determined to search for it and have other venues of access. The most common unfiltered venues are home Internet access or Internet access provided at a friend's home. (Filtering at home is not the norm,18 even though a significant fraction of U.S. youth do have Internet access at home,19 a point well represented by the adolescents to whom the committee spoke during its site visits.) Filters that are not based on real-time content-based identification of inappropriate content can be circumvented by users in a number of ways,20 both direct and indirect:
Note that defeating a filter can be more difficult when the filter is server-based, because the circumventer does not have direct access to the system on which the filter resides. Further, note that because a child-oriented content-limited ISP would most likely be chosen by families interested in filtering for fairly young children (say, 10 and younger), the likelihood that the ISP's restrictions could be circumvented is substantially lower than it would be if users included older youth. In addition, inappropriate material (sexually explicit and otherwise) can flow to a child through routes other than Web sites--peer-to-peer file transfers such as those available through Gnutella, e-mail attachments, and so on. While some filters can be set to block the use of such routes, such blockage is indiscriminate and insensitive to the content carried on these routes.
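Returning to the overblocking that IP-based virtual hosting can cause (noted above), the effect is easy to see in a short sketch. The host names and the shared address below are invented, and the toy resolver stands in for a real DNS lookup.

```python
# Why blocking by IP address overblocks under IP-based virtual hosting (illustrative).
import socket

BLOCKED_IPS = {"198.51.100.20"}   # listed because an adult site is hosted at this address

def is_blocked(hostname: str, resolver=socket.gethostbyname) -> bool:
    """Block any host that resolves to a blocked IP address."""
    try:
        return resolver(hostname) in BLOCKED_IPS
    except OSError:
        return False              # unresolvable hosts are simply passed in this sketch

# A hosting provider may serve many unrelated sites from one shared address:
fake_dns = {
    "adult-site.test": "198.51.100.20",
    "community-garden.test": "198.51.100.20",   # innocuous site on the same server
}.get

print(is_blocked("adult-site.test", resolver=fake_dns))        # True  (intended block)
print(is_blocked("community-garden.test", resolver=fake_dns))  # True  (overblocked)
```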
It is also true that the content provider could provide ways of circumventing filters. For example, misspelled sexual words (e.g., "pormography," "dicck," "0rgy") may be used in a site's metatags to circumvent filters that search for keywords. As a general rule, though, commercial vendors of sexually explicit material argue that they are not economically motivated to expend a lot of effort to get through these filters, because children are unable to pay for such material. Those providing other types of content without commercial intent may be more likely to attempt to circumvent filters. Many of these methods of circumvention do not apply to filters that are based on real-time content-based identification of inappropriate content. However, filters that are based on real-time content-based identification are not nearly as commonly available as filters based on lists of inappropriate sites and keywords. Furthermore, the technology of content-based identification is relatively sophisticated compared to that required for developing lists of sites and keywords, and hence is more difficult to implement properly. The effectiveness of label-based filters depends on the ubiquity of labels for Internet content and the willingness of the user to decide which labels indicate content that should be blocked. Label-based filters, such as those that incorporate PICS-compliant labels, are built into the major Web browsers, Internet Explorer (IE) and Netscape. However, PICS-compliant labels are not in wide use as of this writing (April 2002, but see Section ). Both IE and Netscape provide the user with the option of allowing or blocking unlabeled material, with the consequence that users of these browsers can either have access only to a very small segment of the Web (if unlabeled material is blocked) or enjoy minimal protection from inappropriate material (if unlabeled material is allowed). For this reason, label-based filters today do not work particularly well in reducing exposure to inappropriate material unless one is willing to tolerate a very high rate of overblocking. Whether they will work more effectively in the future depends on the extent to which Internet content will be labeled. While filters are designed to reduce children's access to inappropriate material on the Internet, there are some interesting psychological and social phenomena related to their use. In most of the schools and libraries that the committee visited, teachers, librarians, and administrators told the committee that filters played a very small role in protecting students and library users from inappropriate material, largely because most of these students and library users had unfiltered Internet access somewhere else (usually at home). (Of course, for the significant fraction of students without non-school access, such comments did not apply.22) Nevertheless, the school or library filter served a useful political purpose in forestalling complaints from the community about "public facilities being used for shameful purposes."23 In virtually every school the committee visited, avoiding controversy and/or liability for exposing children to inappropriate sexually explicit material was the primary reason offered for the installation of filters.24 In a public library setting, filters have also been used to prevent the display of material that would be regarded as offensive to other patrons walking by. 
For example, one technique used to shock other patrons is to display an adult-oriented Web site on a public Internet terminal and to "hide" it behind the terminal's screen saver (which places some innocuous image on the screen on top of whatever is on the screen). When an unsuspecting user clears the screen saver image, he or she is suddenly surprised by a sexually explicit image.25 Teachers and librarians can derive substantial benefit from filters. For example, most schools and libraries have acceptable use policies (AUPs, as discussed in Chapter 10) that forbid use of school or library computer resources for certain purposes, such as viewing sexually explicit sites. In the absence of a filter, a teacher or librarian must confront the user and inform him or her that such use violates the AUP. For many, such confrontations can be unpleasant and can provoke anxiety. To the extent that a filter reduces the possibility of a student or library patron viewing such sites, it also reduces the frequency of such confrontations. In addition, given community pressures for teachers and librarians to supervise or monitor the use of Internet resources by students and library users, filters reduce the burden on teachers and librarians to police usage and free time for other, more productive activities. Finally, many teachers and librarians are themselves uncomfortable in viewing certain types of inappropriate material26--and in the committee's informal discussions in its site visits, this was especially true for many sexually explicit images. Even apart from the claimed benefits of preventing exposure to inappropriate material, filters can offer children other benefits. In the school environment, teachers reported that filters helped them to keep students "on task" while doing school-related Internet work by reducing the distractions that might otherwise be available to them (the students); provides some data on the extent to which filters may keep students on task. A number of younger students with whom the Committee spoke during various site visits thought that the parental use of filters (generally AOL's parental controls) was a positive indication of their parents' concern, independently of whether they felt the filters were effective. (According to the Kaiser Family Foundation, about two-thirds of teenagers and young adults support the Children's Internet Protection Act when provided with a description of it. This view does not vary among those who go online a lot or who have been denied access to Web sites because of filtering.27) Because they confine the user only to material explicitly considered appropriate, child-oriented content-limited ISPs provide the greatest degree of protection for children. By design, their approach seeks to minimize underblocking at the expense of overblocking--all questionable exposure is blocked or at least monitored. For example, they evaluate for educational or developmental benefit every Web page that is accessible to a child. Under these circumstances, the likelihood of exposure to inappropriate content is very low, especially with respect to sexual imagery. Interactive dialog in chat rooms and on bulletin boards is monitored, so that the first posting or sending of inappropriate messages can be censured. (Such censure also provides observers with a lesson in the consequences of such behavior.) 
The identities of e-mail and IM senders are not monitored, but because they are restricted for the most part to users of the service, the universe of those who might engage a child in e-mail or IM dialog is much smaller than on the entire Internet. Perhaps of equal or greater benefit, at least from the perspective of some adults, is that the content accessible to kids has definite educational or developmental value, rather than being simply not inappropriate.
Filtering is based on the premise that a party or parties other than the child himself or herself decides what content is inappropriate. In general, the first pass at determining potentially inappropriate content is made by the vendor of the filter or the content-limited ISP. For those who choose to accept without change this determination (as is the case with subscribers to content-limited ISPs or those who do not wish to customize further), this initial determination stands. For example, a school that installs a filter without additional customization accepts the determination of the vendor about what is or is not appropriate. Even if it does undertake customization, it uses the vendor's determination as its point of departure, and detailed editorial control on a site-by-site basis for all sites in the vendor's database is not possible in practice. To accommodate those who wish to customize the characterization of inappropriate material for their own individual or institutional needs, filter vendors usually offer two options (which may be combined or used separately):
The vendor's characterization of inappropriate content is quite significant, as it is at the very least the primary point of departure for a user's customization (described below) even when such customization is possible.28 Filter vendors have many incentives to err on the side of overblocking and few to err on the side of underblocking. Based on its site visits, the committee believes that the reason is that schools and libraries, which are the largest users of filters for purposes of this report, tend to receive many more complaints from parents and the community about sites that are not filtered (i.e., complaints about underblocking) than about sites that are filtered improperly (i.e., complaints about overblocking).29 In the various site visits conducted by the committee, only a few students or parents reported making a formal complaint to the school about a site needed for research or schoolwork that was blocked by a school's filter, even though they (mostly high school students) often reported that information on the blocked sites might have been useful for legitimate academic research purposes.30 (The same was not true with most teachers, who reported that educationally relevant sites were blocked regularly. Still, in a number of cases, they were able to use their supervisory privileges to obtain access to blocked sites.) And, given that schools and libraries install filters largely to forestall complaints, it is clear that filters that do not generate complaints would be highly preferred. As for label-based filters, the labeling party can be either the content creator or any third party. However, it is the adult or adults directly responsible for determining what a child should or should not see who make the actual decision about how content labeled in a certain manner should be handled. Because the vendor's philosophy regarding inappropriate material is the primary determinant of what will and will not be blocked, trust is a fundamental element in the user's selection of a filter. That is, the user places considerable trust in the vendor's judgment about what is and is not appropriate (or in the case of labeling, places trust in the labels determined by various content raters). Thus, a person who wishes his or her religious values to be reflected in the content that is accessible to his or her children might choose a filter or a content-limited ISP that is sold by a firm with similar religious commitments. A person who wishes his or her children's Internet experience to be limited to positive, developmentally appropriate, and educational material may choose a filter or a content-limited ISP that explicitly screens content for such criteria, rather than another vendor that might screen content for inappropriateness. Finally, "viewpoint discrimination" (discussed in Chapter 4) generally cannot be practiced by public institutions, but the law in this area is currently unclear. In particular, it is not clear how a court would decide whether a public institution's use of a particular filter vendor's determinations of inappropriate material constitutes such discrimination--for instance, where the line is drawn between viewpoint discrimination and content discrimination, and what weight should be given to the extent to which the institution relied upon the filter vendor's settings. It is also not clear to what extent public schools, as opposed to public libraries, may engage in certain kinds of viewpoint discrimination.
Server-side filters can be easier to use than client-side filters, if only because they do not require installation on the client. Nevertheless, almost all filters allow some degree of customization to a parent's (or other adult supervisor's) requirements. Filters can (but do not necessarily) allow flexibility in many dimensions:
Many filtering products add a variety of features meant to offer parents, teachers, and others even further control. Some of these include:
In general, flexibility adds to the complexity of the filter--usually in its initial configuration and sometimes for its use. For example, there is debate about whether consumers prefer less nuanced granularity and simpler choices, or prefer to have detailed control over their filtering. Some products have garnered praise for offering a simple age-by-age default set of parameters on what to block that may then be overridden with more levels of nuance. In a number of filter-using sites visited by the committee, the flexibility of an installed product (especially the ability to unblock selected Web sites) was not used by administrators and teachers. In some cases, they were simply unaware of the capability; in others, they lacked either the time or the knowledge to do so. In still other cases, the ability to unblock sites was limited to district-level administrators (rather than the local administrators or teachers). During site visits, a number of teachers told the committee that requests sent to district-level administrators to override sites often met with resistance, and were slowly acted on and often refused. Furthermore, to the extent that flexibility is tailored for different individuals (e.g., for middle school students versus high school students), identification of these individuals is required--and the appropriate policy must be mapped to the access points of those individuals. An additional dimension of functionality is the provision of explanations for blocking. A site that is blocked for no apparent reason has a lower perceived legitimacy than a site that is blocked for a stated reason. For example, many filters tell the user that a site has been blocked because it falls into Category X, where X may be pornography, sex education, weapons, violence, and so on. By contrast, a site that is blocked simply with the message "access denied" does not provide the child with useful feedback, and may increase his or her motivation to go to a nonblocked access device to retrieve the information. Note that filtered search engines provide the lowest level of transparency of all--because a filtered search engine never returns even a link to content that is deemed inappropriate, the user has no way of knowing what has been filtered out or even that anything has been filtered out.
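The difference in transparency can be made concrete with a small sketch of the message a blocking filter might return. The categories and the review instruction are invented for illustration; the point is only that a stated reason gives the user something to act on, where a bare "access denied" does not.

```python
# Sketch of "transparent" blocking: the user is told which category triggered the block.
# Category assignments and the review instruction below are illustrative only.
BLOCK_CATEGORIES = {
    "adult-example.test": "pornography",
    "forum-example.test": "weapons",
}

def block_message(host: str) -> str:
    category = BLOCK_CATEGORIES.get(host)
    if category is None:
        return "Access denied."   # opaque message: low perceived legitimacy
    return (f"Access to {host} was blocked because it is categorized as '{category}'. "
            "If you believe this is a mistake, ask a teacher or librarian to request a review.")

print(block_message("forum-example.test"))
```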
Financial Costs

As with all technology deployments, the financial costs of using filters can be divided into acquisition costs and maintenance costs. Acquisition costs can be quite low, especially for server-based filters, for which installation of the filter at each access point is not necessary. Maintenance costs of server-side filters are usually absorbed into a per-year, per-seat charge. However, payments to the vendor are not the only cost, as some on-site effort must be made to ensure that filters are working properly. On-site technical support is generally necessary. Management of the environment must be taken into account as well--in particular, teachers, librarians, and parents may be faced with managing requests to unblock sites that are blocked. In an institutional environment, there are costs of teaching the responsible adults what the filter can and cannot do, and providing training that familiarizes them with operating in a filtered environment. When filtering impedes their own legitimate searches for information, they must know how to obtain that information despite the presence of filtering. And, for those institutions that rely on centralized (e.g., district-based) administration of the unblocking function, the staff time needed to manage these decisions can be significant (and hence costly).

Use of content-limited ISPs appears to entail fewer financial costs than the use of server-side or client-side filters. Because the major selling point of using a filtered ISP is that the user gets to delegate to another party all of the responsibilities of deciding upon and enforcing a filtering policy, it is likely that users will be comfortable for the most part with the default policy of the filtered ISP. Thus, the costs of filtered ISPs for this class of users will be relatively small. Also, filtered ISPs make the cost of updating the filtering algorithm or database invisible to most users.

One trend pushing toward the lowering of filtering costs is that the basic technology of filtering is increasingly available in certain common hardware products. For example, a variety of hardware routers for home or small office use (generally to support a small local area network at a home or office site) have native site-based filtering capabilities (that is, they have the ability to exclude traffic from specified sites). Some also have the ability to search for objectionable keywords embedded in site URLs. If trends toward hardware-based filtering continue, such filtering may well become ubiquitous. If and when vendors of these hardware products provide these routers with lists of sites to be blocked, with updates as a service to purchasers, the cost may well drop significantly.

Restrictions on Information Flow

As noted in Chapter 5, it is likely that in most communities there would be a rough consensus that some kinds of sexually explicit material on the Internet should be made unavailable to children. But on other kinds of sexually explicit material, such consensus would be unlikely. Moreover, some of such material would undoubtedly constitute protected speech according to established First Amendment interpretations. The discussion in Section points out that overblocking is inherent in any process that makes decisions about what to filter (so is underblocking, but underblocking is not a restriction on information).
Thus, for inappropriate sexually explicit material that might loosely be classified as "for adults only," some material that should not be placed into this category will be--and will therefore be improperly blocked. Filter vendors often provide options for blocking entire categories in addition to the category of sexually explicit material: violence, weapons, pro-choice and anti-abortion material, gay and lesbian lifestyles, and so on. Much of the material in these categories does not fit the legal definition of material that is "obscene with respect to minors," but based on the default settings of many filters, it would be blocked anyway. While this restriction is not a legal problem in the context of home use, it may present a problem in publicly funded institutions, which are constrained by the requirements and current interpretations of the First Amendment.32

In an educational context, the restrictions on information flow associated with filters may lead to substantial problems for teachers and librarians who are trying to develop useful and relevant educational activities, assignments, projects, and so on. Indeed, some teachers reported to the committee during site visits that sometimes their lesson preparations were hampered by the fact that their Internet access was filtered at school. In other cases, when they prepared a lesson plan at home (with unfiltered Internet access), they were unable to present it at school because a site they found at home was inaccessible using school computers.

Restrictions on information flow may also reduce the benefits of the Internet as an information retrieval mechanism. Specifically, one of the advantages of the Internet is that it facilitates the comparison of how different sources treat a given topic. While it is true that there are often many unblocked sources for basic information (and hence blocking any one of these sources may not be critical in this context),33 advanced work in which the specific source providing information affects its presentation or credibility is more adversely affected by overblocking. Such might also be the case when alternative political points of view or analyses may be blocked as being inappropriate.

Psychological Costs

Another potentially major cost of filters is that their use reduces opportunities for young people to practice responsible behavior on their own. That is, to the extent that filters work as they are intended (i.e., they block rather than discourage access to material that may be inappropriate), children have fewer opportunities to choose--and thus fewer opportunities to learn how to make responsible decisions. Such children may well have greater difficulty in developing internal standards of appropriateness. In addition, while some youth have reported that the use of filtering by their parents makes them feel loved, others have reported that it makes them feel untrusted by their parents. Filters also create forbidden fruit--in this context, specific content that is made (more) desirable simply because it is inaccessible. A common response to forbidden fruit is to engage in more active and determined efforts to obtain it. Given the technological sophistication of some teenagers, these efforts often succeed.
Even worse, from the standpoint of minimizing exposure of children to such material, the results of technical circumvention efforts are often widely circulated, with the ultimate effect of greater exposure to inappropriate material rather than less, at least within the immediate circle of individuals close to those with the necessary skills. The introduction of filters may also serve to create resentments and resistance among the children at whom they are targeted. That is, because filters are explicitly intended to limit one's freedom of access, it is entirely possible that introducing filters, especially into an environment in which unrestricted access was the rule, would create tension and anger among the children toward those responsible for the decision. This dynamic is likely to be most significant in a family environment, in which parental rules are generally negotiated to some extent with children.

Finally, unfair treatment of youth can result from the use of filters. A young Internet user, knowing that the Web sites she or he is viewing are filtered, can make the reasonable assumption that what is not filtered conforms to parental or organizational policy (e.g., an acceptable use policy, as discussed in Chapter 10), and thus that access to those unfiltered sites is not restricted. However, because filters inevitably allow some inappropriate material to pass, this may not be a good assumption, and a child who relies on a filter that allows objectionable material to be viewed can get into trouble with parents or organizational authority.

Infrastructure

Because a critical issue in filtering is the extent of underblocking and overblocking, users are well advised to test in advance what may be improperly blocked or passed. Apart from this predeployment testing, source- or content-based filters require minimal infrastructure. However, label-based filters require content providers or third parties to cooperate in labeling content. To develop such an infrastructure, providers and third parties must have incentives to label content. The minimal success of labeling schemes for Internet content to date suggests that the present environment does not provide such incentives.34 Note also that labeling by third parties entails essentially the same type of effort that must be undertaken by filter vendors to develop lists or criteria for inappropriate content. For labeling to be useful, a large volume of information must be examined and rated; otherwise, the user is left with the choices described in Section .

Recognizing that the primary impediment to the success of rating schemes is the extent to which Internet content is currently not labeled, the Internet Content Rating Association (ICRA) has undertaken a global effort to promote a voluntary self-labeling system through which content providers identify and label their content using predefined, cross-cultural categories. ICRA is a global, non-profit organization of Internet industry leaders committed to making the Internet safer for children while respecting the rights of content providers. According to ICRA's chief executive officer, ICRA hopes that over the next several years the most popular Web sites and portals, those accounting for the most Internet traffic, will have labeled their content with ICRA. If these efforts are successful, ICRA labels will be associated with sites that account for a large fraction of Web traffic, though not necessarily with a large fraction of existing Web sites.
The operators of these sites will encourage their business partners and the sites they host to use ICRA labeling. (However, because these sites do not in general have a business relationship with other Web sites that might turn up through use of their search engines, these other Web sites cannot be expected to be labeled in general.) Another approach is to mandate--by government fiat--the labeling of all Web content. But such an approach involves a number of significant issues:
Apart from government-required content labeling, the widespread use of labels will turn on private incentives. Incentives can be positive--by labeling, a content provider or creator could receive some financial benefit, either directly or by attracting more parties to its content. However, the committee did not see a compelling business case for how content providers or creators can benefit commercially from labeling, and testimony to the committee indicated how difficult it is to develop child-friendly Internet businesses. Or, incentives can be negative--by labeling, a content provider or creator might receive immunity from prosecution (for example, for obscenity) for the content being labeled (e.g., as adults-only). Such a safe harbor might be particularly applicable to the labeling of sexually explicit material (as discussed in Section 9.3).

To date, child-centered content-limited ISPs are small enterprises, and many efforts to establish a viable business model for providing good, attractive, and educational content for kids have foundered, as noted in Chapter 10.35 Thus, it is an open question whether children will be able to take advantage of consistent, dependable, long-term service of this nature. Note also that because content is explicitly vetted for appropriateness, it is likely that the content offered by such ISPs would be more limited--and hence more suitable for younger children, whose information needs are generally less than those of older children. By contrast, certain Internet portals, such as Yahoo and Lycos, have search engines that search only within the appropriate (and educational) child-oriented universe of content. Available for free, these search engines return only links to appropriate and educational content, and as long as the child does not surf outside these links, a responsible adult can have confidence in his or her child's activity.
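To make concrete the policy choice that label-based filtering forces for unlabeled material (discussed earlier in connection with PICS-capable browsers), here is a minimal Python sketch. The label vocabulary is invented rather than an actual PICS or ICRA schema.

```python
# Simplified sketch of a label-based filter.  The descriptor names are invented;
# they are not an actual PICS or ICRA vocabulary.
from typing import Optional, Set

BLOCKED_DESCRIPTORS = {"nudity", "explicit-sex"}   # chosen by the supervising adult

def allow_page(labels: Optional[Set[str]], block_unlabeled: bool) -> bool:
    """labels is the page's self-label (a set of descriptors), or None if unlabeled."""
    if labels is None:
        # The crucial policy choice: unlabeled material is either all blocked
        # (heavy overblocking, since most of the Web is unlabeled) or all passed
        # (minimal protection).
        return not block_unlabeled
    return labels.isdisjoint(BLOCKED_DESCRIPTORS)

print(allow_page(None, block_unlabeled=True))          # False: unlabeled page blocked
print(allow_page(None, block_unlabeled=False))         # True: unlabeled page passed
print(allow_page({"nudity"}, block_unlabeled=False))   # False: labeled page blocked
```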
Image-Only Filtering

Visual imagery often has a more visceral impact than a textual description of the same image. As discussed in Section 6.3.3, males tend to respond to visual sexual imagery more than females do. And, as a general rule, sexually explicit text does not generate nearly the same controversy that sexually explicit imagery generates.36 A filter that blocks the images on Web pages that have been determined to be inappropriate, rather than all of the content of those pages, is thus well suited to meet this concern. Most of today's filters block specific Web pages, based on criteria established by their filtering policies. But there is no technical reason that the filter could not instead block all images contained on those Web pages, while passing through all text on those pages. (However, icons and text rendered in image format, such as those in banner advertisements and sidebars, would be blocked as well. And concerns about over-breadth of blocking would remain, so that images of the Greek gods, Leonardo da Vinci's Vitruvian Man, paintings by Rubens, and Michelangelo's David might still be blocked.) Such a filter addresses many of the concerns raised by students, teachers, and librarians about children who need information that would otherwise be blocked by a page-blocking filter; as a general rule, such information is presented textually and would be passed by an image-blocking filter. Of course, to the extent that the concerns of filter advocates involve text, an image-blocking filter is not helpful. A more sophisticated approach to filtering of embedded images would involve analyzing them. Very small images are likely to be only icons, and very small images (say, 200 x 200 pixels or less) do not convey much excitement. The content of larger images could be analyzed using the technology described in Section 2.3.1 and Appendix C, and if found to be sexually explicit, those images would be blocked (subject to all of the difficulties inherent in image recognition).

Selective Degradation of Service

As discussed in Chapter 8, it is possible to reduce the appeal of deliberate contact with inappropriate material. Such an approach changes the yes/no approach to filtering to one in which the user can still gain access to material that might have been improperly classified as inappropriate, but only at some cost. Two easy technical methods for doing so are slowing down the speed with which images of offensive content are displayed and reducing the visual resolution (or the audio fidelity) of such images. Presuming that such content can be identified, a child who must wait a few minutes for an image to be displayed is likely to lose patience with it. Such a tactic is most relevant if the child knows that the image being sought is inappropriate--it reduces the immediate gratification usually available from Internet downloads, and it increases the risk of being discovered in the act. (It also provides more time for the child to reflect--Do I really want to do this? Am I being responsible?) Similarly, an image of significantly reduced resolution is far less appealing than one with high resolution. Another possible approach depends on penalizing the user after viewing inappropriate content by automatically logging out, freezing the computer so that a reboot is necessary, or simply delaying for several minutes the child's access to other Internet content.
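A minimal sketch of the degradation idea, under the assumption that flagged content can be identified at all: flagged images are delivered only after a delay and at reduced resolution. The delay, scale factor, and trivial classifier below are illustrative assumptions, not features of any actual product.

```python
# Sketch of selective degradation of service: flagged content is still delivered,
# but slowly and at reduced resolution, so that access carries a cost.
import time

DELAY_SECONDS = 120      # wait imposed before a flagged image is shown
DOWNSCALE_FACTOR = 8     # each dimension reduced 8x, i.e. 1/64 of the original pixels

def looks_inappropriate(image_name: str) -> bool:
    return "flagged" in image_name               # stand-in for real image/text analysis

def downscale(pixels: list[list[int]], factor: int) -> list[list[int]]:
    """Keep every Nth pixel in each direction -- a crude resolution reduction."""
    return [row[::factor] for row in pixels[::factor]]

def serve_image(image_name: str, pixels: list[list[int]]) -> list[list[int]]:
    if not looks_inappropriate(image_name):
        return pixels                            # unflagged images pass through untouched
    time.sleep(DELAY_SECONDS)                    # the child must wait ...
    return downscale(pixels, DOWNSCALE_FACTOR)   # ... and receives a degraded image anyway

# Example: a 16x16 grid of brightness values shrinks to 2x2 (after the imposed delay).
tiny_image = [[0] * 16 for _ in range(16)]
degraded = serve_image("flagged-photo.jpg", tiny_image)
```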
Approaches that depend on degradation of service force the child to make decisions about whether the cost and inconvenience of access are worth the appeal of accessing content that adults might deem inappropriate.

Bundling Filters with Other Functionality

Filters are a special-purpose tool. Parents and others who purchase and install filters or filtering services thus can be assumed to feel that the problems raised by unfiltered Internet access are worrisome enough to warrant such efforts. However, other individuals may be concerned about those problems yet reluctant to use filters because of the resistance or resentment that their introduction might generate (as discussed under "Psychological Costs" in Section ). For such individuals, associating filters with packages that provide other useful features may make it easier to obtain the benefits of filtering. For example, parents wishing to obtain filtering services might subscribe to a content-limited ISP and "sell" it to their children on the basis of the additional content that the ISP would make available to them.

Warning Rather Than Blocking

Built into any filter is a specification of content that should be blocked. Instead of blocking access, a filter could warn the child of impending access to inappropriate material, but leave it to his or her discretion whether or not to access the material. Because the child does have choices, such a feature would have pedagogical advantages with respect to helping children make responsible choices, assuming an environment structured in a way to facilitate such assistance. (A feature to warn of impending access to inappropriate material might or might not be combined with logging of such access--a point discussed in Section below.)

Opportunities for Internet Safety Education

Because child-oriented content-limited ISPs are oriented toward providing information especially for kids, they provide unique opportunities for Internet safety education. For example, these opportunities might include material that provides context for children and explains concepts for judging the value and/or validity of the site being flagged or blocked.

Future Prospects

Over time, filtering is likely to improve gradually, decreasing both underblocking and overblocking. However, these improvements will almost certainly be incremental rather than revolutionary, and users would be well advised to view claims of revolutionary improvement with some skepticism. (For example, the phenomenon of blocking breast cancer sites because a user performed a search for "breast" is now rare. However, the reason this particular error is no longer frequent is that many users complained about it, and breast cancer sites were specifically taken off the blacklist.37) One goal is quite unlikely to be met--the generation of a class of objectionable or inappropriate material from a single example. It would be highly desirable for a user who has received an objectionable image (for example) to be able to tell a filtering program, "I don't want to see any more stuff like this." But what counts as "like this" is virtually impossible to generalize from one example, which is why even the best training systems today require hundreds or thousands of samples of objectionable material to offer any hope of even a minimally adequate categorization of material.
Today's filters cannot be the sole element of any approach to protecting children from inappropriate sexually explicit material on the Internet (or any other inappropriate material), and it is highly unlikely that tomorrow's filters will be able to serve this role either. But they can be a useful element, as long as their limitations are kept in perspective. In particular, with or without filters, Internet-using children will have opportunities for encountering some non-zero amount of inappropriate material, and thus regardless of the benefits that filters do confer, they will have to internalize codes of conduct and appropriate online behavior if they are to be safe.

Using a child-oriented content-limited ISP is approximately analogous to allowing a child to watch only selected videos on television, rather than broadcast or cable television. And, as in that case, such a practice is most likely appropriate for fairly young children. However, as a child's Internet information needs outgrow what a kid-friendly service can provide, he or she will have to turn to other sources. Other sources--by definition--will provide information that is less thoroughly vetted, and will likely involve exposure of the now-older child to some inappropriate information; however, an older child may well be better able to cope with inadvertent exposures to such material. Furthermore, there is no guarantee that the point at which a child's information needs outgrow a kid-friendly service will coincide with the point at which he or she can cope well with such exposure, and it is likely that the former point occurs earlier than the latter.

As for server- and client-side filtering, it is helpful to regard such filtering as "training wheels" for children on the Internet as they learn to make good decisions about what materials are and are not appropriate for their consumption. An adult who explains the purpose of the filter to the child (and different explanations are appropriate at different ages), and who can provide some in-person guidance when the child first encounters blocked material, is in a much better position to help the child internalize the rules than an adult or institution that simply installs the filter with no explanation or rationale either before or after the fact. Indeed, the latter situation is what the detractors of filters have in mind when they argue that the use of filters can lead to a false sense of security: a filter user (parent, library, school), knowing that a filter is in place, will be tempted to assume that all is well, and then fail to exercise appropriate oversight or to take other measures when such oversight or other measures would still be necessary.

Underlying much of the concern about the deployment of filters--even on a voluntary basis--is a fear that the creation of a technical infrastructure that supports filtering will inexorably, over time, lead to even stronger pressures for formal content regulation (a so-called "slippery slope" argument). Furthermore, even without the pressures for formal content regulation, those advocating the free flow of information are concerned that authorities (parents, schools, libraries, businesses, and others) will find the use of filters irresistible as a way to block any kind of content or information that they find objectionable, and not just for children.
(Just such a sequence of events was related to the committee in one of its site visits: a county-wide library system installed filters to block sexually explicit material from all patrons, not just children, though the concerns were first raised in the context of children's access to such material.)
1. Filters have some significant utility in denying access to content that may be regarded as inappropriate. However, many of today's youth have access to unfiltered Internet venues (e.g., at home, at a friend's house), and school and library filters do not block content accessed from these other venues.

2. All filters--those of today and for the foreseeable future--suffer (and will suffer) from some degree of overblocking (blocking content that should be allowed through) and some degree of underblocking (passing content that should not be allowed through). While the extent of overblocking and underblocking will vary with the product (and may improve over time), underblocking and overblocking result from numerous sources, including the variability in the perspectives that humans bring to the task of judging content.

3. Filters are capable of blocking inappropriate sexually explicit material at a high level of effectiveness--if a high rate of overblocking is also acceptable. Thus, filters are a reasonable choice for risk-averse parents or custodians (e.g., teachers) who place a very high priority on preventing exposure to such material and who are willing to accept the consequences of such overblocking. (For example, these individuals may be more inclined to take such a stance if the children in question are young.) Such consequences may include the blocking of some material that would be mistakenly classified as inappropriate sexually explicit material, and/or the blocking of entire categories of material that are protected by the First Amendment (a consequence of more concern to publicly funded institutions such as public libraries than to individual families).

4. Automated decision making about access is generally inferior to decision making with a responsible adult involved in the decision making process. Furthermore, to the extent that the content of concern is in multimedia formats and unaccompanied by textual descriptions, automated decision making is subject to a high degree of overblocking (identifying content as objectionable when it is benign) and underblocking (identifying content as benign when it is objectionable).

5. To the extent that Internet content is created or produced in real time (e.g., a live videoconference over Webcams), it will be impractical to insert a human reviewer into the decision making process about whether access to that content should or should not be granted--thus weakening the role that filters play.

6. Overblocking must be distinguished from overly broad blocking policies. Overblocking is a mistake--content is blocked that should not have been blocked, even in the judgment of the human being responsible for identifying content that should be blocked (the censor). Overly broad blocking policy represents a disagreement with that human being, in which the content seeker asserts that certain content should be accessible and the censor believes that content should be blocked.

7. Based on information gathered in its site visits, the committee believes that filters are deployed by schools and libraries at least as much for political and management reasons as for the protection of children, because the deployment of filters enables staff to pay more attention to teaching and serving library patrons.

8. Because most filters are deployed to forestall complaints, and complaints are more likely to be received about underblocking than about overblocking, filter vendors have more incentive to block content that may be controversial than to be careful about not blocking content that should not be blocked.

9. Transparency of operation is important, in the sense that filters that inform a user that a site is being blocked--and that provide the reason for blocking--are more likely to be seen as legitimate than those that do not provide such information.

10. The use of blocking filters does not promote the development of responsible choice in children. With the removal of the option of making certain choices, children are denied an opportunity to choose--and hence do not learn appropriate decision making skills from the fact of blocking.

11. Filters are a complement to, but not a substitute for, responsible adult supervision. Using filters without adult supervision and/or instruction for users in what constitutes appropriate behavior is not likely to result in children learning what is or is not appropriate behavior. Furthermore, filters cannot guarantee that inappropriate material will not be accessed.
Tools that provide monitoring of the Internet activities of children have been proposed by some as an alternative to filters in certain contexts.38 Tools for monitoring have not received nearly the same attention as filters, but are potentially controversial as well. describes the dimensions of choice that GetNetWise identifies for monitoring tools.
Monitoring, as a way of protecting youth from inappropriate content, relies on deterrence rather than prevention per se. In some cases, it is the threat of punishment for an inappropriate act that has been caught through monitoring that prevents the minor from behaving in an inappropriate manner. In other cases, "catching someone in the act" can provide an important "teachable moment" in which an adult can guide and explain to the child why the act was inappropriate, and why this content is on the Internet. Monitoring a child's use of the Internet is a generic approach that can be implemented in both technical and non-technical ways. Adult supervision of Internet use, a non-technical strategy, is discussed in Chapter 10. But the technical methods for monitoring someone's use of the Internet are many.
Monitoring tools can provide a variety of functions, enabling a responsible adult to review incoming and outgoing e-mail, instant message and chat room dialogs, Web pages accessed, and so on. Further, such tools may or may not provide notification to the child that monitoring is occurring. A monitoring tool can also use technology to identify inappropriate material (the same technology that filters incorporate) so that it can provide a warning to the child when he or she is about to be exposed to inappropriate material. The child can then decide whether or not to heed that warning. If the child does not, the monitoring tool may or may not provide a record of that access. Warnings can also be accompanied by making a log of access to such exposures or notifying a responsible adult about such access. Depending on the tool selected, monitoring tools can be used at home and in libraries or schools. One distinguishing characteristic is that monitoring tools generally require human responses to detections of access to inappropriate information, or at least the threat of such a response. Thus, a parent or librarian or teacher must be involved to talk to the minor engaged in inappropriate behavior--and in institutional contexts the cost of reviewing access logs and following up on these reviews would likely be substantial. For monitoring to be effective, usage must be tied to specific individuals. Thus, in an institutional setting, individual logins--which tend to be more common in higher grades than in lower ones--are necessary if monitoring information is to be acted on after the fact of inappropriate usage.39 (If immediate action is taken, individual login information is not needed, since an adult can simply walk over to the Internet access point and talk to the child in question.) The same is true in a home setting, especially with multiple individuals accessing one computer. Indeed, without the ability to associate a given Web access (for example) with a given child, individualized guidance--or punishment--cannot be provided. As with filters, monitoring is also increasingly common in corporate and business settings.40
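As a minimal sketch of the warn-then-log-or-notify behavior described above (Python; the trivial content check stands in for whatever identification technology a real product uses, and the names are invented):

```python
# Minimal sketch of a monitoring tool that warns, optionally logs, and notifies an adult.
# The content check is a stand-in for real identification technology; names are invented.
import datetime

ACCESS_LOG: list[dict] = []

def looks_inappropriate(url: str) -> bool:
    return "adult" in url                        # placeholder classifier for the sketch

def request_page(url: str, child: str, proceed_anyway: bool,
                 log_access: bool = True, notify_adult=print) -> bool:
    """Return True if the page is shown to the child."""
    if not looks_inappropriate(url):
        return True
    print(f"Warning: {url} appears to contain inappropriate material.")
    if not proceed_anyway:                       # the choice stays with the child
        return False
    if log_access:
        ACCESS_LOG.append({"who": child, "url": url,
                           "when": datetime.datetime.now().isoformat(timespec="seconds")})
    notify_adult(f"{child} chose to view {url} after a warning.")
    return True

request_page("http://adult-example.test/page", child="jordan", proceed_anyway=True)
```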
Because monitoring tools do not place physical blocks against inappropriate material, a child who knowingly chooses to engage in inappropriate Internet behavior or to access inappropriate material can do so if he or she is willing to take the consequences of such action. However, the theory of monitoring is that knowledge of monitoring is a deterrent to taking such action. Note, however, that unless warnings are given repeatedly and in different forms, users are known to habituate rapidly to them--and behave as though they had never been given.41 Warnings--in and of themselves--are not likely to deter inappropriate access in the long run. The same habituation may not be true, however, of warnings or monitoring associated with a human presence. An adult supervisor who forces a non-routinized interaction with a child has a far better chance of capturing his or her attention.

Browser histories log Web sites that have been viewed, though to learn the actual content of these sites, the adult supervisor must either examine these Web sites or make inferences about their content on the basis of the site's URL. Keystroke monitors are equally capable of monitoring Web sites, e-mail, and anything else that requires user input. Monitoring of screens being used by children, if done on a large scale (i.e., many screens being supervised by one person), in practice monitors access to inappropriate imagery. Text also can be monitored remotely, but in this case, the adult supervisor cannot tell at a glance if the text contains inappropriate material, and thus must spend more time reading that text to make a judgment.

Because monitoring leaves the choice of access up to the child, inadvertent access to inappropriate material is possible. (For this reason, monitoring is arguably less appropriate for children whose decision making capabilities have not matured.) But the child also retains the choice to gain access to information that may be relevant to his or her information needs, and thus the problem of overblocking described in Section does not exist.

Judgments about the effectiveness of monitoring are mixed. Monitoring--coupled with punishment--has deterrence effects at least in the short term and at least for some fraction of youth. But because the decision making party is the youth, rather than an adult (e.g., the filter vendor), denial of access to inappropriate material cannot be assured. Moreover, as with filters, a change of venue will often suffice to eliminate overt monitoring.

A critical dimension of monitoring is the kind of adult response that is coupled to a detection of inappropriate access or behavior. If an adult offers guidance and explanation rather than punishment (as is most appropriate if a violation is accidental), the youth may learn how to differentiate--for himself or herself--appropriate from inappropriate actions. To the extent that this is true, protection from inappropriate material may be extended to non-monitored venues and for much longer periods of time. (On the other hand, once clear explanations for rules have been provided, punishment for intentional infraction of rules is entirely appropriate for demonstrating that infraction carries consequences.)

Another critical dimension is whether monitoring is covert or overt. Covert monitoring, if undiscovered, is more likely to provide information about what the child is doing "when left to his or her own devices." And, if undiscovered, the individual being monitored will not change venues.
But covert monitoring--by definition--cannot deter, because the youth in question must be aware that monitoring is happening if it is to have an effect on his or her behavior. Moreover, undertaking monitoring covertly leaves the question of what the responsible adult should do--if anything--in the event that monitoring reveals that the child is behaving inappropriately. If the adult does nothing except watch, learning that is directly coupled to inappropriate access or behavior cannot occur, and the inappropriate behavior may well continue. Yet, if the adult does respond to such a revelation, he or she may be forced to disclose the fact of monitoring, with all of the attendant consequences (e.g., a child who reacts negatively because the adult is changing the rules from what was expected). In principle, an adult could act without disclosing the fact of monitoring--for example, the adult may draw the child into a general discussion of appropriate Internet behavior without reference to any specifics that might be associated with the child's behavior. However, many adults are likely to find it difficult to act on such information without revealing the source. Overt monitoring can deter. But it can also have negative effects, as described below. If monitoring is coupled to explanations and guidance about appropriate and inappropriate behavior, there is some potential that this application can promote the long-term development and internalization of appropriate behavioral norms. But the explanation and guidance are essential. If, as is much more likely in an institutional setting and in many home situations, the primary or exclusive consequence of detection of inappropriate access is punishment, such learning may well not occur. Even more destructive would be punishment resulting from inadvertent access to inappropriate material, as one can easily imagine might be imposed by an adult supervisor who did not believe an assertion by his or her charge that the inappropriate Web page was viewed by accident. Finally, as with filtering, monitoring can be circumvented by a change of venue in which monitoring is not present.
Decision making is shared between adults and youth. It is the responsibility of responsible adults (e.g., parents and school officials) to provide general guidelines about what constitutes inappropriate material or behavior. However, it is the responsibility of the youth to interpret these guidelines. And, it is the interaction between adult and youth that can provide guidance in any particular instance. For those products that identify inappropriate material, the relevant decision makers are the same as those for filtering.
Given the burden imposed on responsible adults when all access is monitored, some monitoring products make a record only when inappropriate material has been accessed. Of course, such a product requires definitions of inappropriate material--and all of the discussion above in Section is relevant to such definitions. Monitoring can also be intermittent. For example, a product may take a "snapshot" of a given computer screen that records its contents at random intervals that average once an hour. In this case, the auditing burden is directly proportional to the frequency of screen capture and the number of screens being monitored. Monitoring software that never records the screen when Web content from an innocuous site is shown further reduces the number of snapshots that need to be reviewed.

Real-time display of a child's screen can be performed at any level of resolution desired. When the supervisor's monitor simultaneously displays "screen shots" of multiple user screens (e.g., all of the monitors in use by a class), each image appears in a smaller "thumbnail" version that makes reading most text on those screens difficult or impossible while at the same time usually enabling the supervisor to determine if an image is being displayed. Moreover, many inappropriate sexually explicit images are easy for humans to recognize even at low resolution and/or smaller size. The product might then offer the supervisor the chance to "zoom in" on this screen shot for closer examination, and perhaps capture the image for documentation purposes.

Records of access may be kept or not, and if kept, at different levels of detail. For example, recorded screen images can be kept at high resolution, enabling the reading of words on the screen, or at low resolution, enabling only a general identification of pictures that may be on the screen. Keystrokes can be recorded for differing lengths of time, enabling a supervisor to know in detail what the minor has done. And, some monitoring packages allow the tracking of timing and sequencing of access that would help administrators to distinguish between intentional and unintentional access.

Tools for monitoring Web usage in the aggregate (without individual tracking of users) can provide important information on whether usage conforms to acceptable use policies. Such tools would not compromise individual privacy (indeed, individual logins would not even be required), but would provide data on the Web sites children visited and the Internet activities in which children engage. Such data could then be reviewed to determine the extent to which a given audience is in fact conforming to an AUP.

If the system is configured so that it never records a Web page or an incoming e-mail item that remains on the screen for less than some threshold period of time, perhaps 15 seconds, then the young person need not have the anxiety that "the adults will discover I inadvertently saw this material, but won't believe it was unintentional." Youth will know that as long as they quickly determine that the content is inappropriate, and exit that Web site or that e-mail, no record will be established and no adult will know.
To the committee's knowledge, current monitoring software does not have this "do not capture within threshold" feature, as it opens a potential loophole: a student could click through a series of pages of inappropriate Web content without leaving any record, as long as each page is on the screen for less than the threshold time.
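The trade-off is easy to see in a sketch. The example below is hypothetical--no product known to the committee works exactly this way--and the current_url() and capture_screen() calls stand in for whatever platform facilities an actual monitoring tool would supply; the threshold values are invented. It records a snapshot only when a page has stayed on the screen longer than a dwell threshold, which reduces the review burden and spares a child who backs out of an accidental hit, but it also exhibits the click-through loophole just described.

    import random
    import time

    DWELL_THRESHOLD = 15.0    # seconds a page must stay on screen before it may be recorded (invented)
    MEAN_INTERVAL = 3600.0    # snapshots occur at random times, averaging about once an hour (invented)

    def monitor(current_url, capture_screen, log):
        # Hypothetical intermittent monitor: poll the foreground page once a second,
        # take snapshots at random times, and record only pages that have "dwelled."
        next_snapshot = time.time() + random.expovariate(1.0 / MEAN_INTERVAL)
        last_url, shown_since = current_url(), time.time()
        while True:
            time.sleep(1.0)
            url = current_url()
            if url != last_url:                      # page changed; restart the dwell clock
                last_url, shown_since = url, time.time()
            if time.time() >= next_snapshot:
                if time.time() - shown_since >= DWELL_THRESHOLD:
                    # Pages dismissed within the threshold are never recorded -- the same
                    # property that creates the click-through loophole noted above.
                    log.append((time.time(), url, capture_screen()))
                next_snapshot = time.time() + random.expovariate(1.0 / MEAN_INTERVAL)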
Financial Costs

The primary financial cost of monitoring is the human effort and labor needed to monitor usage. Monitoring records can be extensive, and in the absence of automated tools to flag questionable accesses, the examination of extensive audit records is both tedious and time-consuming. (Recording screen images frequently for a large number of users also consumes hard disk space at a rapid rate.) A second cost in the institutional setting is that the effort needed to manage individual logins is significant. Login names and passwords must be managed, and procedures must be set up to deal with things like children who lose or forget passwords, passwords that become known to other individuals, revocation of passwords and login names, and so on.42

Psychological and Emotional Costs

Punishment that is given soon after undesirable acts are initiated is more effective in deterring a repetition of such behavior than is punishment administered long afterward, suggesting that monitoring systems are unlikely to build positive habits in students unless feedback is received more quickly than may be practical in most situations. (Feedback is needed in minutes or hours, whereas review of access logs may take days or weeks. On the other hand, real-time monitoring can provide opportunities for feedback to the child when the offense occurs.) To be effective in deterring undesirable behavior, punishment must be consistent, which suggests that intermittent monitoring, which saves time and energy, will not be conducive to helping students learn to resist the temptation of seeking out inappropriate materials. To be a component of effective discipline also requires that the basis for punishment (or consequences) not be seen as arbitrary authority but rather be accompanied by an explanation of why certain behavior is unacceptable.

A second point is that monitoring and privacy can be antithetical. While the desirability of privacy as an element of an appropriate developmental environment is a cultural issue, most of Western society holds privacy in high regard, especially for adolescents, who are at a developmental stage during which they often seek some separation from their parents. A need for privacy is an essential component of separation as adolescents begin to create their own identity, an identity that includes an understanding of themselves as sexual persons.43 There are some personal issues that adults want to keep to themselves because they are embarrassed or have other feelings or behaviors that they do not want to share generally. Many children (especially adolescents) have those same feelings. To deny them that personal freedom by constant electronic monitoring may convey a lack of trust by an adult community that tells them that there is no personal space that belongs to them and them alone. Monitoring can easily be regarded by youth as a violation of privacy and an unwarranted intrusion that demonstrates a lack of trust, and one common unfortunate consequence is that when mistrusted, an individual often proceeds to act in ways that justify that mistrust.

Certainly parents do monitor their children's activities, but the balance of how much children and adolescents are watched varies, depending on characteristics such as age, gender, maturity, and parenting practices. In general, a child's need for personal freedom increases as he or she grows older. Furthermore, children who are constantly watched by parents have less opportunity to develop their own internal controls over their own behavior.
They have less opportunity to confront the challenges of life that ultimately develop character, for it is that struggle that makes us who we are. Parents cannot always watch their children. It is then that the effectiveness of socialization is put to the test, for it is what children do in the absence of a parent or an adult that tells of their character. A child who has not internalized parental values may well attempt to break the rules whenever an adult is not watching, for the rules are outside, not inside, the child. By contrast, a child who has internalized those rules generally follows them, regardless of whether he or she is being watched. Sometimes such a child will fail, but he or she can also learn from those mistakes.

At the same time, the level of privacy that students can expect in school--in using a computer as well as in other aspects of school life--is different from what they can expect at home, and school computer systems are not private systems. The expectation of privacy when students use computers in schools is more limited, as demonstrated by a variety of actions that have been supported in court decisions, including searches of student lockers, backpacks, and so on. Thus, provided that students have been given notice that their use is subject to monitoring, the use of monitoring systems raises fewer privacy concerns. In libraries, privacy expectations have traditionally been fairly high. That is, libraries have traditionally protected the materials used by their patrons, and have even resisted the efforts of law enforcement authorities to investigate such use. Thus, monitoring in a library context--even with explicit notice--may violate such privacy traditions.

Note also that technological monitoring has a different psychological relationship to the one being monitored than does in-person oversight. Consider, for example, a student using an Internet access terminal in the school library. In one scenario, a school librarian walks the floor periodically; if she sees something suspicious on the student's screen, she walks over and asks, "What are you doing?" In a second scenario, the screen on the Internet access terminal is displayed on the school librarian's terminal (in her office) for several seconds at random intervals ranging from once every 5 minutes to once every 20 minutes. Five or ten seconds before the image of the screen is transmitted to the librarian's terminal, an unobtrusive but noticeable warning flashes on the student's terminal to indicate that monitoring is about to take place. In addition, the display on the librarian's terminal is blurred so that words cannot be read (though blurred images can still be made out), and no records of the screen on the terminal are kept. If the school librarian sees something going on that warrants attention, she can leave her office, walk over to the student, and ask, "What are you doing?"

For most people, the first scenario feels like responsible supervision. However, reactions to the second scenario are decidedly mixed, despite the fact that the monitoring system described does nothing that the librarian does not do. What accounts for this mixed reaction? One factor is that the person being monitored from the librarian's office would have no particular assurance, beyond the say-so of the librarian, that any of these assertions would be true. By contrast, when the monitoring takes place by walking the library floor, it can be seen that the librarian is not taking photographs of the library patrons or their screens.
But more importantly, the fact of technological monitoring is also likely to change the nature of the relationship between school librarian and student. In the first scenario, the interaction between librarian and student is a human one, and the events have unfolded for the reasons that one would expect--a chance encounter that leads to an open question. But in the second scenario, the school librarian approaches the student with apparent foreknowledge of what the student is doing, and the question is more disingenuous than open. An additional point is that the foreknowledge provided by the monitoring system invites the librarian to jump to conclusions about what the student is doing, and she will be less likely to give the student the benefit of the doubt when she does engage him. Under such circumstances, a useful educational experience is less likely to occur than if she approached the user with an open mind.

Infrastructure

While filters can be installed on the client side without the cooperation of any other party, real-time monitoring requires a mechanism for displaying the contents of one monitor on another. When tools for monitoring are used on a large scale, a sufficient number of responsible adults is necessary to provide in-person intervention. (How many adults are required depends on how thorough the monitoring is.) An alternative is after-the-fact review of behavior or actions, a process that requires storage of logs, screen snapshots, and so on, and automated tools to flag suspect behavior. The administrative burdens can be sharply reduced if the records reflect only potentially suspect accesses and exposures rather than all Internet use.
As noted above, one major difficulty with monitoring is the effort needed to review audit trails. Thus, there is a role for automated tools that can review audit trails to identify patterns of behavior that are worth further investigation. For example, tools could be available that:
Some of the functionality described above is available in some monitoring products today, but it is far from common.
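As an illustration of the kind of automated review just described, the sketch below shows one plausible check; the record format (user, timestamp, URL, a "flagged" marker) and the thresholds are invented for the example and are not drawn from any existing product. The idea is simply that several questionable accesses by the same user within a short window are more suggestive of deliberate browsing than of a single accidental hit, and so are worth a human look.

    from collections import defaultdict
    from datetime import timedelta

    def find_repeat_offenses(records, min_hits=3, window=timedelta(minutes=10)):
        """Flag users with min_hits or more questionable accesses inside one window.
        Each record is assumed to be (user, timestamp, url, flagged); the format
        and the thresholds are invented for this illustration."""
        flagged_times = defaultdict(list)
        for user, ts, url, flagged in records:
            if flagged:
                flagged_times[user].append(ts)
        suspicious = []
        for user, times in flagged_times.items():
            times.sort()
            for i in range(len(times) - min_hits + 1):
                # A burst of flagged accesses close together merits follow-up.
                if times[i + min_hits - 1] - times[i] <= window:
                    suspicious.append(user)
                    break
        return suspicious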
In general, our society subjects criminals to a high degree of monitoring because they have proven untrustworthy. For the most part, individuals follow societal rules, not through constant monitoring and the invasion of privacy, but because they have learned to internalize the values underlying those rules. Put another way, if laws were followed only because of police monitoring and enforcement, then we would need as many police as other people to maintain law and order. Children make mistakes, and criminals make mistakes, but to be a child is not to be a criminal. Nevertheless, active supervision of children is often appropriate--not because they are criminals but because it is the responsibility of adults to teach them how to internalize the appropriate values and to become better at avoiding inappropriate behavior as they mature. For example, responsible parenting always entails monitoring children in inverse proportion to their capability and maturity, something only a parent can ultimately determine. But as noted in Section , the wise parent couples monitoring with education and discussion in support of helping the child internalize the parents' values, an outcome that will help the child behave appropriately whether parents are watching or not. Parents have rights to what might be called the "imposition of sanctions," which, like any other part of parenting, fails if used alone.44 As always, the density of loving wisdom in a parent's actions is everything, and it is the nature of the existing parent-child relationship, the intent of the monitoring process, and how it is carried out that count in understanding its ultimate implications.

Is the monitoring of children and adolescents a step in the erosion of privacy for all citizens? Many people believe that it is, and point to other measures that they find equally disturbing--employers who track the behavior of their employees, and commercial online sites that track the behavior and clicks of those who pass through them. The monitoring of children raises special concerns because of the fear that a younger generation will grow up never having known a world in which they had rights to privacy, and thus never realizing what rights they might have lost. Others argue that such fears are overplayed, pointing to the social and commercial benefits of increased customization of information delivery and an assumed lack of government interest in the affairs of ordinary people, as well as the fact that schools are expected to act in loco parentis with respect to the students in their care. Indeed, some believe that it would be a positive development in society if adults in all venues felt some responsibility for looking after the welfare of children and for supervising children when they are in a position to do so.45 Resolving this question is beyond the scope of this study, but noting the question raised by the monitoring of children is certainly not.46
1. Monitoring that warns when exposure to inappropriate material may occur is an alternative to filtering and eliminates the problem of overblocking associated with filtering.

2. Overt monitoring in concert with explicit discussion and education may help children develop their own sense of what is or is not appropriate behavior. Monitoring coupled primarily with punishment is much less likely to instill in children such an internal sense. In general, the simple presence of monitoring equipment and capabilities (or even the assertion of such capabilities) may create a change in behavior, though the change in behavior is likely to be restricted to the situation in which monitoring occurs.

3. Because human intervention is required on a continuing basis, monitoring is more resource-intensive than filtering. For the same reason, monitoring is more likely to be construed as a violation of privacy than are other techniques that simply block access.

4. Covert monitoring leads to an entirely different psychological dynamic between responsible adult and child than does overt monitoring. (Furthermore, because people habituate to warnings, children may respond to overt monitoring as though it were covert--i.e., more negatively.)
"Spam," e-mail that is similar to the "junk mail" that an individual receives through the post office in the brick and mortar world, is sent--unsolicited and indiscriminately--to anyone with a known e-mail address. E-mail addresses can be purchased in bulk, just as regular mailing lists can be purchased: a typical rate for buying e-mail addresses is 5 million addresses for $50. Alternatively, e-mail addresses can be found by an e-mail address "harvester." (See for more details.) Spam refers to any form of unsolicited e-mail a person might receive, some of which might be sent by publishers of adult-content Web sites. A typical spam message with sexual content would contain some "come-on" words and a link to an adult-oriented Web site, but would in general arrive without images. Policy issues associated with spam are addressed in Chapter 10.
Technologies for controlling spam fall into two categories--tools that seek to conceal the e-mail address (because if an e-mail address is not known, spam cannot be sent to it) and tools that manage spam once it has been received. Whether an individual can implement such tools varies with the ISP and/or e-mail service used. To conceal e-mail addresses with certain ISPs, one can create different login names. For example, online services such as AOL enable a user to create more than one login name that can serve as an e-mail address. An individual can thus use this special login name for activities that might result in spam (e.g., for participating in a chat room). This special name becomes the attractor for spam, and mail received at that address can be deleted at will or even refused. A number of services (some free, some not) automate this process, enabling users to create special-purpose addresses that can be turned off or discarded at will. In addition, e-mail systems may allow a user to disallow all e-mail except e-mail from a specified list of preferred addresses and/or domain names. To manage spam that does reach the user's mailbox, a number of tools are available. Most of these tools depend on the ISP or the use of an e-mail program with filtering capabilities (e.g., Eudora, Netscape Messenger, Microsoft Outlook). Spam e-mail can be identified and blocked on the basis of:
Users can also take a number of procedural measures. For example, Web sites often ask for information from the user, and a user who supplies false information (e.g., indicating an income lower than is actually the case) can sometimes avoid marketing pitches aimed at the demographic group being targeted. Internet service providers also take measures to limit spam. For example, AOL limits how fast users can repeatedly enter and exit chat rooms, because a pattern of rapid entry and exit can be an indication that someone is harvesting e-mail addresses. Most ISPs also have lists of known spammers from which they refuse to carry traffic.
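To make the idea concrete, here is a minimal sketch of such a rate check; the window length and the threshold are invented for illustration and are not any provider's actual values. An account that changes chat rooms more than a handful of times within the window is flagged for review rather than blocked outright.

    import time
    from collections import defaultdict, deque

    WINDOW_SECONDS = 60        # length of the sliding window (invented value)
    MAX_ROOM_CHANGES = 10      # room entries/exits tolerated per window (invented value)

    _recent_changes = defaultdict(deque)    # account name -> timestamps of recent room changes

    def record_room_change(account, now=None):
        """Return True if the account should be flagged for review as a
        possible e-mail address harvester."""
        now = time.time() if now is None else now
        q = _recent_changes[account]
        q.append(now)
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()                      # discard events that fell out of the window
        return len(q) > MAX_ROOM_CHANGES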
Spam-control technologies for dealing with e-mail that has arrived in one's mailbox suffer from the same underblocking and overblocking issues that are discussed in Section . One important issue is that spam often contains links to inappropriate sexually explicit material rather than the actual material itself, and no content-screening spam-controlling tool known to the committee scans the content for links that may be embedded in an e-mail. That said, some spam-controlling technologies are highly effective against spammers. Those that restrict incoming e-mail to a given set of senders (i.e., that accept mail only if the sender is on a list of permissible senders or comes from specified domains) are very effective. On the other hand, they also sharply restrict the universe of potential contacts, so much so that a user may fail to receive desired e-mail. (For example, a friend who changes his or her sending e-mail address will not be able to reach someone who has established a white list of preferred senders.) ISP-based or e-mail-service-based spam filters are partially effective. For example, the researcher mentioned in found that the spam filter on one popular free e-mail service reduced the volume of spam by about 60 percent, though it still passed more than one message per day. Spam filters that are based on content analysis techniques have all of the problems with false positives and false negatives that Web filters have.
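A white list of this kind amounts to a single membership test applied to each incoming message. The sketch below is illustrative only; the addresses and domains are placeholders for whatever a family or institution actually chooses to allow, and a real mail client or ISP would apply the same test at delivery time.

    # Placeholder white list; replace with the senders and domains a user trusts.
    ALLOWED_SENDERS = {"grandparent@example.org", "teacher@school.example.edu"}
    ALLOWED_DOMAINS = {"school.example.edu", "familyisp.example.net"}

    def accept(sender_address):
        """Accept mail only from listed senders or listed domains. Anything else --
        including mail from a friend whose address has changed -- is rejected,
        which is the overblocking risk noted above."""
        sender_address = sender_address.strip().lower()
        domain = sender_address.rsplit("@", 1)[-1]
        return sender_address in ALLOWED_SENDERS or domain in ALLOWED_DOMAINS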
Some spam filters have preconfigured lists of known spammers. But in general, it is the user who must decide what is spam and what is not. Of course, the difficulty--especially for a child--is to recognize spam without opening the e-mail. In some cases, it is easy to recognize from the header or subject line. But many spam messages reveal themselves only when they are opened. (Note also that one person's spam is another person's service or information. Unsolicited notices for ski vacations or material on a local political candidate may be useful to some individuals and useless to others.)
Because many ISPs filter out spam for most users, users of those ISPs need not take any action at all to reject spam. However, when spam leaks through the ISP's filters--or when e-mail is not filtered for spam at all, as is true of much e-mail--the user must take action. Note that unsolicited e-mail, and the resources and attention it consumes, is not limited to sexually explicit e-mail aimed at youth. It would be reasonable to assume that the number of parties sending unsolicited e-mail, the frequency with which they send it, and the volume that they send will all increase. Therefore, approaches to this problem are likely to be developed, regardless of the concerns about youth and sexually explicit material. However, this can easily turn into another race: as better spam-discriminating technologies are invented, alternative ways of wrapping the unsolicited e-mail are invented, and the cycle continues.
Spam can be controlled in a variety of locations. When control is located at the receiver, locally installed software spam filters can help to process and eliminate spam. Conceptually, the cost of local spam filters is similar to that for content filters. However, spam filters must be integrated into software for processing e-mail in general. Control points based in the network (or an ISP) are both more complex and more comprehensive. Some ISPs have extensive capabilities for detecting the sending of spam mail (e.g., by monitoring the volume of e-mail sent in a given time interval), preventing the "harvesting" of e-mail addresses, and so on, and developing these capabilities entails substantial effort. Individual organizations often incur cost in configuring software and servers to stop spam from going through their own internal networks. Such efforts are often undertaken to help manage internal bandwidth more effectively. Finally, there may be costs incurred for infrastructure that may be needed to support legislative efforts to curb spam. For example, one method for dealing with junk mail and phone telemarketers is to establish a clearinghouse where people can register their names, addresses, and phone numbers. But the effectiveness of this approach is based on the fact that it is in the marketer's self-interest to refrain from wasting phone and mail effort and time on people unlikely to buy. Because sending spam is so much cheaper than mail and phone calls, a similar approach is unlikely to work effectively without some kind of legal cause of action that can be taken against those who ignore the clearinghouse. (Policy-based solutions are discussed in Chapter 9.)
Commercial organizations have been introducing their messages into schools at an accelerating pace, although almost always after signing an agreement with the school board (an agreement that usually includes new funds flowing to the school to supplement its budget). However, schools may wish to install some mail filtering before the marketing department of some soft-drink manufacturer decides to send e-mail to students just before lunch, promoting its product while also, to prevent uproar, giving "the spelling word of the day," "the math hint of the day," or whatever. It is easier for the school district to add another item to the spam filter than to have its lawyer sue the sender of the e-mails. As in the case of age verification technologies, expanded use of "mail deflection" beyond issues of sexually inappropriate material may warrant the trouble of installing spam-controlling systems.
As described in Chapter 9, legislative efforts to curb spam do have societal implications.
1. Spam-controlling technologies generally do not allow differentiation between different kinds of spam (e.g., hate speech versus inappropriate sexually explicit material). Rather, they seek to identify spam of any nature.

2. Spam-controlling technologies that filter objectionable e-mail have more or less the same screening properties that filters have. That is, they do block some amount of objectionable content (though they do not generally screen for links, which are often transmitted in lieu of actual explicit content). However, they are likely to be somewhat less effective than filters at preventing such e-mail from being passed to the user because users are likely to be more apprehensive about losing e-mail that is directed toward them than about missing useful Web sites, and thus would be more concerned about false positives.

3. Behavioral and procedural approaches to avoiding spam (rather than filtering it) have at least as much potential as spam-controlling technologies to reduce the effect of spam. However, using such approaches adds somewhat to the inconveniences associated with Internet use.
The technologies discussed in Sections and are intended to prevent the exposure of children to inappropriate material. Instant help is a tool to deal with exposure after the fact.
The philosophy underlying instant help is that from time to time children will inevitably encounter upsetting things online--inappropriate material, spam mail containing links to inappropriate sexually explicit material, sexual solicitations, and so on. When something upsetting happens, it would be helpful for a responsible adult to be able to respond promptly. An "instant help" function would enable the minor to alert such an adult so that appropriate action could ensue; it could also provide another channel to law enforcement through which threats, solicitations, and obscene material or child pornography could be reported. To the best of the committee's knowledge, there are no commercially available tools that provide instant help. But current technology could easily support an instant help function. For example, a secure one-click call for help47 could be:
These buttons might be as simple as an icon for the CyberTipline (CTL) that would serve as an easily accessible channel for the public to use in reporting child pornography. The CTL icon has proven to be an effective tool for reporting obscene material or child pornography because it is user-friendly and is the most direct method of bringing such images to the attention of the appropriate law enforcement authority. Because the CTL icon was built for the sole purpose of interfacing with the public to facilitate the reporting of computer-assisted crimes against children to law enforcement, it is more effective than other mechanisms for such reporting.

Depending on the context of the technology through which the user is coming into contact with inappropriate content or interactions, a wide range of functionality is possible once the button is clicked. For example, "instant help" on a browser or an ISP could immediately connect the user to a helper who provides assistance. To provide context, an image of the screen could be transmitted to the helper. Such assistance might be most useful if the user encounters a solicitor or inappropriate conversation in a chat room or an instant message. Or, if a user encounters inappropriate material, the last several Web pages viewed could be shared with the helper, who could assist the user in whatever action he or she wished to take (e.g., sending URLs to the CyberTipline). For Internet access on a LAN, instant help could be configured to summon assistance from a responsible adult within the LAN, such as a teacher or a librarian. Instant help would be applicable in both home and institutional contexts.

Implementing instant help functionality must be undertaken by service providers and technology vendors. But such functionality is not helpful without a human infrastructure to assist those seeking help--the human infrastructure may be provided by the ISP, a parent, a school, a library, or even an expanded National Center for Missing and Exploited Children (NCMEC).
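As a purely illustrative sketch of what the click might do, the example below packages the kind of context just described (the last several pages viewed and an image of the screen) and forwards it to whichever helper has been configured. The function names, the report format, and the routing are assumptions made for the example rather than features of any existing service.

    import json
    import time
    from collections import deque

    RECENT_PAGES = deque(maxlen=5)           # the last several URLs viewed, kept for context

    def page_visited(url):
        RECENT_PAGES.append((time.time(), url))

    def instant_help_clicked(user_id, capture_screen, send_to_helper):
        """Package context and forward it to the configured helper.
        capture_screen() and send_to_helper() are placeholders for whatever an
        ISP, school LAN, or browser vendor would actually provide."""
        report = {
            "user": user_id,                  # or an anonymous session id, per local policy
            "when": time.time(),
            "recent_pages": list(RECENT_PAGES),
            "screen": capture_screen(),       # e.g., a file path or encoded image of the screen
        }
        send_to_helper(json.dumps(report))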
By providing assistance to the minor, instant help could potentially reduce the negative impact that may result from exposure to inappropriate material or experiences. Such exposure can come from either deliberate or inadvertent access, but in practice instant help is more likely to be useful in the case of inadvertent access. Instant help obviously does not enable the user to avoid inappropriate material but does provide a means for trying to cope with it. It also provides opportunities to educate children about how to avoid inappropriate material or experiences in the future, and it might lead to the creation of more civil norms online. It provides immediate assistance in the case of aggressive solicitations and harassment. Finally, it might lead to greater law enforcement activity if the materials involved are obscene or constitute child pornography. Metrics of effectiveness that indicate the extent to which children are not exposed to inappropriate materials do not apply to instant help. Rather, effectiveness is better assessed on the basis of the quality of the assistance that helpers can provide, and the responsiveness of the instant help function. Assistance that arrives 20 minutes after the user has pressed the instant help button is obviously much less helpful than if it arrives in 5 seconds, and of course, human helpers must be trained to handle a wide variety of situations. A specialist trained to provide this kind of help to youth, or a peer with special training, could potentially be more effective than the child's own parent or teacher or librarian. However, because this approach has never been implemented on a wide scale, staffing needs for instant help centers are difficult to assess. In many urban areas, crisis intervention hotlines (focused on helping people subject to domestic abuse, or feeling suicidal, struggling with substance abuse addictions, and so on) exist, but there are none known to the committee that give training to their volunteer staffs concerning children's exposure to sexually explicit material on the Internet.
Unlike other tools, the locus of decision making in the context of instant help rests with the minor. The minor decides what is upsetting and determines the situations in which he or she needs help.
The purpose of an instant help function is to ensure that something can be done with very little difficulty. Thus, the flexibility and usability of an instant help function are paramount. For example, individual parents, libraries, or schools could customize who is contacted when the instant help button is pressed. Thus, a family with strong religious ties could set instant help to alert helpers from a group associated with its religious tradition, while a school district could configure the button so that in an elementary school a message goes to a staff member in that building, and in a middle school to a staff member in the middle school building. This is in some sense analogous to a call to the national emergency number 911 being routed to a local 911 dispatch center based on the exchange from which the call was placed.
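Such routing could be as simple as a small lookup table maintained by the parent, school, or library. The sketch below is hypothetical; the venue names and contact addresses are placeholders.

    # Hypothetical routing table for instant help requests; all entries are placeholders.
    HELP_ROUTES = {
        "home":              "parents@family.example.net",
        "elementary-school": "counselor@elementary.district.example.edu",
        "middle-school":     "librarian@middle.district.example.edu",
        "public-library":    "reference-desk@library.example.org",
    }

    def route_help_request(venue):
        """Pick the helper for a given venue, falling back to a default helpline,
        much as a 911 call is routed to a local dispatch center."""
        return HELP_ROUTES.get(venue, "helpline@example.org")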
The infrastructure and institutional cooperation needed for instant help to work successfully are considerable. Vendors must be willing to use precious screen space to provide instant help buttons. The infrastructure of helpers must be developed and deployed. For an ISP, such an infrastructure might well be expensive; for the NCMEC or law enforcement agencies, it would be very expensive. But for a school or library (or even for responsible adult guardians), the infrastructure of helpers may already be in place.48 The costs are roughly proportional to the size of the helper infrastructure; helpers (who could be volunteers) must be trained in how to respond to a call for help. Note also that a skilled adult predator or even adolescents bent on mischief could create a flood of diversionary instant help requests so that the responding individuals would become backlogged, during which time the predator could attempt to continue an interaction with a young person. Thus, some mechanism for protection from "flooding attacks" would be needed by any responding center that serves a large number of anonymous end users or devices.
To the committee's knowledge, instant help functionality has not been implemented anywhere, and it remains to be seen if children would actually use it if and when they are confronted with inappropriate material or experiences that upset them. Thus, some small-scale piloting of the concept to evaluate how it might work is likely to be very helpful before any major effort to implement instant help is considered.
A potential downside of a "low-cost" implementation that would require the child to describe the material and how he or she got there is that the child might be forced to focus more on the inappropriate material, perhaps causing at least discomfort to a child who may be better off being returned to appropriate activities as soon as possible. Such a negative outcome could be avoided if the inappropriate material could be automatically transmitted to the helper. (Since the material may well not be present on the child's screen when he or she contacts the helper, the automatic display of material might have to retrieve the last several screens--this may be a difficult technical task under some circumstances.) In cases where a new type of offensive material or communication begins to occur for the first time on the Internet, the first instant help response center to identify this new material could share that information with schools and parents, other instant help response centers, youth (as warnings), or even filtering vendors. In that sense, instant help might harness all youth who use it to improve the monitoring of the Internet for new offensive material or communication. Dissemination of the insights of the staff of an instant help center should be considered a networked response, as opposed to the response of assisting a child when requested. The Internet technical community has experience with networked response in the CERT system, which propagates information about worms, security holes, and the like.
As the committee is unaware of any implementation of instant help that fits the description above, there are no findings to report. NOTE: The "end user" is generally the adult supervisor who makes decisions on behalf of a child. (This is true in all cases except for instant help, in which the user is the child seeking help.)
Notes

1 The actual content of a list of such keywords is usually held as proprietary information by the vendor of the filter. However, such lists include a variety of "four-letter words" associated with sex, reproduction, and excretory functions, words such as "sex," "naked," and so on. Other words that might be singled out for inclusion include "bomb," "bondage," "fetish," "spunk," "voyeurism," "babe," "erotica," "gay rights," "Nazi," "pot," "white power," "girlz," and "hard-core pornography." (These examples are taken from Lisa Guernsey, 1999, "Sticks and Stones Can Hurt, But Bad Words Pay," New York Times, April 9.) Also, a more sophisticated form of filtering is based on the analysis of word combinations and phrases, proximity of certain keywords to certain other keywords, the presence of various URLs, and so on. In some cases, text-based analysis may also be combined with the analysis of images on the Web page in question. As a rule, tools based on this more sophisticated filtering are not broadly marketed today. 2 Note that sources on a white list can be specified in advance or identified as appropriate because a source contains one or several "good words" that may be found on a "good-word" list. For an example of the latter, see Gio Wiederhold, Michel Bilello, Vatsala Sarathy, and XioaLei Qian, "A Security Mediator for Health Care Information," pp. 120-124 in Proceedings of the 1996 American Medical Informatics Association Conference, Washington, D.C., October. 3 The distinction between server-side filtering and content-limited ISPs is not a technical one, because content-limited ISPs use server-side filters. Rather, the point is that server-side filtering provides a degree of institutional customization that is not possible with content-limited ISPs, which tend to offer one-size-fits-all filtering policies. 4 The use of server-side filters may degrade performance. In particular, a server-based filter may rely on "proxy servers" that are unable to take advantage of the caching techniques that are often used by major Internet providers to speed the retrieval of commonly requested pages. Such a filter would be forced to retrieve information from its host server and take whatever performance hit that might entail. In other cases, performance is improved because without irrelevant material taking up space in the cache, retrieval of relevant material is faster. 5 In practice, a responsible adult would set the filtering provision to the "on" setting, and save the configuration. Thereafter, other requests on that client to the search engine would encounter the "on" setting. The setting can also be turned "off" through entering a password known only to the individual who initially set it (a possible problem if that person is the teenager in the household who manages the family's information technology). 6 Note that content that is transmitted through certain channels such as attachments to e-mail, videoconferences, instant messages, or peer-to-peer networking (in a Gnutella-like arrangement) is very difficult (arguably impossible) to block selectively, though a filter can block all interaction through these channels. Moreover, to the extent that the content of traffic is determined interactively, neither labeling nor sites are likely to provide a sufficient basis.
The reason is that interactive sources, almost by definition, can support a variety of different types of interaction--the best example of which is an online friend with whom one may exchange sports trivia, conversation about school homework, and inappropriate sexually explicit material. Only real-time content recognition has a chance of filtering such content. 7 Note also that the list of blocked sites often includes sites that could help users circumvent the basic filtering. Thus, sites providing information on how to circumvent filters are often included on the list, and a number of filters block sites that allow language translation (Seth Finkelstein and Lee Tien, 2001, Blacklisting Bytes, white paper submitted to the committee, available from <http://www.eff.org/Censorship/Censorware/20010306_eff_nrc_paper1.html>) or access to Web archives (Seth Finkelstein, 2002, The Pre-Slipped Slope--Censorware vs. the Wayback Machine Web Archive, available from <http://sethf.com/anticensorware/general/slip.php>). 8 According to the National Center for Education Statistics, nearly three-fourths of all schools use blocking or filtering software. See A. Cattagni and E. Farris. 2001. Internet Access in U.S. Public Schools and Classrooms: 1994-2000. NCES 2001-071. Office of Educational Research and Improvement, U.S. Department of Education, Washington, D.C. Available online at <http://www.nces.ed.gov/pubs2001/internetaccess/>. 9 By contrast, around 57 percent of public libraries do not filter Internet access on any workstation, while about 21 percent filter access on some workstations. About 21 percent filter all workstations. See Norman Oder. 2002. "The New Wariness," The Library Journal, January 15. Available online at <http://libraryjournal.reviewsnews.com/index.asp?layout=article&articleid=CA188739>. 10 A survey conducted by Family PC magazine in August 2001 found that of 600 families surveyed, 26 percent used parental controls of some kind. About 7 percent of those using parental controls (about 1.8 percent of the total) used off-the-shelf store-bought filtering packages. The rest used filtering offered by an Internet service provider. (This study is not available in print, because it was scheduled for publication in October 2001, and Ziff Davis, the publisher of Family PC, terminated the magazine before that issue was printed.) 11 For example, a survey taken by the American Management Association in 2001 found that 38 percent of the firms responding use blocking software to prevent Internet connections to unauthorized or inappropriate sites. Seventy-eight percent of the responding firms restricted access to "adult" sites with explicit sexual content, though it is not clear how the remaining 40 percent are enforcing such restrictions. (The survey suggests that they are doing it by actively monitoring Internet use.) See American Management Association, 2001. 2001 AMA Survey, Workplace Monitoring and Surveillance: Policies and Practices. Available online at <http://www.amanet.org/research/pdfs/emsfu_short.pdf>. 12 Potential overlap between the business market and the school and library filtering market raises the following operational concern: a blocked category may be defined by a vendor so that it is appropriate in a business environment, but that definition may not be appropriate in a school or library context.
For example, information about sexually transmitted diseases, safe sex practices, and pregnancy may not be necessary in most business environments (and hence an employer may have a legitimate business reason for blocking such information), but many would argue that older students using school facilities should not be blocked from receiving such information. 13 Note also that legal challenges brought against the mandated use of filters in institutional settings have relied significantly on the existence of underblocking and overblocking as inherent flaws in the technology that make filters unsuitable for such use. 14 For "bake-offs" comparing Internet filters, see Christopher D. Hunter, 2000. "Internet Filter Effectiveness: Testing Over and Underinclusive Blocking Decisions of Four Popular Filters." Social Science Computer Review, Vol. 18, No. 2, Summer. (Available at <http://www.copacommission.org/papers/filter_effect.pdf>); Karen J. Bannan, 2001. "Clean It Up," PC Magazine, September 25, available online at <http://www.pcmag.com/article/0,2997,a%253D12392,00.asp>; "Digital Chaperones for Kids," Consumer Reports, March 2001. For a critique of the Consumer Reports analysis, see David Burt, 2001, "Filtering Advocate Responds to Consumer Reports Article," February 14, available online at <http://www.politechbot.com/p-01734.html>. 15 Two particularly egregious examples include Beaver College and online biographies of individuals who have graduated magna cum laude. Beaver College in Pennsylvania recently changed its name to Arcadia University because its name was being filtered out ("beaver" has crude sexual connotations in American English slang). Beaver College spokesman Bill Avington was quoted in Wired as saying, "We have a lot of evidence that people aren't able to get our information in high schools because of Web filters in the libraries" that block out sites with "Beaver" along with other presumed smut words. He continued, "With so many people using the Net as the initial means to look at colleges, that's a serious disadvantage." In addition, he claimed that filters sometimes block e-mail from Beaver College staffers to prospective students. (See Craig Bicknell, 2000, "Beaver College Not a Filter Fave," Wired, March 22, available online at <http://www.wired.com/news/politics/0,1283,35091,00.html>; and CNN story, 2000, "Beaver College Changes Oft-derided Name to Arcadia University," November 20, available online at <http://www.cnn.com/2000/US/11/20/embarrassingbeaver.ap/>.) The "magna cum laude" problem was demonstrated when filtering software blocked access to all biographies of COPA Commissioners who had graduated magna cum laude (see <http://www.cdt.org/speech/filtering/001002analysis.shtml>). 16 The magnitude of overblocking due to IP-based virtual hosting is unclear. One estimate (Art Wolinsky, 2001, "FilterGate, or Knowing What We're Walling In or Walling Out," MultiMedia Schools, May/June, available online from <http://www.infotoday.com/mmschools/may01/wolinsky.htm>) suggests that such overblocking far outstrips overblocking for other causes. However, a number of factors should be considered in assessing the potential impact of IP-based virtual hosting: Most large sites are not hosted on virtual hosting services. Furthermore, large sites tend to be more heavily promoted and are often more likely to appear in a prominent position in a search engine's result list.
Thus, large sites--which typically account for the Web requests of most users--are much less likely to be improperly blocked than smaller sites. Many virtual hosting services ban adult-oriented, sexually explicit material and other material that they regard as offensive, and they enforce their acceptable use policies vigorously. Thus, the amount of sexually explicit material hosted overall by such services is likely to be small. (But, if such a service does host even one site containing inappropriate sexually explicit material and that fact is picked up by a filtering vendor that uses IP-based filtering, it will exclude all of the acceptable sites on that host. All of the acceptable sites that are improperly blocked will stay blocked until the hosting service eliminates the inappropriate site and the fact of elimination is communicated to the vendor.) Different implementations of filtering (e.g., use of name-based filtering) can lead to the same intended result without the overblocking caused by IP-based filtering. As a rule, the primary reason for wishing to use IP-based filtering is technical--when a hosting service is used primarily for adult-oriented, sexually explicit material, IP-based filtering reduces the amount of storage and processing needed by the filter. 17 The distinction between overblocking and an overly broad scope of blocking is further complicated by the fact that from time to time, a given site can be used for multiple purposes. Most filters include adult-oriented Web sites in their "to be blocked" categories. However, a high school student undertaking, for example, a study of the economics of the adult online industry might have an entirely legitimate purpose for seeking access to such sites. More generally, any student wanting to study a controversial issue and needing to consult sources for different sides of an argument may well find that advocates of one point of view or another are blocked because they are regarded as "inappropriate"--where, in practice, "inappropriate" is likely to mean "controversial." 18 See Footnote . 19 According to Grunwald Associates, 17.7 million children aged 2 to 17 had Internet access from home in the last quarter of 1999. (The Web site <http://cyberatlas.internet.com/big_picture/demographics/article/0,,5901_390941,00.html> provides a summary of the Grunwald study. The full study is available online at <http://www.grunwald.com/survey/index.htm>.) The U.S. Census indicates about 65.7 million children in the United States in this age bracket, for a percentage of about 27 percent. 20 Many filtering products, especially those on the client side, are easily breakable by knowledgeable users. See Michael J. Miller, 2001. "When Does Web Filtering Make Sense?," PC Magazine, September 25, available online at <http://www.pcmag.com/article/0,2997,s%253D1499%2526a%253D12632,00.asp>. 21 A proxy server is a server that happens to be accessible from the client machine. The use of a proxy server, which can channel all requests "around" a server-side filter, can enable circumvention. Many local area networks, however, are configured in such a way as to prevent the use of proxy servers. 22 U.S. public schools are increasingly providing Internet access to students outside regular school hours. For example, 80 percent of public secondary schools provided such a service in 2000.
In addition, schools with high minority enrollments provided Internet availability outside of regular school hours more frequently than schools with lower minority enrollments (61 percent versus 46 percent), a figure consistent with the notion that minority students may rely on schools to provide access more than do non-minority students. See A. Cattagni and E. Farris. 2001. Internet Access in U.S. Public Schools and Classrooms: 1994-2000. NCES 2001-071. Office of Educational Research and Improvement, U.S. Department of Education, Washington, D.C. Available online at <http://www.nces.ed.gov/pubs2001/internetaccess/>. 23 Indeed, in one community, the public library system provided filters for 10 to 20 percent of its Internet access points but made no special attempt to guide children toward these filtered workstations. Nevertheless, the presence of these filters on 10 to 20 percent of its workstations was sufficient to allow it to assert to the community that "the library provides filtered access," an assertion that seems to have met the concerns of local government. 24 In the site visits of the committee, committee members explicitly avoided leading questions regarding the motivation for use. So, when teachers said "our school has filters" (which was true in all schools visited), committee members asked "Why do you have them?" and "What is the benefit of having filters?" It is in this context that teachers said "to reduce exposure to liability." For the most part, the committee believes that given the overall context of all of the comments received in this manner (e.g., the accessibility of the Internet in unfiltered non-school venues for a large number of students), the avoidance of liability was indeed a primary or at least a very important reason for having filters in schools. Nevertheless, the committee recognizes the possibility that responders felt the protection benefits were so obvious as not to need mentioning. 25 Of course, a filter is not the only way to solve this particular problem--it would be almost as effective to install software that would clear the browser cache and return to the library's home page after a short period of inactivity. 26 In a preliminary finding issued in May 2001, the Equal Employment Opportunity Commission found that pornography downloaded on library computers was intended to create a sexually hostile work environment for a group of Minneapolis librarians. See Michael Bartlett, 2001, "No Internet Filtering Is Sex Harassment for Librarians--EEOC," Newsbytes, May 25. Available online at <http://www.newsbytes.com/news/01/166171.html>. 27 Victoria Rideout, 2001. Generation Rx.com: How Young People Use the Internet for Health Information, The Henry J. Kaiser Family Foundation, Menlo Park, Calif., available online at <http://www.kff.org/content/2001/20011211a/GenerationRx.pdf>.
She also notes that "because filtering software companies protect the actual list of blocked sites, searching and blocking key words, blocking criteria, and blocking processes as confidential, proprietary trade secret information it is not possible to prove or disprove the hypothesis that the companies may be blocking access to material based on religious bias." At the same time, Willard finds that while "information about the religious connections can be found through diligent search, such information is not clearly evident on the corporate web site or in materials that would provide the source of information for local school officials," though she acknowledges openly that "it is entirely appropriate for conservative religious parents or schools to decide to use the services of an ISP that is blocking sites based on conservative religious values. It is equally appropriate for parents to want their children to use the Internet in school in a manner that is in accord with their personal family values." See Nancy Willard, 2002, Filtering Software: The Religious Connection, Center for Advanced Technology in Education, College of Education, University of Oregon, available online at <http://netizen.uoregon.edu/documents/religious2.html>. 29 As with so many other "null" observations, the absence of complaints about overblocking can be interpreted in many ways. One interpretation is, of course, that overblocking simply does not occur very much (and/or that filters do not block a great deal of useful and appropriate information). But information collected through site visits is not consistent with this interpretation, and testimony to the committee suggests other explanations as well. For example, the relative lack of complaints may be partly explained by the fact that filters for institutional use are increasingly flexible (see Section ). If blocked pages that are needed for educational purposes, for example, can be obtained quickly (e.g., in a matter of minutes), the issue of overblocking need not be as salient. (One school system told the committee that a filter used previously had not allowed such flexibility and had resulted in a large number of complaints from teachers and students. Other faculty and librarians in other locations told the committee that unblocking sites was cumbersome and difficult.) A second reason for the lack of complaints is likely to be the fact that once a filter is in place, the expectation of users is that access will be filtered. The committee heard stories of a number of complaints regarding filtering when filters were first installed, but in most such instances, the complaints ceased after a few months. Students without home Internet access seemed to accept a school's filtering policy as a given, and simply adapted to it, even if they were prevented from accessing valuable information. One librarian told the committee that a blocked Web page was analogous to a book that was not present in the library, and that the alternative approaches to obtaining the information were similar to using interlibrary loan. Students with Internet access at home have no particular reason or incentive to complain aside from issues of efficiency or convenience. 30 In general, when students encountered blocked sites at school, they simply went to another venue to reach those sites--most of the students to whom the committee spoke had unfiltered home access. 
31 In general, filter vendors protect the list by encrypting it, and they had hoped that the Digital Millennium Copyright Act (DMCA) would outlaw reverse engineering to decrypt such lists. However, on October 27, 2000, the U.S. Copyright Office issued its final rule implementing the anti-circumvention provisions of the DMCA. The statutory provisions of the DMCA prohibit the circumvention of technical measures that prevent the unauthorized copying, transmission, or accessing of copyrighted works, subject to this rulemaking of the Copyright Office. The final rule establishes two exceptions to the anti-circumvention provisions, one of which allows users of Internet content filtering programs to view the lists of Web sites blocked by such software. The Copyright Office recognized a First Amendment interest in access to this information and noted the need for circumvention in this instance "since persons who wish to criticize and comment on them cannot ascertain which sites are contained in the lists unless they circumvent." This exception to the DMCA rule may have an impact on the ongoing public debate about filters. In March 2000, two programmers who revealed the list of thousands of Web sites blocked by the Internet filtering program CyberPatrol faced charges of copyright violation.

32 A study by the Kaiser Family Foundation found that among teenagers aged 15 to 17 who have sought health information online, 46 percent reported being blocked from sites that they believed were non-pornographic. For example, 15 percent of those who were blocked reported that they were searching for information on sexual health topics. See Rideout, 2001, Generation Rx.com: How Young People Use the Internet for Health Information.

33 Because of keyword filtering, sites containing certain keywords may be blocked, while sites that use synonyms for those keywords are not filtered and thus will not be blocked. (Both this and a related failure mode are sketched following note 37.)

34 It is interesting to note that industry labeling initiatives in other media have been more successful and more widely accepted and used; these media include movies (through the MPAA), television (through a joint effort of the Motion Picture Association of America, the National Association of Broadcasters, and the National Cable Television Association), and software CD-ROMs and games (through the Interactive Games Developers Association). One reason for this success is that the volume of content produced in these media is much smaller than the volume of content produced for the Web.

35 Of course, entrepreneurs in other areas are also struggling to find viable long-term Internet business models.

36 In the future, other kinds of content (e.g., sound files associated with sexually explicit content) may come to be regarded as being as objectionable as images. (Recall that "dial-a-porn" services had some appeal for adolescent youth, and that the availability of such services to minors created significant controversy in the early 1990s.) If that future comes to pass, the media containing such particularly objectionable content might also be selectively blocked (e.g., by blocking all sound files on sexually explicit Web pages).

37 The first widespread instance of such blocking occurred in 1995, when a major online service provider blocked all sites containing the word "breast," including those dealing with breast cancer. In the wake of widespread complaints, the service provider quickly restored access to the breast cancer sites. Since then, this particular problem has occurred only rarely, because a number of techniques described in Section 2.3.1 can be used to avoid problems arising from simple-minded keyword matching. Still, the problem has not been eliminated entirely: a recent instance of a breast cancer Web site being blocked was brought to the committee's attention in January 2002 (personal communication, Bennett Haselton, Peacefire.org). In this instance, the blocking apparently arose from the use of IP-based virtual hosting.
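Notes 33 and 37 describe two failure modes of simple keyword matching, sketched below. The keyword list and sample pages are invented for illustration and are not drawn from any vendor's actual block list; the techniques referred to in Section 2.3.1 reduce, but as note 37 indicates do not eliminate, such errors.

    # Naive keyword filter (illustrative only; the keyword list is invented).
    BLOCKED_KEYWORDS = ["breast", "xxx"]

    def is_blocked(page_text):
        """Block any page whose text contains a listed keyword as a substring."""
        text = page_text.lower()
        return any(keyword in text for keyword in BLOCKED_KEYWORDS)

    # Overblocking: an innocuous health page contains a listed keyword.
    print(is_blocked("Early detection of breast cancer saves lives."))  # True

    # Underblocking: a page using a synonym for a listed keyword slips through.
    print(is_blocked("Explicit gallery of bosom photographs."))         # False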
38 John Schwartz, 2001, "Schools Get Tool to Track Students' Use of Internet," New York Times, May 21.

39 Note also that individual logins are a necessary, though far from sufficient, aspect of maintaining computer, network, and system security. In the absence of individual logins, it is essentially impossible to hold any specific individual responsible for actions that might compromise security. For more discussion, see, for example, For the Record and Computers at Risk. Thus, institutions have reasons to consider individual logins entirely apart from protecting the youth in their care from inappropriate material.

40 See, for example, Associated Press, 2002, "IM Monitoring Grows in Popularity," April 12. Available online at <http://www.msnbc.com/news/737956.asp?0si=-#BODY>.

41 See Computer Science and Telecommunications Board, National Research Council, 1996, Cryptography's Role in Securing the Information Society, National Academy Press, Washington, D.C.

42 For more discussion, see Computers at Risk and For the Record.

43 According to a 2001 survey by the Kaiser Family Foundation, teenagers place a high value on privacy with respect to their Internet usage. Seventy-six percent of online youth agree that "looking up information online is good because I can look things up without anybody knowing about it." Where looking for health information is concerned, 82 percent say that confidentiality is very important. A sizable minority of young people are concerned about the privacy of their online searches for information, with 40 percent saying they worry that the computer might keep track of what they do online. See Rideout, 2001, Generation Rx.com: How Young People Use the Internet for Health Information.

44 Even these rights have limits. Parents, for example, cannot subject their children to abuse in the name of discipline.

45 In this instance, there is debate about the role of technology in supervising children vis-à-vis an in-person adult presence.

46 A current CSTB study on privacy in the information age will address these issues.

47 Security for this "one-click" button is an important element of instant help--the button's functionality must not be disabled in the way that mousetrapping disables the "back" button (sending the user to a new adult-oriented Web site instead).

48 It is true that in schools or libraries a child can request help from such individuals even without an instant help feature. The primary advantage of clicking an instant help icon is that it can be done privately, without drawing attention from other users.

49 A screen name is an online identity, similar to a CB radio "handle." An individual may have multiple screen names, and a user with appropriate privileges (usually associated with paying for the master account) can create arbitrary screen names at will for himself or someone else on his account, as long as those names are not already in use.