The Internet has enormous potential to contribute to public welfare and private well-being. One dimension of that potential involves the use of the Internet to enhance and transform education for the nation's youth, and many public policy decisions have been made to provide Internet access for educational purposes. Easy access to the Internet (and related online services) has many advantages for children--access to educational materials; collaborative projects, publications, online friendships, and pen pals; access to subject matter experts; recreation, hobby, and sports information; and so on. While such potential for contributing to the nation's welfare in general and to the education of its children in particular is recognized, the Internet also raises a wide variety of public concerns. This fact in itself should not be surprising--few powerful and widely deployed technologies have been used solely for socially beneficial purposes. But the Internet poses many challenges for which there are no precedents, and much of the controversy about inappropriate sexually explicit material on the Internet arises because the Internet differs in important ways from older, more familiar media.
What is the issue to be addressed? Although the nominal title of the project was "Tools and Strategies to Protect Kids from Pornography on the Internet and Other Inappropriate Material," a key fact is that "pornography" is a term whose meaning is not well specified. People nevertheless use the term as though it did have a well-specified meaning, and they often fail to recognize that what one person may consider pornographic, another may not. For this reason, the committee chose the term "inappropriate sexually explicit material" where common parlance might use the term "pornography." Using the former term keeps in the foreground the question of "inappropriate according to whose standards?"

Internet exposure of children to sexually explicit material is only one dimension of such exposure, albeit an important one, because sexually explicit material and other sexual content exist in a wide variety of other commonly accessible media, such as video cassettes, magazines, and cable television. Further, concerns over obscenity may well be a proxy for the desire to suppress access to other sexually explicit or sexually oriented content that would not be judged legally obscene. Internet exposure of children to inappropriate sexually explicit material is also only one dimension of inappropriate or potentially dangerous activities in which youth may engage. The Internet is also a medium that can facilitate face-to-face meetings between people who do not know each other prior to their Internet contact, and when there is a great disparity of experience and age between these parties, the younger, less experienced person may be more subject to exploitation and physical danger. Other types of material may also be judged by various parties to be inappropriate for children, and some of the approaches to protection from sexually explicit material may be applicable to such other material.

The views of people about "pornography" on the Internet and what to do about it reflect a broad range of values and moral commitments. What is pornographic to some people may be simply mainstream advertising to others; what is morally wrong to some may be entirely acceptable to others; what is legal to show to minors in one community may be regarded as wholly inappropriate by those in another community; and what counts as responsible choice according to one set of values may be irresponsible behavior according to a different set of values. Approaches taken to protect children should be flexible enough to honor that diversity.
Children from birth to the age of legal majority pass through a wide range of developmental stages as they mature into adults (and furthermore the age of legal majority is not statutorily uniform). The impact of any given piece of sexually explicit material is likely to vary widely with age or, more importantly, level of maturity, and the approaches taken to protect children of a given maturity level should take into account the characteristics of their level of maturity. Moreover, the experiences of individuals can influence how sexual content affects them, especially considering that increasing numbers of adolescents, who are still legally minors, are sexually active. Finally, age usually affects the extent to which children can understand dangers and engage in safe behavior. The information needs of children that the Internet can and should meet also change with the developmental stage of the child in question. For example, juniors and seniors in high school have a much broader range of information needs (i.e., for doing research related to their education) than do those in the third grade or in junior high school. This, in turn, leads to the question of how to provide older children with access to a broader range of material while preventing younger ones from accessing material that is not deemed appropriate given their developmental level.
As a matter of law, sexually explicit material that is "obscene with respect to minors" cannot be banned for adults, for whom it retains First Amendment protection, though access to it by minors can be restricted. Certain other sexually explicit materials (obscenity, child pornography) enjoy no First Amendment protection at all. A determination that material is obscene or obscene with respect to minors rests on certain tests, including tests tied to community standards. For both classes of material, the community standards underlying such determinations likely change over time, and in recent years, mores about sex and the consumption of sexually explicit material may have changed in such a way as to reduce (but not to eliminate) the scope of both categories. Thus, there is in practice considerable ambiguity about what should fall into these categories, and because community standards are integral to the application of the law in this area, material cannot be determined to be obscene or obscene with respect to minors solely on the basis of the material itself. Over the past decade, the number of federal obscenity prosecutions has been very small compared with previous years, leaving few contemporary precedents and thus complicating to a significant degree the application of "community standards." The First Amendment also bears on the extent to which, and the circumstances under which, public institutions of various types can restrict access to particular types of information. Finally, in the public policy domain, U.S. regulation of sexually explicit material is most likely to have an effect on commercial sources inside the United States, and far less effect on sources located abroad.
Making some material available to adults but not to children requires that providers have a reasonably reliable way of differentiating between them. In the physical world, such differentiation can often be accomplished with reasonable ease (e.g., by checking a driver's license or other identification). But in the Internet context, rules based on age differentiation are highly problematic and technically difficult to enforce. Content providers must also have a clear understanding of the difference between material that is and is not inappropriate for children.

Although many of the issues concerning Internet access to various types of material that may be regarded as inappropriate arise for other media as well, the Internet significantly increases the convenience and anonymity of access, thus reducing certain constraints that may be operative in other media. For example, online chat rooms and instant messages (IMs) have few analogs in the physical world, and these are channels through which a great deal of communication between strangers can occur. For this reason, special attention to the Internet dimensions of the issue may be warranted.

The adult online industry notwithstanding, inappropriate sexually explicit material is available from many non-commercial online sources. Thus, approaches that focus primarily on access to inappropriate sexually explicit material provided by the adult online industry (widely seen as the crux of today's problem) are likely to have limited relevance to problems arising from non-commercial sources. Moreover, for a great deal of inappropriate sexually explicit material (specifically, material accessible through Web sites), a reduction in the number of Web sites containing such material is not, in and of itself, likely to reduce the exposure of children to such material. The reason is that a primary method for obtaining access to such material is through search engines, and the likelihood that a search will find some inappropriate material for a given set of search parameters is essentially independent of the number of Web pages represented in that search. That said, if the number of such Web sites is small enough that no Web site operator can flout the rules of responsible behavior with impunity,1 regulation of their behavior (through public policy and/or self-regulatory approaches) becomes significantly easier, and enforceable codes of responsible behavior can have a significant impact on the extent to which operators of Web sites that contain adult-oriented, sexually explicit material make their products and services accessible to children.
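A rough back-of-the-envelope model may help make the point about search engines more concrete. The sketch below is illustrative only and is not drawn from the report; the fractions, the number of results a child views, and the assumption that results are independent are all hypothetical.

```python
# Illustrative only: a toy model of why reducing the number of adult Web sites,
# by itself, does little to reduce the chance that an ambiguous search exposes a
# child to some inappropriate result.  All numbers here are hypothetical.

def chance_of_some_inappropriate_result(fraction_inappropriate: float,
                                        results_viewed: int) -> float:
    """Probability that at least one viewed result is inappropriate, assuming
    (hypothetically) that each result is inappropriate independently with the
    same probability."""
    return 1.0 - (1.0 - fraction_inappropriate) ** results_viewed

# Even if the fraction of matching pages that are inappropriate were halved,
# a child who scans a few dozen results still encounters such material with
# near certainty.
for fraction in (0.10, 0.05):
    print(fraction, round(chance_of_some_inappropriate_result(fraction, 40), 2))
# 0.10 -> 0.99, 0.05 -> 0.87
```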
The adult online industry is one of the primary sources of sexually explicit images (e.g., on "teaser" home pages) that are accessible without any attempt to differentiate between adults and children. Such teaser pages allow potential customers to sample what would be available with payment, but children have easy access to this free content. The sexually explicit material provided by the adult online industry is available to children through a variety of routes, including mistyped Web site addresses, links returned by search engines in response to search terms with sexual connotations, and spam containing links to adult Web sites. The revenue models of the adult online industry suggest that broad exposure is needed to attract potential customers, and so the industry engages in tactics that seek to generate the broadest possible audience. Moreover, these tactics cannot remain inexpensive if they are also to differentiate between adults and children. The result is that children can be "swept up" in the industry's reach for larger audiences of potentially paying customers.

The adult online industry is, however, only one component of supply. The low cost of creating and maintaining a Web site means that the production of sexually explicit material is now within the financial reach of almost anyone. For example, Web cameras can be purchased for under $100, enabling anyone so inclined to produce a video stream of sexually explicit material.

In the Internet environment, an astronomically large volume of material is available for free, including art, literature, science, advertising, and government information, as well as sexually explicit material of every variety. Restricting what any individual may access (or protecting him or her from certain kinds of material) will inevitably impose additional costs on users. Such costs may include denial of access to useful information and loss of privacy for those wishing to access certain kinds of information.
As described in Chapter 6, factors such as certain ethical and legal considerations, an increasing conservatism of university review boards that approve research studies involving human subjects (institutional review boards), and a lack of research funding have contributed to a paucity of research regarding the impact on children of exposure to sexually explicit material. Furthermore, the extant scientific literature does not support a scientific consensus that exposure to sexually explicit material does--or does not--have a negative impact on children, and there is no adequate research base for understanding the impact of sexually explicit material of various kinds or how different approaches to protection may vary in effectiveness and outcome.

It is important to consider why many young people search for adult-oriented sexually explicit material in the first place. Adolescents go to these sites for many of the same reasons that adults do. Human beings are sexual. Sexuality is a part of identity, and a facet of identity that is a focus during adolescence, when youth come of reproductive age. It is not surprising that many children--especially preadolescents and older--are curious about sex, and adolescents who are sexually mature are looking for information about sex and are making choices in this arena. In earlier eras, they might well have been married, but today in Western culture marriage among those in their early and mid-teens is frowned upon. To the extent that adults (parents and families, schools, libraries) provide accurate information and guidance about sexuality in its biological, psychological, emotional, and social dimensions--information and guidance that is responsive to the situations that their children are facing--it can be argued that young people will be less drawn to searching for adult-oriented sexually explicit material.

This is not to say that parents are wrong to be concerned about their children's exposure to sexually explicit material. There is no reason to suppose that all negative impacts from exposure are necessarily manifested in science-based research studies. The moral and ethical values of parents--whether or not religious in orientation--and a desire to be involved in providing context and guidance for a child exposed to such material are important and understandable drivers of such concerns.

The committee believes that it would not be difficult to come to a consensus on the undesirability of some set of sexually explicit material involving depictions of extreme sexual behavior. That is, such a set could be developed by construction--image after image could be shown to a group of individuals drawn from a broad cross section of the community. (In some ways, the committee itself constitutes just such a group.) Under this procedure, the images that everyone on the committee deemed inappropriate for children would constitute the set--and the set would be substantial in size. Such a consensus would not be based so much on scientific grounds (the committee knows of no reliable scientific studies that address this point) as on a sense that such exposure would offend its collective moral and ethical sensibilities. Furthermore, the committee believes that a significant fraction of this set would likely be deemed obscene if prosecuted.
Yet the fact that such a set could be defined by construction does not mean that it is possible to craft unambiguous rules that define this set without also capturing material that would either be protected speech under existing First Amendment precedents or be unobjectionable to some number of group members. And, in the absence of such rules, disagreement is inevitable over what else, beyond "similar" material, should be captured in any definition.

The story is quite different for child pornography. In contrast to the diversity of views about what material must count as obscenity or obscene with respect to minors (and hence the diversity of views on what harm might result to children from being exposed to such material), there is a much broader social consensus that child pornography results in harm to the children depicted in such images and that child pornography is morally wrong as well.2 Over the past decade, the incidence of child pornography has risen as new communications channels such as the Internet have facilitated its exchange.

A similar argument applies to sexual predation. By design, the Internet facilitates contact between people who do not know each other. While much that is good and valuable and safe can come from interactions with strangers, parents rightly have some concern when their children talk to strangers in an unsupervised manner. These concerns arise in the physical world, and they are magnified in the online environment--where the range of personality types and intentions is both less known and less controllable. Further, the Internet has enabled potential predators to seek out a wider range of vulnerable children. The committee believes that the issue of face-to-face meetings between children and their Internet acquaintances is very different from that of exposure to inappropriate material on the Internet, because the potential dangers that face-to-face meetings entail are much greater. Furthermore, while the majority of children report that they brush off aggressive solicitation encounters or treat them as a relatively minor annoyance, a significant minority do report being upset or disturbed by them (Section 5.4.3). In addition, even when children were distressed by such encounters, a large fraction of them did not report the incident to parents or other authorities.

Finally, the committee believes that there is a consensus regarding involuntary exposure to sexually explicit material. Regardless of one's views on the impact of voluntary exposure to sexually explicit material, the committee believes that there is a reasonably strong consensus--indeed, one reflected in its own deliberations--that involuntary Internet exposure to sexually explicit material is inappropriate and undesirable and should not be occurring, and it is particularly inappropriate and undesirable in the context of minors being exposed to such material.3
The discussion in this section is complementary to the findings and general observations in Chapters 8 through 13, but does not repeat them systematically. Readers are urged to consult those chapters for more specific findings--especially about technology-based tools such as filters and monitoring programs.

Much of the debate about "pornography on the Internet" focuses on the advantages and disadvantages of technical and public policy solutions. Technology solutions seem to offer quick and inexpensive fixes that allow adult caregivers to believe that the problem has been addressed, and it is tempting to believe that the use of technology can drastically reduce or even eliminate the need for human supervision. Public policy approaches promise to eliminate sources of the problem. In the committee's view, this focus is misguided: neither technology nor public policy alone can provide a complete--or even a nearly complete--solution. As a rule, public policy aimed at eliminating sources of sexually explicit material can affect only domestic sources, and a substantial fraction of such material originates overseas. Nor is technology a substitute for education, responsible adult supervision, and ethical Internet use.

For these reasons, the most important finding of the committee is that developing in children and youth an ethic of responsible choice and skills for appropriate behavior is foundational for all efforts to protect them--with respect to inappropriate sexually explicit material on the Internet as well as many other dangers on the Internet and in the physical world. Social and educational strategies are central to such development, but technology and public policy are important as well--and the three can act together to reinforce one another's value.

Social and educational strategies are a primary focus of the committee because most children are likely to be confronted, on occasion, with material that they--or their parents--regard as inappropriate, or to find themselves in online situations that are potentially dangerous. Parents must balance their concerns about exposure to harmful things on the Internet against the benefits gained from exposure to positive things on the Internet, and the question of how children can learn to handle and defend themselves becomes the primary issue. Social and educational strategies that promote and teach responsible decision making are at the core of such defense. Social and educational strategies are also important for teaching children how to recognize and avoid situations that might expose them to inappropriate material or experiences. Though technology has a role to play here as well, developing "street smarts" about how to avoid trouble is likely to be a far more reliable and robust approach to protection. In short, a child who responsibly chooses appropriate materials to access and appropriate things to do on the Internet, and who knows what to do about inappropriate materials and experiences should he or she come across them, is much safer than a child whose parents and school teachers rely primarily on technology and public policy to solve the problem for them.
Moreover, social and educational strategies to promote and teach responsible choice have applicability far beyond the limited question of "protecting kids from porn on the Internet," because they are relevant to teaching children to think critically about media messages, to conduct effective Internet searches for information and to navigate with confidence, and to evaluate the credibility of the information they receive. Social and educational strategies are not quick or inexpensive, and they require sustained attention and careful implementation. Adults must be trained to teach children how to make good choices on the Internet. They must be willing to engage in sometimes-difficult conversations. And, because social and educational strategies place control in the hands of the youth targeted, children may make mistakes as they learn to internalize the lessons being taught. However, by understanding why certain actions were mistakes, children will more effectively learn the lessons that parents and other adults hope they will learn. Virtually all of the high school students to whom the committee spoke said that their "Internet savvy" came from experience, and that they simply learned to cope with certain unpleasant Internet experiences. They also spoke of passing their newfound expertise down to younger siblings, hence becoming the new de facto educators for younger kids in the "second wave of digital children."

Technology-based tools, such as filters, can provide parents and other responsible adults with additional choices as to how best to fulfill their responsibilities. Though even the most enthusiastic technology vendors acknowledge that their technologies are not perfect and that supervision and education are necessary when technology fails, tools need not be perfect to be helpful--and used properly (an important caveat), they can be an important aspect of comprehensive programs for Internet safety. First, technology can help to create a child-rearing environment that parents can moderate and shape according to their values and the maturity of their children. In an Internet context, a controlled and moderated environment does not mean that a child's every keystroke and mouse click are preprogrammed forever. But it does mean that the child can exercise choices within safe limits set by parents or other responsible adults--and as a child learns more and develops greater maturity, those limits can be expanded in ways that allow greater freedom of choice and that may at some point entail greater risk taking as well. An example of a technology-moderated environment for Internet access might call for Internet access in young childhood that is explicitly limited to child-friendly content (i.e., access based on "white lists"), strongly filtered Internet access in middle childhood (i.e., access based on extensive "black lists"), less filtered Internet access in preadolescence and early adolescence (i.e., access based on a gradual reduction in the number of categories deemed inappropriate), and monitored Internet access in middle to late adolescence (i.e., unfettered Internet access accompanied by warnings about inappropriate material). Parents wishing to provide a more risk-free environment might delay the introduction of less restrictive measures; those wishing to promote a freer flow of information to their children might accelerate the introduction of less restrictive measures.
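To make the staged environment described above more concrete, the following sketch shows one way a family or a software tool might represent such a policy. It is a minimal illustration, not a product design or a committee recommendation; the age bands, category names, and settings are assumptions made for this example.

```python
# Illustrative sketch only: one way a household might encode the staged access
# policy described above.  Age bands, category names, and settings are
# assumptions for illustration, not recommendations from the report.
from dataclasses import dataclass, field

@dataclass
class AccessPolicy:
    mode: str                                   # "whitelist", "blacklist", or "monitor"
    blocked_categories: set = field(default_factory=set)
    warn_only: bool = False                     # warn about material instead of blocking it

def policy_for_age(age: int) -> AccessPolicy:
    if age < 8:      # young childhood: only explicitly child-friendly sites (white list)
        return AccessPolicy(mode="whitelist")
    if age < 12:     # middle childhood: strong filtering based on extensive black lists
        return AccessPolicy(mode="blacklist",
                            blocked_categories={"sexually_explicit", "violence",
                                                "gambling", "hate", "chat"})
    if age < 15:     # preadolescence: fewer categories deemed inappropriate
        return AccessPolicy(mode="blacklist",
                            blocked_categories={"sexually_explicit", "gambling"})
    # middle to late adolescence: unfettered access, but monitored and with warnings
    return AccessPolicy(mode="monitor", warn_only=True)

print(policy_for_age(10).blocked_categories)
```

A more risk-averse family might shift these hypothetical age thresholds upward; one favoring a freer flow of information might shift them downward, as the preceding paragraph notes.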
Second, technology can help to keep parents and other responsible adults informed about what their children are doing online. Of course, the circumstances under which such information is obtained matter quite a bit. An intent to provide guidance that helps the child make informed and responsible choices relies on openness about the fact that monitoring is taking place. By contrast, an intent to "catch" the child doing something wrong is likely to result in behavior that simply reduces the flow of information to the parent, such as the child obtaining Internet access in a venue that is not monitored.

Third, technology offers many desirable benefits: speed, scalability, standardization, and reduced cost. Because non-technical approaches take valuable time from parents, teachers, and others, time that is badly needed to address many other issues related to raising responsible young people, it will often be the case that a mix of non-technical strategies and technology-based tools provides the most cost-effective way to protect children on the Internet.

Choosing the right combination of social and educational strategies and technology-based tools depends a great deal on the nature of the problem that parents, teachers, and librarians are trying to solve. For example, recall from Chapter 8 that deliberate access and inadvertent exposure to inappropriate sexually explicit materials pose different protection problems. While responsible adults have to deal with both problems, certain tools and certain strategies are more appropriate for the former than for the latter, and vice versa. Tools that warn of impending exposure to inappropriate material, rather than blocking it, are better suited to dealing with the problem of inadvertent exposure. One important point arising from the preponderance of males relative to females as consumers of adult-oriented sexually explicit material (Chapter 3) is that adolescent males are probably much more likely to seek out such material deliberately. If so, social and educational strategies that aim at reducing the desire to seek out such material may well be more relevant to male children than to female children.

As for the role that public policy can play in protecting children, regulation may help to shape the environment in which these strategies and tools are used by reducing, at least to some extent, the availability of inappropriate sexually explicit material on the Internet. Public policy can help to influence the adult online industry to take actions that better deny children access to its material and/or influence others in the private sector to support self-regulatory or self-help measures. Furthermore, through prosecution of violators of existing laws that prohibit the transmission of obscene material, public policy can help to some extent to reduce the number of providers of such materials. Successful law enforcement depends on many factors, including appropriately formulated statutes, adequate resources, and a willingness to enforce existing law and regulation. In the Internet safety arena, the participation of citizens (e.g., in reporting illegal activity) is also an essential element of law enforcement. Proper training of law enforcement personnel--including those who take complaints, those who investigate complaints, and those who prosecute cases--at federal, state, and local levels remains critical to effective law enforcement.
Finally, public policy (or the threat of regulation) can also encourage and promote self-regulatory efforts that contribute to certain public policy goals, such as Internet service providers taking down materials posted in violation of the terms of service to which users agree as a condition of use.

Coping with non-commercial sources presents different issues to law enforcement authorities. In order to attract customers, commercial sources must draw attention to themselves, which means that their activities are hard to conceal from law enforcement authorities. But non-commercial sources such as peer-to-peer file-sharing networks may present no easily accessible target for legal action (and under many circumstances can operate invisibly to law enforcement). To address non-commercial sources, law enforcement officials must conduct what amount to "sting" operations.4 Such operations are controversial, are personnel-intensive, and may not offer large leverage, since non-commercial sources are likely to have significantly smaller audiences than commercial sources.

An accompanying box illustrates how social and educational strategies, technology tools, and public policy can work together.
After more than a year of intensive study of the issue, the committee was struck by its extraordinary complexity. Such complexity manifests itself in many ways, but nowhere more prominently than in understanding the trade-offs involved in the development of any comprehensive approach to protecting children on the Internet from inappropriate materials and experiences. The nature of trade-offs is such that doing "better" with respect to one goal implies doing less well with respect to some other goal or goals. In the present instance, any approach that improves protection for children and youth from inappropriate Internet material or experiences is almost certain to have a negative impact on other values or goals that most parents or communities would generally find appropriate and desirable. Note that these latter values and goals may also be associated with children and the Internet.

Two points must be made about the existence of trade-offs. First, the existence of such trade-offs is not an argument, per se, against attempts to "do better" at protecting children and youth from inappropriate Internet material or experiences. Second, the fact that trade-offs exist for any given method suggests that a mix of methods may well be more effective than exclusive or primary reliance on any one method. To illustrate, the next few sections discuss the trade-offs that decision makers must address in considering the use of social and educational strategies, technology-based tools, and public policy actions.
The committee has identified social and educational strategies to teach children and youth how to make good decisions about using the Internet as foundational for any approach to protection. But socialization and education are inherently processes that operate over a long time scale--thus, they cannot be expected to demonstrate immediate results. Furthermore, they are not simple to implement, and they require forethought, planning, and extensive follow-through. They can be costly, both in terms of dollars and in terms of time.

Perhaps the most important trade-off associated with using social and educational strategies is that they may conflict with other pressing social and educational needs. For example, most K-12 curricula are already overloaded, and information and media literacy curricula must compete for time in the schedule with physical education, art, music, sex education, consumer literacy, and a variety of other pressures on the curriculum. Education in these areas is also important, and passionate advocates can be found for all of them. Because the amount of time in a curriculum is more or less fixed, there are only three possibilities. One is that something must be removed if something else is added, and the elimination of any given subject area is always controversial, if only because the importance of one subject area must be weighed against the importance of another. A second possibility is that by increasing the efficiency of education in the existing areas of study (so that the same ground can be covered in shorter amounts of time), time can be made available to add information and media literacy. But increasing efficiency is an enormously difficult problem, and in practical terms, it is not clear how to do so. The third logical possibility is to obtain the needed time for information and media literacy by trimming the instructional time devoted to existing subject areas. The risk in this approach is that coverage of those latter areas may become inadequate as a result.

The dilemma is no easier to resolve in the family context, where family time together is at a high premium in many families. Parental efforts to supervise children and youth using the Internet must compete with making sure that children clean their rooms, do their homework, get to the soccer or basketball game on time, avoid unhealthy use of drugs and alcohol, work part-time jobs, and so on. The difference in knowledge about technology and its uses between many parents and their children further inhibits informal candid discussion about issues related to the Internet. Siblings and friends charged with providing peer assistance may also play tricks on children to get them into trouble.

Finally, educational strategies to teach children and youth to refrain from seeking out inappropriate sexually explicit materials face a powerful challenge in that many adolescents, especially boys, are highly motivated to seek out such materials. Some children do let educational efforts "roll off their backs" and are not influenced by them, and although education and socialization are the only approaches that have the chance of reaching all or most children or that have any chance of reducing a child's desire for such materials, the expectations for such education and socialization should not be unrealistic.
One technology trade-off is illustrated in the balance that users of technology-based tools must strike between two concerns--shielding the child from inappropriate material and experiences versus enabling child-centered control over the flow of information to him or her. One can increase shielding--but only at the expense of reducing a child's discretionary control over, and access to, information.

A related trade-off is the issue of false positives and false negatives. While false positives and false negatives in general trade off against each other even when human beings are actively involved in making judgments about the appropriateness of material to be shown to children, the trade-off is most stark when technology is used to assess appropriateness, and a fundamental reality is that today's technology does not provide an accurate and inexpensive way to assess content. As a general rule, increasing the probability that a device or system will identify inappropriate sexually explicit material as such also increases the probability that some material that is not inappropriate will be improperly identified as inappropriate. If false positives are generally tolerable (i.e., one is generally willing to pass up useful information as the price of protecting against inappropriate material, as might be the case for many risk-averse parents), then the automated assessment of content does have significant utility.

Still another technological trade-off arises from tensions between flexibility and ease of use. Products that are flexible can be customized to the needs of individual users, and most people say that they want flexible products. Yet a highly customizable product involves many decisions for a user to make, and many users find the actual exploitation of a product's flexibility to be a chore. As a result, the most common use of any technology tool is in its default "out of the box" configuration. Thus, for practical purposes, it is fair for any assessment of a tool to place great weight on what the tool does out of the box. Against that standard, many tools for protecting children from inappropriate Internet material and experiences place far greater emphasis on labeling Internet material and experiences as inappropriate than on avoiding the mislabeling of appropriate material as inappropriate. That is, if material might be inappropriate, it is labeled and flagged as such. Filters thus favor overblocking over underblocking, while monitors are likely to flag more material as questionable rather than less.

This systematic bias in favor of overcaution provokes anxiety in many people. For example, to the extent that technology-based tools are preferred instruments of government policy, concerns arise that government may be implicitly endorsing a lesser degree of information flow to the public--and to children--than would be the case in the absence of technology-based tools. It is also an empirical observation that many tools on the market identify certain kinds of information as inappropriate for children in ways that lead their supporters and advocates to be perceived as supporting an underlying political agenda. Finally, political pressures and mandates to deploy technology-based tools distort the market, in the sense that they create artificial demand for solutions to problems that communities do not perceive to exist to a significant degree (if they did, they would deploy these tools without such a mandate).
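A toy example can make the false positive/false negative trade-off concrete. The sketch below does not describe any actual filtering product; the "explicitness" scores and thresholds are invented for illustration.

```python
# Toy illustration (not any actual filtering product): a single score threshold
# trades false negatives against false positives.  Lowering the threshold blocks
# more truly inappropriate pages (fewer false negatives) but also blocks more
# appropriate pages (more false positives).  Scores below are hypothetical
# "explicitness" scores produced by some classifier.
inappropriate_scores = [0.92, 0.81, 0.77, 0.65, 0.40]   # pages that should be blocked
appropriate_scores   = [0.70, 0.55, 0.30, 0.20, 0.05]   # pages that should pass

def rates(threshold: float):
    false_negatives = sum(s < threshold for s in inappropriate_scores)   # missed
    false_positives = sum(s >= threshold for s in appropriate_scores)    # overblocked
    return false_negatives, false_positives

for t in (0.9, 0.6, 0.3):
    fn, fp = rates(t)
    print(f"threshold={t}: missed inappropriate pages={fn}, wrongly blocked pages={fp}")
# threshold=0.9 -> misses 4, blocks 0;  threshold=0.3 -> misses 0, blocks 3.
```

Setting the threshold low enough to catch nearly everything inappropriate inevitably blocks some appropriate material as well, which is the bias toward overblocking described above.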
Even the protection that technology offers with respect to shielding children and youth from inappropriate material involves a potential trade-off. To the extent that technology-imposed limits on choice work as intended (i.e., to block rather than merely discourage access to material that may be inappropriate), children and youth lose an opportunity to exercise responsible choice, and hence an associated chance to learn how to make responsible decisions. For younger children--who tend to be less skilled at decision making than older children--the consequences of bad choices may be more serious, because of their relative inexperience about life and their greater impressionability; for such children, opportunities for unconstrained decision making and choice are not appropriate under most circumstances. On the other hand, children who develop internal standards of appropriateness can be safer in those situations in which they--or their parents and guardians--cannot rely on technology to protect them. This is important given the increasing ubiquity of Internet access points in many venues. When responsible and respected adults and mentors talk with these older children about responsible decision making and establish sanctions for inappropriate choices, they create an environment that encourages and supports responsible choice, which in turn is likely to be conducive to the development of positive habits. Such habits are most important as adolescents reach the age of majority, when they will have the rights of adults and face no constraints on Internet access. On this point, parents must decide how to proceed.
As noted in Chapter 9, the viability of many public policy proposals depends on how policy makers make certain trade-offs between the goal of helping to shield children and youth from inappropriate sexually explicit material on the Internet and other desirable societal goals. For example, the committee believes that spam containing material that is obscene with respect to minors should not be sent to children. But laws banning such e-mail to minors are potentially problematic in an online environment in which it is very difficult to differentiate between adults and minors. (Indeed, regulation that targets a certain type of content (namely, sexually explicit speech) is inherently more suspect on First Amendment grounds than are proposals that regulate speech independent of content.) On the other hand, a ban on all spam regardless of content may be seen as too broad because it affects many other interests--for example, those of parties with a commercial interest in using e-mail channels to advertise non-sexual goods and services.

The committee also believes that it would be desirable for adult Web site operators who exhibit material that is obscene with respect to minors to use age verification systems so that children would not be able to access such material. However, in an online environment in which it is very difficult to differentiate between adults and minors, it is not clear whether denying access based on age can be achieved in a way that does not unduly constrain the viewing rights of adults. Thus, as one illustrative example from Section 9.3.2, the government might offer a grant of immunity from prosecution under obscenity laws to Web site operators that use age verification systems to prevent minors from accessing such material.5 In this instance, the trade-off is helping to protect children from exposure to certain kinds of inappropriate sexually explicit material (such a measure would help to reduce the inadvertent discovery of such material on commercial Web sites) in return for limitations on possible obscenity prosecutions.

Aggressive enforcement of obscenity laws also presents trade-offs. Increased prosecution of obscenity would likely require increased resources, and those resources must be taken from some other activity. If, as is likely, the other activity is the prosecution of other crimes, policy makers must make the judgment that it would be wise to pursue more obscenity prosecutions rather than other criminal prosecutions, or, if additional resources are available, that more obscenity prosecutions would be the best use of those resources. Such judgments are complex and require a careful weighing of many competing factors well beyond the scope of this report.
In the committee's judgment, the bottom line on reducing the exposure of children to inappropriate material and experiences on the Internet is that those who rely exclusively, or even primarily, on technology and public policy will find that the resulting protection rests on uncertain and shifting ground--and is likely to fail their children when exposure to inappropriate material or dangerous situations occurs. If one installs tools and/or passes legislation in the hope that they will "take care of the problem," and that in doing so one's responsibilities will thus be adequately discharged, children are highly likely--eventually--to encounter inappropriate material or experiences on the Internet in some venue, whether by accident or on purpose. And such an encounter will come as a disturbing surprise to the parent, teacher, librarian, or public policy maker who feels that he or she has done all that needed to be done. In the end, responsible choice--which is foundational for safe Internet use by children--is closely tied to the values that parents and communities wish to impart to their children and that influence judgments about the proper mix of education, technology, and public policy to adopt. An accompanying box describes some of the behavioral aspects of Internet safety for children that families, schools, and libraries might wish to teach.
Parents have a primary responsibility for guiding children into maturity. They have responsibilities to their children in the physical world, and they have corresponding responsibilities in cyberspace. With respect to Internet usage, developing maturity implies the ability to make safe, responsible, and morally appropriate choices about what to do and what to see on the Internet, and a facility for coping constructively with inappropriate or objectionable experiences. An accompanying box describes one possible "best practices" scenario focused on the home. In addition, parents might wish to keep the following points in mind.
Finally, the committee believes that a parental lack of knowledge about Internet culture and the diversity of possible Internet experiences has been--and continues to be--a source of both complacency (because some do not know what on the Internet may be of concern to them and their children) and excessive fear (because they do not know enough to be able to place these dangers in proper perspective). For this reason, parental education about the Internet continues to be an important part of a comprehensive program of Internet safety education for children.
Teachers have a responsibility for educating students and for providing a safe environment in which learning can occur. Libraries have a responsibility for providing the communities they serve with a broad range of useful materials appropriate for their needs. Accompanying boxes describe possible "best practices" scenarios for schools and libraries, respectively. In addition, teachers, school administrators, and librarians might wish to keep the following points in mind.
Various components of industry can make a major contribution to the Internet safety of children. The segments of industry relevant to the issue include ISPs and online service providers, makers of access devices such as personal computers, software vendors, content providers, and the adult online industry.
Finally, the information technology industry should not be discouraged from undertaking serious technology-based efforts to help parents and other responsible adults improve and enhance the safety of their children's Internet experiences and to reduce the amount of inappropriate material to which they may be exposed. Indeed, as a primary beneficiary of the Internet age (as well as being instrumental in creating it), the information technology industry arguably has a special responsibility to help safeguard children on the Internet. Developing more discriminating filters, enabling parental controls, supporting research of the type described elsewhere in this report, and adopting and promoting labeling schemes are a few of the ways that the IT industry has sought to discharge its responsibilities in this area, and a further enhancement and strengthening of these types of efforts can only expand the range of options that parents and other responsible adults can exercise.
Public policy--at the local, state, and federal levels--helps to shape the environment in which Internet access occurs. But because public policy actions are--by definition--pervasive throughout the community to which they apply, public policy makers must proceed judiciously. Public policy actions are most effective when they are based on reliable science rather than anecdote, and when they reflect a strong social, ethical, and moral consensus. For example, the sentiment that child pornography and sexual molestation of children are wrong is shared by people across a very broad spectrum of political views. Thus, it is reasonable to say that these are serious national problems, and addressing them continues to be an important task for the nation. Furthermore, the scale of these problems--already large--is increasing, and in any event outstrips the resources available to deal with them.

By contrast, public policy makers should tread lightly when it comes to other areas in which a consensus is not so apparent. For example, the committee heard from parents who did not trust the federal government to take actions to reduce children's Internet exposure to inappropriate materials. The striking aspect of this sentiment was that it was expressed by both conservative and liberal parents.

Public policy makers should also be wary of cheap, easy, or quick solutions. As the discussion in Chapter 12 on filtering demonstrates, such "solutions" may not fix the problem that they seek to solve--at least not to the extent that they would enable resources and attention to be turned elsewhere. It is true that the cost of social and educational strategies appears at first blush to be considerably larger than the cost of protective technologies--but the benefits that accrue are also correspondingly higher. Students will be more able to avoid problematic experiences and material of their own volition, and will be better able to cope with them when they occur.

Finally, it is necessary to underscore the fact that public policy can go far beyond the creation of statutory punishment for violating some approved canon of behavior. Certainly, legal sanctions are one possible public policy option, and such sanctions act both to punish those who behave in a way contrary to law and to deter others from conducting themselves in a similar way. Options such as more vigorous prosecution under existing obscenity laws are discussed at length in Chapter 9. But public policy can be used much more broadly and can shape the Internet environment in many ways. For example, public policy can be used to:
Finally, makers of public policy must keep in mind the international dimensions of the Internet. This does not mean that U.S. actions should not be undertaken, or that they will be wholly ineffective, but expectations for the impact of such actions must necessarily be moderated compared to the case in which the United States is the only significant actor.
As the length of this report suggests, the problem of protecting children from inappropriate material and experiences on the Internet is complex. Reliable information in a number of areas is needed. Indeed, throughout its work, the committee was concerned about the lack of reliable and valid science-based information for many dimensions of the problem it was addressing. Such information would have helped to strengthen committee deliberations.
The Internet offers enormous potential to enhance the intellectual, educational, social, and personal development of children. Those who take actions to address the concerns described above must bear in mind the potential benefits that the Internet offers. Thus, any "appropriate" mix of actions should be seen as balancing competing goals and values rather than endorsing the absolute supremacy of any one goal or value. Furthermore, evolution with respect to technology and the e-business environment, as well as possible changes in community standards governing obscenity, means that there are no foreseeable technological "silver bullets" or single permanent solutions to be crafted. Rather, any approach adopted to protect children must adapt to changing circumstances. While technology and public policy have important roles to play, social and educational strategies that impart to children the character and values to exercise responsible choices about Internet use, and the knowledge of how to cope with inappropriate material and experiences, are central to promoting children's safe Internet usage.
Notes

1. In this context, responsible behavior refers to actions taken to reduce the likelihood that children will obtain access to inappropriate sexually explicit material. To illustrate, one method of inducing Web site operators to act responsibly is to establish codes of behavior to which they must adhere under pain of government enforcement actions (whether civil or criminal). By definition, such an approach requires government action, and with a plethora of operators, the likelihood of being the target of government action is very small--hence the number of operators must be reduced to a "sufficiently small" number. A second illustration of inducing Web site operators to act responsibly is to create disincentives for irresponsible behavior. In this method, the key is to associate disincentives with a large number of parties so that irresponsible Web site operators will feel the pressure of those disincentives, for example, by establishing causes of action allowing those affected by irresponsible behavior to take action against such operators. (One example in a different domain is the establishment of liability (and an associated bounty) for junk faxes, an action that dramatically reduced the number of such faxes.) Whether or not these or other actions can in fact reduce the number of Web sites to a "sufficiently small" number is an open question, especially in a context in which U.S. actions are unlikely to affect Web sites operated by foreigners. Note that these illustrations are just that--illustrations--and their inclusion in the report is not intended to signal endorsement or rejection by the committee.

2. The social consensus is strongest when children are used to create sexual imagery. However, the breadth of the legal definition of child pornography has also led to the attempted prosecution of works of art that involve children in various states of nudity (e.g., the works of Jock Sturges), and it is fair to say that there is less of a social consensus around such material.

3. Are there any circumstances under which involuntary exposure might be beneficial? Perhaps. Consider a situation in which discussions about sex made a child uncomfortable. It might still be a reasonable thing for a concerned parent to have a conversation about sex with his or her child. Needless to say, this kind of situation does not occur frequently in the context of Internet media.

4. One example of a "sting" operation is an exercise in which a law enforcement official assumes an online identity corresponding to that of a minor, and engages potential predators seeking to entice a minor for the purpose of initiating a sexual encounter. A second example is the use of customer lists belonging to an online service that provided child pornography, lists confiscated after the service itself was convicted of child pornography charges. In this latter case, known as Operation Avalanche, law enforcement officials continued to operate the online service's Web site and sent e-mail to subscribers offering them the opportunity to purchase child pornography. Those who responded received controlled deliveries of child pornography made by investigators, and search warrants were executed on the residences of those customers immediately after the deliveries were made (see ABC News, 2001, "An Avalanche of Child Porn: Investigators Use Subscription List to Track Down Pedophiles," November 14, available online at <http://abcnews.go.com/sections/business/TechTV/TechTV_Avalanche_Porn_011114.html>).

5. More specifically, immunity for adult Web site operators from prosecution under obscenity laws would be granted if (a) they take actions to prevent their content from being indexed by search engines, and (b) they provide a plain-text front page that warns users that the site they are about to enter has adult-oriented content and an "exit" button to a child-friendly site if the user acknowledges being under 18.

6. U.S. Department of Education. 2001. Teachers' Tools for the 21st Century: A Report on Teachers' Use of Technology. Available online at <http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2000102>.

7. Non-industrial support for such research may be justified on the grounds that, as discussed in Chapter 12, there is little market incentive for more accurate methods for identifying sexually explicit materials (and hence for more accurate filtering).

8. Note that more effective technology in this area could be used for the benefit of those who want to block such material and of those who want to search more precisely for it.

9. See, for example, <http://disney.go.com/legal/internet_safety.html>, <http://www.safekids.com/kidsrules.htm>, and <http://www.getnetwise.org/safetyguide/>.