This chapter provides a general framework for understanding what it means to protect youth from inappropriate material. Part of the complexity of the task is seen in the observation that "protect" and "inappropriate material" are terms without clear and unambiguous definition.
The determination that particular material is inappropriate for children begins with a human judgment. The judging party can be a parent, a teacher, a librarian, the child himself or herself, the creator of the material, the carrier (distributor) of that material, a third-party rating service, or a government agency. Given a particular universe of material (e.g., a set of images), it is likely that any group of judges will agree that some material is "appropriate" and some "inappropriate," and that there will be some material about which the judges disagree. Of course, depending on the particular judges involved, the term "inappropriate" could cover a very wide variety of material, including some or all sexually explicit material, bomb-making recipes, extremist material, birth control information, hate sites, and the platforms of particular political or social groups. Indeed, judgments about inappropriateness are closely tied to the values of those making the judgments.

The general pattern would not change even if judgments were confined to a specific area such as sexually explicit material. Some material would be unanimously classified as inappropriate, some would be unanimously classified as not inappropriate, and some would be indeterminate. Judges drawn from different segments of the U.S. public would disagree strongly about whether certain materials were inappropriate for children, and what is obscene, or obscene with respect to minors, in California may well differ from what is obscene, or obscene with respect to minors, in Tennessee, because the community standards of the two states may differ. In other words, as the Supreme Court has made clear, the determination that material is obscene, or obscene with respect to minors, depends on criteria that are extrinsic to the material itself.

In the absence of universal criteria, the most relevant standard for validity or accuracy is the standard of the individual(s) responsible for making decisions about inappropriateness on behalf of the child. Thus, it is often said that the best protection against children being exposed to inappropriate material on the Internet is the presence and guidance of a responsible parent or guardian while the child is using the Internet--the reason is that when such an adult is involved, his or her standards can be trusted as the basis for good judgments about the child's welfare.

When the presence and guidance of a responsible parent or guardian during Internet use are not possible, protection depends on the judgment of a proxy, and that judgment must be evaluated against the standard of the responsible adult who would otherwise make such decisions for the child. The proxy may be another adult such as a teacher acting in loco parentis, the board of a public library system, another trusted adult, an Internet service provider, a developer of filtering software,1 a local jury deciding an obscenity case, the U.S. Congress, or a state legislature. But whatever the proxy, the validity of the proxy's assessments about inappropriate material is indicated by their consistency with the judgments of the party responsible for deciding on behalf of the child. Different proxies have different methodologies for determining inappropriateness: a proxy that uses a single reviewer is likely to exhibit different consistency than an agency that uses a hundred reviewers.
The former is likely to have higher reliability than the latter, while the latter is more likely to establish a standard for accuracy or validity that reflects the larger population from which its reviewers are drawn. Moreover, even if agreement can be obtained on particular content (e.g., that a given image is inappropriate), it is hard to define clear rules for identifying "similar" material in a consistent manner. Put differently, it is difficult to articulate a set of rules that clearly defines a specific class of "inappropriate" material that is both sufficiently inclusive (includes most or all material that is inappropriate) and sufficiently exclusive (excludes most or all material that is appropriate).

Given the need for proxies, how does a proxy make reasonable judgments about what sexually explicit material may be inappropriate for viewing by children at any age? To answer this question, it is important to distinguish conceptually between reliability (or consistency) and validity. The most common illustration of the distinction is that of a shooter aiming at a target and firing multiple rounds. Reliability is a measure of the repeatability of the shooter's aim: a tight cluster of shots indicates high repeatability (reliability). However, reliability says nothing about how close that cluster is to the actual target. Validity indicates the accuracy of the shots: a small distance between the center of the cluster and the center of the target indicates high accuracy (validity). Validity means that one is measuring what one claims to be measuring. A highly reliable measure may not be valid, and a highly valid result may not be reliable.2

In determining whether certain material is inappropriate, reliability refers to the reproducibility of an assessment that a given piece of material is inappropriate. To test for reliability, one might ask an individual to evaluate a collection of materials several times under different circumstances. If the individual's judgments were generally consistent, he or she would be said to be a reliable judge. Because the same person may classify a given piece of material differently depending on his or her mood, the day, or what else he or she has seen or read in the interim, an individual's reliability as a judge cannot be taken for granted. Reliability is even harder to achieve when multiple people are involved in judging content. Different people may classify a given piece of material differently even if they are ostensibly using the same rules for classification. Thus, to determine the reliability of a group of judges, one might ask them to evaluate a collection of materials; if their judgments were generally consistent with one another, the group would be said to make reliable judgments. Key to this process is an agreed-upon operational specification of how to make a judgment.

This definition of reliability implies that machine assessment (a given version of software running on a given computer) is inevitably more reliable (more consistent) than human assessment--given a strictly algorithmic process, the reliability of machine assessment is 100 percent.3 The reliability of human assessment is likely to be considerably lower, even when the same rules of classification apply to all assessments, simply because people interpret the same rules differently.
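The distinction can be made concrete with a small numeric sketch. The following Python fragment (illustrative only; the shot coordinates are invented for this example) quantifies the shooter analogy above: reliability as the spread of a cluster of shots around its own center, and validity as the distance from that center to the bull's eye.

```python
# A minimal sketch quantifying the shooter analogy: reliability is the
# tightness of a cluster of shots; validity is how close the cluster's
# center lies to the true target. All coordinates are hypothetical.
import math

def cluster_center(shots):
    """Mean (x, y) position of a list of shots."""
    n = len(shots)
    return (sum(x for x, _ in shots) / n, sum(y for _, y in shots) / n)

def reliability_spread(shots):
    """Average distance of each shot from the cluster center.
    Smaller spread = more repeatable (more reliable)."""
    cx, cy = cluster_center(shots)
    return sum(math.hypot(x - cx, y - cy) for x, y in shots) / len(shots)

def validity_error(shots, target=(0.0, 0.0)):
    """Distance from the cluster center to the target center.
    Smaller error = more accurate (more valid)."""
    cx, cy = cluster_center(shots)
    return math.hypot(cx - target[0], cy - target[1])

# A tight cluster far from the bull's eye: reliable but not valid.
tight_but_off = [(3.0, 3.1), (3.1, 2.9), (2.9, 3.0), (3.0, 2.9)]
# A loose cluster centered on the bull's eye: valid on average, unreliable.
loose_but_centered = [(-2.0, 1.5), (2.1, -1.4), (-1.8, -1.6), (1.7, 1.5)]

for name, shots in [("tight/off-target", tight_but_off),
                    ("loose/centered", loose_but_centered)]:
    print(f"{name}: spread={reliability_spread(shots):.2f}, "
          f"error={validity_error(shots):.2f}")
```

Run on these invented data, the first judge shows a small spread (high reliability) but a large error (low validity), and the second shows the reverse--the same pattern the text describes for human and machine assessments.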
Validity is a more problematic concept because there is no universally accepted standard (no "gold standard") for what counts as inappropriate. In other words, there are no universally accepted criteria that define whether something is obscene or inappropriate for viewing by children. These comments do not exclude the possibility of broad agreement across many possible decision-making parties in a particular instance--child pornography, for example--just as even the worst-made rifle can make bull's eyes on a target with a very large bull's eye. But the accuracy of a rifle is best tested against a target with a very small bull's eye, so that variations in accuracy can be readily observed. Similarly, the validity of a process for making judgments about inappropriateness is best indicated by examining judgments of "questionable" material about which people do not hold similar views.

Reliability and validity (as measured from the perspective of a responsible adult) are the primary determinants of a "good" proxy for that adult. Because different proxies have different methodologies, an individual faced with the responsibility for deciding inappropriateness for a child can, in principle, choose to adopt the approach used by a trusted proxy. Thus, an individual (e.g., a parent) might choose to trust the judgments of Disney, the American Civil Liberties Union, the Christian Coalition, or another organization with a well-known reputation for espousing a particular set of views or values.
Given the general principles described above, three methods can be used in practice to identify inappropriate material. The decision-making party, whether machine or human, can examine the content itself, rely on labels that others have attached to the content, or consider the identity of the source from which the content comes.
A second dimension of identifying inappropriate material in practice is a consequence of the size of the Internet. The volume of information on the Internet is so large--and changes so rapidly--that it is simply impractical for human beings to evaluate every discrete piece of information for inappropriateness. Moreover, the content of some existing Web pages changes very rapidly, and new Web pages appear at a very rapid rate.5 This fact leads to one of two approaches.

One approach is based on an automated, machine-executable process for identifying inappropriate content (generally used for evaluating content "on the fly"); such processes are less expensive than manual screening, but as a rule they result in judgments that are less valid than those of human evaluators. Automated approaches must be based on machine-executable rules abstracted from human judgments--such abstraction inevitably misses nuances in those judgments and so reduces validity relative to human evaluation. Reliance on labels depends on the labels' accuracy, and only a human's action binds a particular label to a given piece of material. And the identity of a source may not be an adequate indicator of the material coming from it.

The second approach is the human-performed explicit identification of specifically appropriate material and the exclusion of everything else (or its reverse--the explicit identification of specifically inappropriate material for exclusion and the inclusion of everything else). This approach is expensive. The first variant runs the risk that large volumes of appropriate material will be excluded, but is likely to result in a very low fraction of inappropriate material being displayed. The second variant captures only some (perhaps a large fraction) of the inappropriate material on the Internet, but allows a large volume of appropriate material to pass.

In practice, a combination of approaches can also be used--an automated search for material that might be inappropriate, with a human judgment making the final determination. In this case, inappropriate material not identified in the automated search never reaches a human decision maker and so is mistakenly treated as appropriate. To the extent that the threshold of identification is set so that a greater volume of possibly inappropriate material is flagged (so as to reduce the likelihood that inappropriate material slips through unexamined), the volume of material subject to actual human judgment increases--in which case the issues regarding human review discussed above obtain.
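As a rough illustration of this combined approach, consider the following Python sketch. The keyword list, weights, threshold values, and sample pages are hypothetical stand-ins for a real automated classifier; the point is only to show how lowering the flagging threshold catches more borderline material at the cost of sending a larger volume of pages to expensive human review, while anything scoring below the threshold is passed without ever being examined.

```python
# A minimal sketch of combining automated triage with human review.
# Pages scoring at or above the threshold go to a human reviewer;
# pages below it are never seen by a human and pass by default.
# The suspect terms and their weights are entirely hypothetical.
SUSPECT_TERMS = {"xxx": 3, "adult": 2, "explicit": 2, "mature": 1}

def suspicion_score(text):
    """Crude automated score: sum of weights of suspect terms present."""
    words = text.lower().split()
    return sum(w for term, w in SUSPECT_TERMS.items() if term in words)

def triage(pages, threshold):
    """Split pages into those flagged for human review and those passed."""
    flagged = [p for p in pages if suspicion_score(p) >= threshold]
    passed = [p for p in pages if suspicion_score(p) < threshold]
    return flagged, passed

pages = [
    "explicit adult material xxx",           # clearly flaggable
    "mature discussion of adult topics",     # borderline
    "homework help for middle school math",  # benign
]

# Lowering the threshold flags more borderline material but increases
# the workload placed on the (expensive) human reviewer.
for threshold in (4, 2, 1):
    flagged, _ = triage(pages, threshold)
    print(f"threshold={threshold}: {len(flagged)} page(s) for human review")
```

At the strictest threshold only the first page is reviewed and the borderline page is silently passed; at the loosest, the reviewer sees both--exactly the volume-versus-coverage trade-off described in the text.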
In the context of "protecting youth from inappropriate material or experiences," the term "protect" has a number of plausible definitions. "Protection" can mean shielding children from deliberate exposure to inappropriate material, shielding them from inadvertent or accidental exposure, or teaching them to cope constructively with such material should they encounter it.
The approaches needed to implement these definitions of protection differ but are not mutually exclusive. Protecting children from deliberate exposure is a very different enterprise--and entails very different costs--from protecting them only from inadvertent exposure, though approaches to each may share common elements. Further, teaching a child to cope constructively is likely to be helpful in the event that other protective measures fail. The primary difficulty in protecting children and youth against deliberate exposure is that many adolescents, especially boys, are highly motivated to seek out sexually explicit materials, including material ostensibly limited to adults. Absent a significant change in their desire for such materials, this motivation often enables them to overcome many obstacles in their path, whatever the nature of those obstacles.

"Protection" has other important dimensions. The extent to which a child can be protected is an important element of the debate. It is easy to say that 100 percent protection is impossible--but such a statement leaves open the question of what "80 percent protection" might mean. Does it mean that the child is exposed to inappropriate material 20 percent as often as another child who lacks such protection? Should the evaluator rely primarily on the number of incidents of exposure? Or are other aspects of the exposure, such as its nature or duration, important as well? An important challenge--as yet unresolved--is to articulate appropriate metrics of protection.

A key dimension of protection is the issue of false positives and false negatives. In deciding whether a given piece of material is appropriate or inappropriate, it is inevitable that some material will be designated as inappropriate when it should have been designated as appropriate--errors of this type are false positives. On the other hand, some material will be designated as appropriate when it should have been designated as inappropriate--errors of this type are false negatives. Perfect protection ("100 percent") is possible only at the cost of many false positives--and is in fact simple to provide. A cable with an air gap in it (or, equivalently, disconnecting from the Internet) provides 100 percent protection in the sense that it screens out all inappropriate material. But it blocks all appropriate content as well. In the real world, those wishing to provide protection (e.g., parents, school administrators, and so on) must balance the rate of false positives against the rate of false negatives.

A third dimension of "protect" might refer to measures taken to ensure that the child suffers no ill effects from being exposed to inappropriate material. This approach recognizes that absolute protection from inappropriate material is impossible to achieve (except at the cost of total disconnection) and that nurturing a child who is able to place inappropriate material into context, to evaluate and make good judgments about it, and to deal with it appropriately is an important element of the overall goal of keeping children from harm.
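The trade-off between false positives and false negatives described above lends itself to a simple numeric illustration. In the following hypothetical Python sketch, the "true" labels stand in for the judgment of the responsible adult, a crude keyword filter exhibits both kinds of error, and the "air gap" filter achieves zero false negatives (perfect protection) only by blocking everything.

```python
# A minimal sketch of the false positive / false negative trade-off.
# The labels stand in for the judgment of the responsible adult; the
# filters and sample items are hypothetical.
def error_rates(filter_fn, items):
    """items: list of (content, truly_inappropriate) pairs.
    Returns (false_positive_rate, false_negative_rate)."""
    fp = sum(1 for c, bad in items if filter_fn(c) and not bad)
    fn = sum(1 for c, bad in items if not filter_fn(c) and bad)
    n_appropriate = sum(1 for _, bad in items if not bad)
    n_inappropriate = sum(1 for _, bad in items if bad)
    return fp / n_appropriate, fn / n_inappropriate

items = [
    ("explicit adult site", True),
    ("sex education resource", False),  # often misclassified by keyword filters
    ("cooking recipes", False),
    ("violent extremist forum", True),
]

def keyword_filter(content):
    # Crude filter: blocks anything containing obvious trigger strings.
    return "sex" in content or "adult" in content

def air_gap(content):
    # "A cable with an air gap": blocks everything, appropriate or not.
    return True

for name, f in [("keyword filter", keyword_filter), ("air gap", air_gap)]:
    fpr, fnr = error_rates(f, items)
    print(f"{name}: false positives={fpr:.0%}, false negatives={fnr:.0%}")
```

On these invented items the keyword filter wrongly blocks the sex education resource (a false positive) and passes the extremist forum (a false negative), while the air gap eliminates false negatives at the cost of a 100 percent false positive rate--the "disconnection" extreme discussed above.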
The relevance of these different definitions and dimensions of "protect" depends on the developmental maturity of the child. The range from birth to 18 spans a very broad developmental range in a psychological sense, and what may be developmentally inappropriate for a very young child may not be inappropriate for a teenager. It can be taken as a given that teenagers will be more interested in sexual matters than young children, and a site providing a detailed scientific description of human reproduction, for example, may be more appropriate for the former than for the latter.

Parents and other responsible adults may thus wish to consider developmental factors in weighing the costs and benefits of different approaches to protection. For example, at very young ages, one might be inclined to shield children from any Internet exposure at all (e.g., confining a very young child's computer experience to CD-ROMs). As the child matured, a next step might be to allow online experiences only in environments specially created for children. At a further stage of maturity, one might seek to educate the child in safe Internet use and use technology to screen out objectionable material. Finally, one might simply allow unconstrained Internet access while keeping an eye on the child's Internet activities. The assumption underlying such a progression is that as the child matures, his or her judgment and capacity for responsible action grow, and different kinds of "protection" are necessary. (All of these options are discussed in subsequent chapters.)

In the context of protecting children from pornography and perhaps other types of material as well, it is also important to distinguish between protecting children from being exposed to pornography (the subject of the discussion above) and protecting children from participating in pornography (the ultimate and primary rationale for laws that strictly proscribe child pornography). The discussion below and in Chapters 9 through 13 focuses primarily on protection from exposure, but addresses protection from participation as appropriate.
Actions to protect a child can be taken at a variety of times: before potential exposure to inappropriate material, at the moment of that exposure, and after that exposure. For example, as Chapter 10 discusses, various education and media literacy strategies may help the child decide not to seek exposure to inappropriate material. Such strategies must obviously be implemented before exposure--that is, the child must begin to learn (internalize) norms of appropriate conduct and behavior. At the moment of potential exposure, tools such as filters (described in Chapter 12) are relevant, because they can help to block exposure to such material. And after exposure, a variety of measures (e.g., getting help, reporting to a responsible adult) can help to mitigate any negative effects that may result.
Even within the same community, libraries and schools often have very different policies regarding the need for filtering. In some communities visited by the committee, schools had implemented filtering for all school-based Internet connections, whereas the public libraries in the same community had no filters. To understand why such differences might arise, consider the different purposes served by public schools and libraries. A public school serves the primary purpose of providing academic instruction to individuals who have not attained the age of majority. Parents send their children to school in the legally enforceable expectation that school personnel will take responsible care to protect them from harm and to provide adequate instruction (i.e., that they will act in loco parentis). By contrast, a public library serves the primary purpose of providing a broad range of information to the entire community in which it is based. This community includes children, but it also includes adults, and the information needs of the community--taken as a whole--are generally much more diverse than those of children and youth in school.6 Given these differing purposes, it is not at all unexpected that schools and libraries have different needs and might take different approaches in seeking to protect children and youth from inappropriate Internet material and experiences. The approaches of schools and of libraries must be assessed against this backdrop: some elements will be similar, but it is to be expected that others will differ because of differences in fundamental purpose.
Within this framework of different dimensions of "protection" and "inappropriate material," a range of approaches to the definitional process can be identified. One approach, rarely stated but often implicit as the motivating force behind certain policy positions, is the idea that a particular definition of "inappropriate" is appropriate for all communities. Underpinning this approach is the view, described in Box 7.2 in Chapter 7, that inappropriate material--and in particular, sexually explicit material--can be so dangerous that even a single exposure can have very harmful consequences for a child. Examples to support this point of view tend to be drawn from the more extreme sexually explicit material found on the Internet--not coincidentally, material that tends to arouse revulsion in a large segment of the population. Thus, "protection" of children must be as airtight as possible, and false positives that improperly classify appropriate material as inappropriate are preferable to false negatives that improperly classify inappropriate material as appropriate.7 Note also that if one believes that even one exposure of a child to such material is likely to have long-lasting and profoundly negative effects on his or her development, then nothing less than perfection suffices. For such individuals, technologies that seek to isolate and wall off a child from untoward influences are likely to have considerable appeal.

A second approach asserts that individual communities have the right (and obligation) to define for themselves what is inappropriate. Believers in this philosophy reject the notion that there are, or should be, universal standards of inappropriateness. To buttress their view, they tend to draw on examples of material that many people would not find inappropriate or offensive--information that can be characterized as more "mainline" than "extreme." They tend to doubt that exposure to sexually explicit material will damage most children, and thus are willing to accept false negatives as the cost of avoiding false positives. For those who believe in the resilience of children when given appropriate guidance, social and educational strategies that teach a child how to make good decisions about such material, and how to cope with it should exposure occur, are likely to have more appeal than approaches that rely primarily on technology.

These two approaches represent polar opposites; in practice, most people fall somewhere in between. That is, they might well say that some types of material should be universally prohibited regardless of community standards (e.g., child pornography), but conclude that for other types of sexually explicit material, communities should be free to decide for themselves.

A third approach, in general independent of those described above, focuses on empowering children to protect themselves. This approach calls for adult definition of inappropriate material (established either universally or community by community, but in any event ideally accounting for developmental maturity) but places the child rather than the adult in the role of primary decision maker. This does not mean that adults have no role, or that protective mechanisms are inappropriate, but rather that reliance on externally driven protective mechanisms can be reduced as the child's judgment develops.
(The appropriate analogy: externally driven protective mechanisms for a child whose judgment is developing and maturing serve the same role that training wheels serve for someone learning to ride a bicycle.) Under this approach, the goal is not to protect the child from exposure as an end in itself, but rather to educate the child to cope well with offensive or inappropriate materials. The approach is based on the idea that in a free and pluralistic society, one will inevitably be exposed to material that one finds disturbing, offensive, or distasteful, and that the best way to deal with such situations is to develop coping mechanisms through experience.

It is also important to consider the time scale over which politics is conducted. In one site visit to a state generally regarded as politically and socially conservative, discussions with those familiar with the local community revealed a difference between the politics associated with schools (all of which had filtering) and libraries (which did not). The membership of the school board had changed relatively suddenly, on the time scale of a year (one election), whereas the membership of the library board changed very slowly. Thus, filtering could be driven by new members of the school board, whereas library board membership was more stable--and changes to library policy took longer to implement in the absence of sustained and demonstrable public concern.
Chapters 9 through 13 address the challenges of protecting children against exposure to inappropriate sexually explicit material. This section introduces some general concepts for protection that underpin the more specific approaches:
It is important to note that the techniques discussed above are not mutually exclusive. For example, a message indicating that access to material has been blocked can be accompanied by suggestions for alternative, appropriate material. Indeed, combining techniques may enhance effectiveness; such instances must be determined on a case-by-case basis.
Society has responded in a number of ways to concerns about the possibility that children and youth might be exposed to inappropriate material. Examples include the V-chip for television, provisions allowing parents to remove their children from sex education classes in school, and formal processes for removing materials from public libraries.
However, these examples do not illustrate a social consensus on their appropriateness or desirability. For example, most parents do not use V-chips to control their children's television viewing, and most do not remove their children from sex education classes in school.10 In the case of removing materials from libraries, a formal hearing process often guarantees that removal will have a high political profile in the community involved, and constitutional issues arise as well.

Nevertheless, there is one example of action in this area that the committee believes provides a helpful analogy. Many communities deal with the issue of preventing minors from reading and viewing adult-oriented magazines available at a newsstand through a number of steps. Children are taught by parents not to seek out such magazines, and parents refrain from leaving such magazines lying around the house (social and educational measures). In stores, adult magazines are placed on the highest shelves, where they are harder for shorter people to reach (a social measure). They are sealed in plastic, so that browsing them in the store is more difficult (a technology measure). An opaque "belly band" obscures the picture on the front (another technology measure). The picture itself, while sexually suggestive, does not depict either overt sexual activity or the genitalia of the models (a public policy measure). A purchaser of such magazines must usually make the transaction face-to-face with the operator of the newsstand, who can generally distinguish between a 12-year-old boy and an adult and whose very presence helps deter some minors from even attempting such a purchase (another social measure). And the operator of the newsstand may be subject to prosecution under state and local laws if he knowingly sells an adult magazine to a minor (another public policy measure).

All of these measures combine into an approach to the sale of adult magazines that has not absolutely prevented minors from viewing their contents, but there is not much controversy about the approach, imperfect though it is. In other words, there is a rough social consensus that the approach is for the most part not unreasonable, either in its implementation or in its philosophy. It provides some protection, though not absolute protection, and the burden it places on magazine vendors and adult consumers is not particularly significant. Further, the protection it does provide is adequate for publishers and vendors of adult magazines to plausibly assert that they are taking reasonable steps to keep such material out of the hands of children.

How should one view approaches to protecting children from inappropriate sexually explicit material on the Internet? While the discussion of the corner newsstand is helpful in thinking about this issue, cyberspace is unlike a corner newsstand in many ways, and much of the controversy about inappropriate sexually explicit material on the Internet arises from those differences. Nevertheless, a reasoned approach to protection in cyberspace relies on the same three generic elements seen at the corner newsstand--social and educational measures, technology, and public policy.
There is broad consensus that the ideal approach to keeping kids safe on the Internet--in any dimension one wishes to consider--is the presence of a trusted, responsible adult providing guidance, pedagogy, and supervision, along with encouragement and understanding. But as a practical matter, many children are likely to have some degree of unsupervised access to the Internet or other online services (e.g., because continuous parental monitoring of children's Internet use is infeasible). While parents with the education, resources, and time to understand and deal with the problem may approach this ideal, others who lack these benefits also seek solutions to what they see as a real problem.

It is thus not unreasonable to consider a combination of protective elements that reinforce one another, though the closer one can get to the ideal described above, the better. For example, parents can place home computers in public areas of the house so that they can better supervise their children's Internet use. Schools and libraries can teach children to avoid behavior that can lead to inadvertent exposure to inappropriate sexually explicit materials. Technology can be used to create Internet environments for very young children in which all content is specifically designed to be age-appropriate and educational. Public policy measures may be able to reduce the amount of adult-oriented, sexually explicit material that is accessible without some kind of age verification. (All of these elements, and others, are discussed in the following chapters.)

The possibility of such synergistic effects should not be taken to mean that one particular combination of protective elements is right in all cases. Furthermore, no combination of protective elements will be free of trade-offs, both within the space of protecting children and youth from inappropriate sexually explicit material on the Internet and with respect to how this space interacts with other societal issues. Thus, the appropriate combination of protective elements will vary substantially across different communities and styles of parenting, and it is important that parents and communities retain the ability and authority to determine what is best for their children--and how to make those trade-offs--in accordance with their own values. Chapters 9 through 13 address a variety of options for public policies, strategies, and tools, and some of the benefits of coordinating their use.
Notes

1. The developer of software builds into a tool assumptions about what content is appropriate (and for whom) and also establishes the standard of accurate performance that the tool will meet.

2. For practical purposes, a variable with low reliability (i.e., with a wide spread) is hard to use for empirical work. However, it can still be highly valid in the formal sense of the term, even if it is not useful.

3. This statement must be qualified if the assessment software relies on elements that change over time. For example, the assessment software may rely on a database that is updated from time to time--with new or additional data, the results of the assessment of, say, a given image may well vary.

4. Sources of adult-oriented, sexually explicit material are generally consistent in their offerings--such content is available on their Web sites today and will be available tomorrow (if they are still in business). Yet the content on one such site is often generically similar to that on another, so "brand loyalties" that keep viewers coming back are hard for any one business entity to establish; the most successful commercial entities are those that have been able to establish such loyalties.

5. No data were available to the committee on the first point, but on the second point, the number of publicly accessible Web pages increased by 300 million in the period from July to October 2001 (according to Google statistics on the number of pages indexed).

6. For example, the position of the American Library Association is that "librarians believe in supporting a wide variety of information needs," and emphasis should be placed "on the patron's right to choose," an emphasis that is "consistent with [the library] profession's commitment to intellectual freedom." Further, "libraries rarely limit what can be read in a library. Librarians do not search patrons' book-bags for titles the library would not purchase, or police reading tables to see if patrons are reading materials consistent with local collection-development policies. In a similar vein, many libraries offer open access to the Internet, so that the patron may choose what to read." See <http://www.pla.org/publications/technotes/technotes_filtering.html>.

7. Child pornography is an area in which a similar philosophy has led to strong legal protections against the harms of the child sexual abuse that typically defines the material.

8. Note the existence of "gray" material--material that, while not exactly inappropriate for children, is regarded as not particularly helpful to them ("waste-of-time" material). For example, one could imagine some schools putting Web sites relating to games, sports, entertainment, and celebrities on a "black list" because time spent on such sites and with such material would not be regarded as having educational value. On the other hand, creative teachers may well be able to extract educational value from such sites--and blocking such material might reduce the educational opportunities available.
9. For example, Section 51550 of the California Education Code states that "No governing board of a public elementary or secondary school may require pupils to attend any class in which human reproductive organs and their functions and processes are described, illustrated or discussed, whether such class be part of a course designated 'sex education' or 'family life education' or by some similar term, or part of any other course which pupils are required to attend.... Opportunity shall be provided to each parent or guardian to request in writing that his child not attend the class, [and] no child may attend a class if a request that he not attend the class has been received by the school." Available online at <http://www.leginfo.ca.gov/cgi-bin/waisgate?WAISdocID=07839714985+0+0+0&WAISaction=retrieve>.

10. When given a choice, only 1 to 5 percent of parents remove their children from comprehensive sexual education classes. See Kaiser Family Foundation. 2000. Sex Education in America: A View from Inside America's Classrooms. The Henry J. Kaiser Family Foundation, Menlo Park, Calif.