This chapter presents a variety of legal and policy options that the committee considered during its deliberations on how best to protect children from inappropriate sexually explicit material on the Internet. It should be kept in mind, however, that the legal and policy environment today (as this report is being written) is highly fluid and rapidly changing: court cases are being heard, and legislation is pending in areas such as privacy, the elimination of spam, and the protection of children on the Internet. Furthermore, even if some of these legal and policy options are helpful for regulating sources of sexually explicit material within the United States, a substantial number of sources outside the direct enforceable jurisdiction of U.S. law will remain. This fact may well limit the success of U.S. legal and policy options for regulating such sources. A box accompanying this chapter provides an overview of the public policy options described in this report.
Despite all of the difficulties in defining obscenity described in Chapter 4, material deemed obscene is not protected by the First Amendment. Federal and state obscenity laws impose criminal and civil penalties on the production, distribution, and sale of obscene material. In recent years, however, obscenity prosecutions have been relatively rare. Such prosecutions can be very difficult, especially in the context of the Internet, where issues of community standards, jurisdiction, and international reach abound. Nevertheless, vigorous enforcement may help to persuade operators in the adult online industry to act more responsibly in denying access to children. It may also reduce to some extent the number of operators of Web sites carrying obscene material, by putting some of them out of business and by changing the cost-benefit calculus for other "marginal" operators who would choose to exit the business in a different regulatory climate.

Note that a reduction in the number of Web site operators providing such material is not likely to reduce greatly the ease with which one can find it. The reason is that search engines operate in such a way that a given search is more or less independent of the number of such sites. Thus, if there are 100,000 such sites today, a reduction to 10,000 sites would not significantly increase the difficulty of finding such material (or even reduce the likelihood of coming across it inadvertently).1 It is therefore likely that the second effect--persuading operators to behave more responsibly--would be larger in magnitude.

It is also important to note that the problem of defining which community's standards should govern obscenity prosecutions for material on the Internet is currently pending before the Supreme Court of the United States in Ashcroft v. ACLU. In that case, the United States Court of Appeals for the Third Circuit held the Child Online Protection Act of 1998 (COPA) unconstitutional on the ground that Internet material that is allegedly obscene for minors cannot be judged on the basis of local community standards. The Court of Appeals reasoned that on the Internet, unlike other forms of communication, material is immediately available in all jurisdictions without regard to conventional geographical boundaries. If local community standards govern the definition of material that is obscene for minors, then providers will be liable to criminal prosecution if the material they make available violates the standards of any jurisdiction in the nation. In such a situation, providers will censor themselves, offering only that sexually oriented material that is not obscene for minors in the most conservative community in the nation. The Court of Appeals concluded that such a situation would violate the First Amendment rights of both providers and citizens of all other communities in the nation. The same issue would arise with respect to traditional obscenity prosecutions for material presented on the Internet. It should also be noted that even if the Supreme Court upholds the Third Circuit Court of Appeals in this instance, this does not necessarily mean that all legislation intended to serve the same goals as COPA will be unconstitutional.
For example, one possible solution to this problem would be to require the government, in a prosecution for obscenity on the Internet, to prove that the material is obscene (or obscene for minors) under both national and local community standards. The use of the national standard would avoid the problem that concerned the Third Circuit Court of Appeals, and the use of the local standard would ensure that the material was in fact unprotected in the particular jurisdiction in which the prosecution takes place. In any event, this issue is currently pending.

More vigorous enforcement of federal and state obscenity laws--regardless of the outcome--would help to clarify whether the current state of affairs with respect to obscenity prosecutions is due to a liberalization in "community standards," a lack of willingness to undertake prosecutions, or some other factor or factors. Such enforcement could thus help to establish more up-to-date benchmarks for the field, a development that would provide much-needed guidance for law enforcement professionals dealing with such issues. In addition, vigorous enforcement of laws against false and deceptive advertising and business practices (an example of which is described in Section 3.4.2) could help to reduce the exposure to inappropriate sexually explicit material that results from mousetrapping, takeovers of browser windows, spam that advertises adult-oriented, sexually explicit sites, and so on.

Finally, note that despite the Supreme Court ruling overturning the provisions of the Child Pornography Prevention Act relating to computer-generated imagery (discussed in Chapter 4), there is no bar to the prosecution of material that is obscene, whether or not it involves computer-generated images. Thus, if material depicts a child engaged in sexual activity, the full weight of the obscenity laws continues to apply should that material be found--through the Miller test--to be obscene.
Because prosecutors have not been inclined in recent years (i.e., throughout most of the 1990s) to commit substantial resources to obscenity prosecutions, an alternative is to allow private individuals to bring civil actions for damages against individuals or businesses that purvey obscenity on the Internet. Under such a regime, any person who finds obscenity on the Internet could sue the Web site operator for civil damages. This use of the concept of "private attorneys general" is not unknown in the law, but it is exceptionally rare. Ordinarily, one cannot bring a civil action for damages without showing some legally cognizable harm to the would-be plaintiff that has not been suffered by other persons generally. For example, if X drives his car in excess of the speed limit, he cannot be sued for damages by people who were not individually harmed by his speeding. Similarly, people cannot sue a murderer, a thief, or a drug dealer for damages without a showing of particularized, specific harm to them as individuals. Although the idea of essentially creating a "bounty" by authorizing such suits has some appeal, it is generally not consistent with the standards of the U.S. legal system or the basic goals of the U.S. system for civil actions. And although civil actions for damages are familiar in the realm of expression (for example, civil actions for defamation), they have always required a showing of individualized harm. A further difficulty is the potential for abuse--using the court system merely to harass providers of material that one group or another finds objectionable.
In recognition of the special problems posed by the exposure of children to sexually explicit material, the Supreme Court held in Ginsberg v. New York that the government can constitutionally prohibit "the sale to minors . . . of material defined to be obscene on the basis of its appeal to them whether or not it would be obscene to adults." In other words, the government can prohibit children from having access to certain types of sexually explicit material that it cannot constitutionally ban for adults. As noted in Chapter 4, this doctrine works best in situations in which it is possible to separate children from adults, for as the Supreme Court has also observed, the government "may not reduce the adult population . . . to reading only what is fit for children." Thus, in decisions like Reno v. ACLU, the Court has made clear that the government may not prohibit material that is "obscene as to minors" on the Internet unless it can do so in a way that does not unduly interfere with the rights of adults to have access to such material.

Further, there is a very wide developmental range from birth to the age of legal majority. The very concept of speech that is obscene for minors has never been well defined, and presumably its content varies with the age of the minor, which creates a problem for any unitary definition of the term. Furthermore, what is obscene, either for adults or for children, turns in part on community standards; but as noted in Chapter 4, it is difficult to define or identify community standards when one deals with the Internet. This may or may not present a constitutional problem, depending on the restrictions that are imposed on such material (see below). Although an outright prohibition of material that is obscene for minors would therefore be unconstitutional, more finely tuned proposals, such as those described below, may pass constitutional muster.
In the Child Online Protection Act of 1998 (COPA), Congress sought to remedy the deficiencies in the Communications Decency Act of 1996 (CDA) that led the Supreme Court unanimously to invalidate the CDA in Reno v. ACLU (see Chapter 4). COPA makes it unlawful to communicate by means of the World Wide Web, on a commercial Web site, material that is obscene for minors, if the material is available to minors. COPA provides that it is an affirmative defense if the defendant, in good faith, takes reasonable measures to restrict access to the material by minors by requiring a credit card, an adult access code, an adult personal identification number, or other appropriate proof of age. As of April 2002, the constitutionality of COPA is pending before the Supreme Court of the United States in Ashcroft v. ACLU. As noted above, one issue in this case is whether the use of local community standards in the context of the Internet is consistent with the First Amendment. Another issue is whether the requirement of age verification passes constitutional muster. In a bookstore, for example, one can require proof of age to purchase a book that is obscene for minors, thus providing access to adults while denying access to children. COPA attempts to establish a similar basis for differentiation between adults and minors on the Internet. Because the implementation of age verification on the Internet requires the use of technology, the objection to COPA is that it imposes significant costs on the Web site operator and/or adult viewers and that, by potentially creating a permanent record, it violates legitimate privacy interests and chills the freedom of adult viewers.

If the Supreme Court upholds the constitutionality of COPA in Ashcroft v. ACLU, this will appreciably advance the interests of those who seek to prevent minors from gaining access to material that is deemed obscene for minors. It will not necessarily meet all of their concerns, however. First, COPA applies only to material that is obscene for minors; the precise definition of this concept remains largely undeveloped, and it is not clear how far it will reach. Second, COPA applies only to material on the World Wide Web; it does not apply to chat rooms or e-mail. Third, COPA applies only to commercial Web sites, not to noncommercial sites. These three limitations on COPA were necessary to meet the concerns the Supreme Court expressed in invalidating the CDA in Reno v. ACLU. Fourth, COPA will be effective only to the extent that the government actually prosecutes violations with sufficient vigor to have a significant deterrent effect; the lack of Internet obscenity prosecutions in recent years raises questions about whether such prosecutions will occur. Fifth, COPA applies only to Web sites in the United States. For jurisdictional reasons, federal legislation cannot readily govern Web sites outside the United States, even though they are accessible within the United States. Because a substantial percentage of sexually explicit Web sites exist outside the United States, even strict enforcement of COPA will likely have only a marginal effect on the availability of such material on the Internet in the United States. Thus, even if the Supreme Court upholds COPA, COPA is not a panacea, a fact that illustrates the real limitations of policy and legal approaches to this issue. The committee also notes that, even if COPA is constitutional, this does not necessarily mean it is good public policy.
The concerns raised against COPA could at least arguably lead to the conclusion that it is insufficiently effective to justify its costs, whether or not it is consistent with the First Amendment. If the Supreme Court invalidates COPA because age verification procedures in the context of the Internet are too burdensome on the First Amendment rights of adults, this will make it very difficult to regulate material that is obscene for minors in this environment. In the next few sections, the committee presents several legal and regulatory approaches that might be available even if the Supreme Court invalidates COPA.
Many commercially oriented adult Web sites subject the viewer to an assortment of "teaser" images intended to entice the viewer to pay to see more such images. In many cases, the teaser images include material that may be obscene. To prevent minors from viewing such material, it might be possible to grant such Web sites a statutory "safe harbor" immunity from prosecution under obscenity laws if the provider places the Web site behind a "plain brown wrapper."2 Such a "notice" page would contain an appropriate warning indicating that the viewer should proceed past the notice page only if he or she is older than 18, and that proceeding past the notice page constitutes certification that the user is indeed older than 18.3 The notice page would contain no images, or perhaps images obscured in the same way that the covers of adult-oriented magazines are obscured on newsstands. The purpose of the notice page is to ensure that anyone who reaches the sexually explicit Web pages of a site has actually read the notice and agreed to its terms.

However, many sites today have notice pages, and it is still often possible to reach sexually explicit pages on those sites through search engines that index the pages behind the notice page. By clicking on a link returned by a search engine, the user circumvents the notice page entirely. To prevent such circumvention, it is necessary to keep search engines from indexing the pages behind the notice page, and a standard protocol for accomplishing this task--the "robots.txt" protocol--is described in Chapter 2 (Box 2.3). When a site uses this protocol, a Web crawler that honors it will not index the pages behind the notice page, so search engines will not return links to those pages and users cannot reach them directly. The only way for a user to reach the contents of the adult Web site would then be to go through the notice page. This approach would reduce inadvertent access to teaser images on adult-oriented sites, and thus provide a greater level of denial of such access than is currently in place.

Of course, this approach would not prevent access by individuals under 18 who are willing to lie about their age. To deal with such individuals, it may be possible to add an age verification requirement to the plain brown wrapper; that is, proof of age would be required to get past the notice page.4 (A discussion of age verification technologies is contained in Chapter 13.) Such a provision might be constitutional even if COPA is declared invalid, because the use of age verification is encouraged by the offer of immunity from prosecution for obscenity but is not legally required.
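As a minimal sketch of this convention, a site adopting the approach might serve a robots exclusion file along the following lines. The directory names here are hypothetical; crawlers that honor the protocol read this file before fetching pages and skip the disallowed paths, so only the notice page is indexed.

```
# robots.txt -- served at the top level of a (hypothetical) adult site.
# Crawlers honoring the Robots Exclusion Protocol read this file first
# and skip the paths listed below.

User-agent: *          # applies to all crawlers
Disallow: /previews/   # teaser pages behind the notice page
Disallow: /members/    # paid content behind the notice page
# The notice page itself (e.g., /index.html) is not disallowed, so
# search traffic lands on the warning rather than on explicit content.
```

Compliance with robots.txt is voluntary on the crawler's part; major search engines do honor it, and the safe-harbor incentive described above is what would give site operators a reason to deploy it.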
Another possibility would be to require any commercial provider of material that is obscene for minors to label that speech in a machine-readable manner that enables parents and others to technologically block the access of minors to such material (Section 12.1 has further discussion of this approach). Because this approach focuses only on a category of speech that can constitutionally be restricted for minors, and does not prohibit adults (or even minors) from accessing it, it may not be unduly burdensome. If the market responds appropriately, the proposal provides parents with a reasonable degree of control over what their children see. If the labeling requirement is found to present constitutional difficulties, a less speech-restrictive approach would be to grant safe-harbor immunity from prosecution under "obscene for minors" and obscenity laws for those who voluntarily label their materials in an appropriate manner.
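To make the notion of machine-readable labeling concrete, one convention available at this writing is the W3C's PICS (Platform for Internet Content Selection) format, under which a label is embedded in a page's HTML where browsers and filtering software can read it. The sketch below is illustrative only: the rating service URL and the "s" (sexual content) vocabulary are invented placeholders, not references to any actual rating service.

```html
<!-- Hypothetical PICS label in a page's <head>. The service URL and
     the rating "(s 4)" are placeholders invented for illustration. -->
<head>
  <meta http-equiv="PICS-Label"
        content='(PICS-1.1 "http://ratings.example.org/v1.html"
                  l r (s 4))'>
</head>
```

A filter configured by a parent could then block any page whose label exceeds a chosen threshold; under the safe-harbor variant described above, the presence of a well-formed label would itself be the condition for immunity.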
A third approach would be to prohibit any person from sending on the Internet commercial spam that includes material that is obscene for minors. This is less intrusive on First Amendment interests than COPA because it deals only with commercial advertising, and because it involves sending information to individuals who have not requested it or sought it out. However, this approach is potentially more problematic than COPA because it restricts the sending of such material to adults as well as to children; unlike COPA, it does not attempt to differentiate between them. In general, the Supreme Court has held that the government cannot prohibit the sending of constitutionally protected material (including commercial advertising) to individuals who have not expressly asked not to receive it. In Bolger v. Youngs Drug Products Corp.,5 for example, the Court invalidated a federal statute prohibiting the mailing of unsolicited advertisements for contraceptives because the interest in shielding "recipients of mail from materials that they are likely to find offensive" is not sufficiently substantial to justify the restriction of "protected speech." The most plausible distinction between the law invalidated in Bolger and a ban on sending through the Internet commercial spam that includes material that is obscene for minors is that material that is obscene for minors is constitutionally protected for adults but not for children. This may not be a sufficient distinction to make a constitutional difference.

More modest versions of this proposal, more likely to withstand constitutional scrutiny, would prohibit any person from sending commercial spam that includes material that is obscene for minors (a) without appropriate labeling (e.g., a warning in the e-mail subject line, such as "Not appropriate for children under 16 years of age"), or (b) without prior age verification, or (c) after the recipient has objected to receiving such material in the past. It is important to note that for speech to be regulated under the commercial speech doctrine, it must consist of advertising. Thus, to the extent that the constitutionality of the alternatives noted above turns on the commercial speech doctrine, non-commercial spam, or spam that does not consist of advertising, could not be restricted.

It is also worth noting that most spam concerning sexually explicit material does not consist of the sexually explicit material itself, but rather of links to Web sites in which such material is embedded. The recipient of the e-mail must therefore take some affirmative action (e.g., clicking on the link) actually to reach the Web site. From a constitutional perspective, there is a significant difference between "inflicting" sexually explicit material on individuals who do not want to be exposed to it and providing those individuals with information about how to find such material. Even if the former can be regulated, the latter may warrant greater constitutional protection. This observation may be especially important in applying the commercial speech doctrine, for there is likewise a difference between giving individuals information about how to obtain an unlawful thing and actually providing them with the thing. Making the mere provision of a link illegal might pass constitutional muster if (a) the material at the link could be determined to be illegal (as it sometimes could be under obscenity laws) and (b) the party providing the link is essentially an accomplice under the criminal law.
Requirement (b) would not be met merely because someone provided information about how to find obscene material. It would be met, however, if the spammer is also the operator of the Web site containing obscene material (or is hired by the Web site operator), because the spam could then be regarded as an advertisement for an illegal product and the sender punished on that basis.

Note that a variety of legislative proposals have appeared with the intent of reducing the problem of spam. For example, one federal proposal called for prohibiting senders of unsolicited commercial electronic mail from disguising the source of their messages, and for giving consumers the choice to cease receiving a sender's unsolicited commercial electronic mail messages.6 This proposed legislation would have prohibited senders from including materially false or misleading header information and deceptive subject headings in commercial e-mail, required the inclusion of a valid return address so that the recipient could indicate a desire not to receive further messages from the sender, and penalized further transmissions of e-mail once such a desire had been indicated. States have also sought to regulate spam, as illustrated in an accompanying box.
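To illustrate option (a) above on the receiving end, the following sketch (in Python) shows how a mail handler might act on such a subject-line label by moving labeled messages out of a child's inbox. The label text, mailbox paths, and function name are illustrative assumptions, not elements of any actual proposal.

```python
import mailbox

# Hypothetical subject-line label of the kind a statute might require.
WARNING_LABEL = "not appropriate for children under 16"

def quarantine_labeled_mail(inbox_path: str, quarantine_path: str) -> int:
    """Move messages whose subject carries the warning label from a
    child's inbox into a quarantine mailbox; return the number moved."""
    inbox = mailbox.mbox(inbox_path)
    quarantine = mailbox.mbox(quarantine_path)
    moved = 0
    # Snapshot the keys first, since messages are removed as we go.
    for key in list(inbox.keys()):
        message = inbox[key]
        subject = message.get("Subject", "")
        if WARNING_LABEL in subject.lower():
            quarantine.add(message)
            inbox.remove(key)
            moved += 1
    inbox.flush()
    quarantine.flush()
    return moved

if __name__ == "__main__":
    n = quarantine_labeled_mail("child_inbox.mbox", "quarantine.mbox")
    print(f"{n} labeled message(s) quarantined")
```

The point of the sketch is that a mandated label makes recipient-side screening trivial to automate, which is precisely what gives option (a) its appeal relative to an outright ban.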
Still another possibility for regulating spam is a mechanism similar to that used for regulating the telephone calls of telemarketers.7 It might be feasible for a central clearinghouse to register specific e-mail addresses, or perhaps even entire domain names, in a "do not spam" database, and to answer queries against that database automatically. Any party sending spam would be required to check the "do not spam" database and eliminate from its mass mailing all addresses contained in the database. Failure to do so (provable by the receipt of spam by someone listed in the database) would subject the sender to civil liability and/or class-action suits. (Note that a mechanism of this sort aimed specifically at senders of sexually explicit spam would be much more suspect under the First Amendment, because it would not be content-neutral.)
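As a minimal sketch of the required check, suppose (hypothetically) that the clearinghouse can export the registry as a text file with one registered address or domain per line. A sender would then be obliged to scrub its mailing list along these lines before any mass mailing; the file name and function names are assumptions for illustration.

```python
# Sketch of the "do not spam" check described above (Python). The
# registry file format -- one address or domain per line -- is assumed.

def load_registry(path: str) -> set[str]:
    """Load registered e-mail addresses and domains, normalized to
    lowercase."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def scrub(mailing_list: list[str], registry: set[str]) -> list[str]:
    """Drop every address that is registered individually or whose
    entire domain is registered."""
    kept = []
    for raw in mailing_list:
        addr = raw.strip().lower()
        domain = addr.split("@", 1)[-1]  # a registered domain covers all its addresses
        if addr not in registry and domain not in registry:
            kept.append(addr)
    return kept

if __name__ == "__main__":
    registry = load_registry("do_not_spam.txt")
    recipients = ["parent@example.com", "teen@school.example.org"]
    print(scrub(recipients, registry))
```

Under the liability rule sketched in the text, receipt of spam at a registered address would itself be evidence that the sender skipped this scrub.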
Another approach would be to prohibit the practice of mousetrapping users, without prior age verification, at sites that contain material that is obscene for minors. Even if the Court finds COPA unconstitutional, a prohibition on directing children to material that is obscene for minors, without their consent or any affirmative act on their part, might well be upheld. Mousetrapping not only exposes individuals to material they would prefer to avoid, but also overrides their freedom of choice, effectively compelling them to view such material. Whether or not all mousetrapping can or should be restricted, a reasonable case can be made for prohibiting operators of Web sites from sending children, without warning or choice, to sites that will expose them to material that is obscene for minors. A more modest variation would be to require at least a warning before mousetrapping a viewer into a site containing material that is obscene for minors.
One element of the federal obscenity laws (18 U.S.C. 2257, discussed in Section 4.2.2) is a record-keeping requirement intended to ensure that performers and models depicted in sexually explicit scenes are older than 18. More active enforcement of this provision may better protect minors from participation in the creation of child pornography.8 Assuming that strict enforcement of this provision can withstand constitutional scrutiny, such enforcement might also have the effect of accelerating consolidation in the online adult Web industry. Compliance with the regulation would increase the expenses of such providers and would be likely to drive the small-scale "quick buck" enterprises out of business, while the established adult content providers would simply absorb those expenses as a cost of doing business. (At the same time, as a matter of constitutional doctrine, the intent behind enforcement is highly relevant: if these laws were enforced with the intent of driving otherwise legal business operations out of business, such enforcement might well raise constitutional questions.) As described in Chapter 3, representatives from the online adult industry testified to the committee that attracting children was not in their business interest. Taking this testimony at face value, and assuming that the "quick buck" providers are the ones that do not discriminate in attracting traffic to their sites (i.e., that do not distinguish between adults and minors), enforcement of 18 U.S.C. 2257 might result in a withering of the irresponsible enterprises, leaving behind businesses that take more seriously their responsibility to deny minors access to their wares. Note also that recognizing "sexually explicit conduct" is far simpler than recognizing obscenity, a fact that would considerably simplify the task of prosecution.9
Prosecutors seeking to enforce child pornography laws rely to a significant extent on reporting of child pornography by laypersons. That is, law enforcement officials may come across such material in the course of their everyday work, but citizen complaints about such material are a major source of leads and tips. In an Internet environment, the most natural way to file such complaints is electronic. A concerned citizen lodging a complaint with law enforcement officials should provide the route by which the material came to the citizen's attention and a description of the image. Because images are hard to describe in words, it would be desirable to include a copy of the image in the complaint. Indeed, in the case of child pornography, such a forwarded copy might be the only tangible evidence that law enforcement officials could obtain, as child pornography sites are generally highly transient (and by the time law enforcement officials are able to act on such complaints, the site may be gone). However, a citizen who files an electronic complaint with a copy of the suspect image may be in violation of statutes that prohibit the electronic transmission or distribution of child pornography, even though the image is being transmitted to law enforcement authorities or the National Center for Missing and Exploited Children (NCMEC) and even though such evidence might be crucial for the investigation and prosecution of the offender. Instead, complainants must often go through a cumbersome and inconvenient procedure to file such a report.10

A similar problem affects the NCMEC. Despite the NCMEC's role in providing technical assistance to law enforcement in the investigation of child pornography, it does not enjoy the same immunity as law enforcement authorities to receive, possess, and store complaints of child pornography, and it is not authorized to transfer evidence of child pornography to other designated law enforcement agencies outside the CyberTipline (CTL) system, and sometimes not even within the CTL system.11 Relief for the first problem--allowing citizens to report suspected child pornography to the NCMEC without fear of prosecution--could enable and encourage more citizen action in this area, while relief for the second problem would enable the NCMEC to forward such evidence proactively to law enforcement outside the CTL system.
Successful self-regulatory approaches are based on the fact that the firms in an industry are generally willing to abide by a common code of behavior, though as noted in Chapter 4, such willingness may reflect a desire to stave off legislation or other regulation that these firms would find more onerous. One example of self-regulation with respect to media content is the willingness of private producers of TV content to provide ratings of their content that can be processed by the V-chip.

Today, a large number of reported instances of child pornography remain on Internet service provider (ISP) servers because law enforcement lacks the resources to investigate every report. One approach used with some success in Europe is that of the INHOPE hotlines. Under the INHOPE approach, European ISPs support a non-governmental organization, funded by the ISPs and staffed by trained specialists in identifying child pornography, whose role is to advise ISPs of possible postings of child pornography.12 (Through Internet "hotlines," this organization takes tips from the public, but screens them for credibility.) Such advisories have no binding effect on ISPs, but in practice many ISPs cooperate by taking down the offending material, because these advisories are more authoritative than reports from members of the public.13 In a U.S. context, such a function could be provided by the NCMEC, which currently lacks the authority to provide such advisories.

A second facet of possible self-regulatory efforts might be prominent placement of the CTL reporting icon on adult-oriented Web sites. Today, many such Web sites provide links to information on filtering products, and some even have a banner that says "fight child pornography." It would be simple for these sites to add the CTL icon, which has proven quite useful in reporting online child pornography.

A third dimension of self-regulation is the willingness of ISPs to enforce the terms of service to which users must agree. To the extent that these terms of service prohibit the posting or sending of inappropriate material, harassment, or other inappropriate behavior (and most terms of service do contain some restrictions along these lines), ISPs have the authority to take quick action against offending users without waiting for legal action.

A fourth example of self-regulation could be set by the commercial sources of adult-oriented, sexually explicit imagery that provide much of the content for smaller "affiliates." In particular, they could build into their contracts with affiliates conditions requiring those affiliates to behave responsibly. As one possibility, affiliates could be contractually required to put their content behind the Internet equivalent of "plain brown wrappers" with age verification. The firms that supply them with content would be in a position to check on affiliates and to penalize those that did not comply (by cutting off their content source).
In its consideration of various public policy options to help shield children and youth from inappropriate sexually explicit material on the Internet, the committee realizes that the viability of many proposals depends on how makers of public policy make certain trade-offs. Proposals that depend on regulating a certain type of content (namely, sexually explicit speech) are inherently more suspect on First Amendment grounds than proposals that regulate speech independent of content. For example, the committee believes that spam containing material that is obscene for minors should not be sent to children. But laws banning such e-mail to minors are potentially problematic in an online environment in which it is very difficult to differentiate between adults and minors. At the same time, a ban on all spam regardless of content may be seen as too broad because it affects many other interests.

The committee also believes that it would be desirable for adult Web site operators who exhibit material that is obscene for minors to use age verification systems so that children would not be able to access such material. However, in an online environment in which it is very difficult to differentiate between adults and minors, it is not clear whether this can be achieved in a way that does not unduly constrain the viewing rights of adults. Thus, as one illustrative example, the government might offer a grant of immunity from prosecution under obscenity laws to Web site operators who use age verification systems to prevent minors from accessing such material. In this instance, the trade-off is helping to protect children from exposure to certain kinds of inappropriate sexually explicit material in return for limitations on possible obscenity prosecutions.

Enforcement of obscenity laws also presents trade-offs. Increased prosecution of obscenity would likely require increased resources, and those resources must be taken from some other activity. If, as is likely, the other activity is the prosecution of other crimes, policy makers must judge that pursuing more obscenity prosecutions would be wiser than pursuing those other criminal prosecutions, or that more obscenity prosecutions would be the best use of additional resources, should such resources become available. Such judgments are complex and require a careful weighing of many competing factors well beyond the scope of this report.
Notes

1. Strictly speaking, the results of a search do depend on the number of Web pages being searched: the more pages in the index, the less often the search engine re-indexes the Net for updates and changes. Thus, Web pages that have recently been added to the Internet are less likely to be found when the number of pages in the index is large. However, this is a relatively small effect.

2. The reason for granting immunity from prosecution under obscenity laws, rather than under obscene-for-minors laws, in exchange for the use of age verification technologies and "plain brown wrappers" is the following: if it were constitutional to prosecute a Web site operator under obscene-for-minors laws, the government would simply do so. If such prosecutions are found to be unconstitutional, however, then the Web site is immune from them regardless of what it does with respect to age verification technologies and "plain brown wrappers," and an incentive of a different sort is needed to persuade operators to adopt such measures. In this case, the incentive is a different benefit--namely, immunity from obscenity prosecution.

3. The age of 18 denotes legal emancipation for a minor, but there is no particular reason that the threshold could not be some other age.

4. The use of age verification technologies poses a significant privacy issue. Indeed, in the cases of the CDA and COPA, the reviewing courts found that age verification requirements were unreasonable given the current state of technology and that they impose significant burdens on Web sites, because verification requires site visitors to provide personal information. Because users are reluctant to provide this information and are discouraged from accessing sites that require such disclosures, the imposition of age verification requirements may chill or inhibit adults from accessing non-obscene Web sites, both because they might not wish to give personal information and because they may not be able to prove their age. These measures, the courts found, would diminish access to protected speech and impose significant expense on commercial sites.

5. 463 U.S. 60 (1983).

6. HR 718, the Unsolicited Commercial Electronic Mail Act of 2001, passed the House on April 4, 2001.

7. At present, a customer request to refrain from calling must be honored only by the specific company to which the customer has given notice. As this report goes to press, the Federal Trade Commission is proposing to create a centralized national "Do Not Call" registry that would enable consumers to eliminate most telemarketing calls simply by registering with a central list maintained by the FTC. See <http://www.ftc.gov/opa/2002/01/donotcall.htm>.

8. This point is likely to be most relevant in the context of Web sites that depict "barely legal" models engaged in sexually explicit behavior. Thus, the child pornography at issue is most likely to be images of an older minor engaged in sexual activity.

9. 18 U.S.C. 2257 defines "sexually explicit conduct" as actual sexual intercourse, including genital-genital, oral-genital, anal-genital, or oral-anal, whether between persons of the same or opposite sex; bestiality; masturbation; sadistic or masochistic abuse; or lascivious exhibition of the genitals or pubic area of any person.

10. For example, to report a suspected violation through the NCMEC, a citizen must provide the relevant URL where the image can be found and a textual description of the image. In an online environment, it would be much simpler and easier for the citizen simply to forward the image.

11. The CyberTipline, operated by the NCMEC, is a national reporting mechanism for use by citizens to report to law enforcement authorities apparent instances of sexual exploitation of children; possession, manufacture, and distribution of child pornography; online enticement of children for sexual acts; child prostitution; child-sex tourism; and child sexual molestation. See <http://www.cybertipline.com> for more information.

12. Note also that the fact of private support by the ISPs is the key component of the "self-regulatory" dimension of this approach as viewed by the Council of Europe.

13. Note that the private terms of service to which users must conform as a condition of an ISP's service agreement grant ISPs considerably more latitude in the exercise of such "take-down" authority than would be possible if they were agents of government and hence constrained by legal and constitutional barriers.