Internet Law 2006

Brooklyn Law School

Professor Wendy Seltzer, email wendy.seltzer@brooklaw.edu
Visiting Assistant Professor of Law, Brooklyn Law School
Fellow, Berkman Center for Internet & Society at Harvard Law School

Online Speech and How to Stop It

August 31, 2006


Reading Notes

Reno v. American Civil Liberties Union, 521 U.S. 844 (1997) (required excerpt follows notes) was the Supreme Court's first encounter with the Internet, when it heard a challenge to the Communications Decency Act, Congress's first attempt to keep material "harmful to minors" off the Net. The Court called the Net "a unique and wholly new medium of worldwide human communication" with content "as diverse as human thought."

Questions for thought: How does the Court see the Internet? What is the Internet like or unlike? How did the CDA attempt to control speech on the Internet? Why do Justice O'Connor and the Chief Justice dissent, in part? How would they permit Congress to control Internet speech?

The cases below deal with other attempts to regulate "harmful" speech. What are some of the options and obstacles? How would the different regulations affect Internet architecture? John Gilmore is often quoted as saying "The Internet interprets censorship as damage and routes around it." What kind of "routing around" might these regulations induce?

New York adopted a "state CDA" prohibiting the online communication of speech "harmful to minors." In American Library Association v. Pataki, 969 F. Supp. 160 (S.D.N.Y. 1997), the district court struck down the New York law on Commerce Clause grounds: "[T]he Internet is one of those areas of commerce that must be marked off as a national preserve to protect users from inconsistent legislation that, taken to its most extreme, could paralyze development of the Internet altogether. Thus, the Commerce Clause ordains that only Congress can legislate in this area, subject, of course, to whatever limitations other provisions of the Constitution (such as the First Amendment) may require."

Stymied in its attempt at regulating online speakers directly, Congress came back in 2000 with the Children's Internet Protection Act (CIPA). CIPA, excerpted below, required schools and libraries to filter the Internet access they made available to both minors and adults, using the lever of federal funding to enforce the mandate.

''SEC. 3601. LIMITATION ON AVAILABILITY OF CERTAIN FUNDS FOR SCHOOLS.

``(a) INTERNET SAFETY.--

``(1) IN GENERAL.--No funds made available under this title to a local educational agency for an elementary or secondary school that does not receive services at discount rates under section 254(h)(5) of the Communications Act of 1934, as added by section 1721 of Children's Internet Protection Act, may be used to purchase computers used to access the Internet, or to pay for direct costs associated with accessing the Internet, for such school unless the school, school board, local educational agency, or other authority with responsibility for administration of such school both--

``(A)(i) has in place a policy of Internet safety for minors that includes the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access through such computers to visual depictions that are--

``(I) obscene;
``(II) child pornography; or
``(III) harmful to minors; and

``(ii) is enforcing the operation of such technology protection measure during any use of such computers by minors; and

``(B)(i) has in place a policy of Internet safety that includes the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access through such computers to visual depictions that are--

``(I) obscene; or
``(II) child pornography; and

``(ii) is enforcing the operation of such technology protection measure during any use of such computers.

...

``(3) DISABLING DURING CERTAIN USE.--An administrator, supervisor, or person authorized by the responsible authority under paragraph (1) may disable the technology protection measure concerned to enable access for bona fide research or other lawful purposes.

The ACLU represented the American Library Association, among others, in a challenge to CIPA. As you read United States v. American Library Association, 539 U.S. 194 (2003) (required excerpt follows notes), consider where this regulation differs from the CDA and where it is similar. What is the meaning and relevance of the Court's "public forum" analysis of the law?

The ALA plurality and dissent disagree about the significance of filters' "overblocking." Read about the popular weblog BoingBoing's experience with web filters, BoingBoing banned in UAE, Qatar, elsewhere. BoingBoing responded by posting a guide to defeating censorware.

Pennsylvania tried regulating child pornography at yet a different level. It authorized the state attorney general's office to force Internet service providers (ISPs) to block Pennsylvania residents' access to sites the AG's office identified as child pornography. The ISPs had to block access through their networks even when there was no claim that the ISPs were responsible for the sites at issue. Please read the Center for Democracy and Technology's fact sheet on CDT v. Pappert, 337 F. Supp. 2d 606 (E.D. Pa. 2004), <http://www.cdt.org/speech/pennwebblock/20040915highlights.pdf>. The full decision is available at <http://www.cdt.org/speech/pennwebblock/20040910memorandum.pdf>, but not required.

These debates continue with vigor. Earlier this month, the House of Representatives passed a bill it called the Deleting Online Predators Act. If enacted, DOPA would require schools and libraries to block minors' access to social networking and chat sites. See Social Network Sites, Blogs, Wikis Fret Over Proposed Regulation. Should the Senate pass this bill?

How should regulators react to harmful speech online?

Reno v. American Civil Liberties Union, 521 U.S. 844 (1997)

[The Communications Decency Act of 1996 (CDA) provided for expedited review of the law’s constitutionality, first by a three-judge district court panel, then directly to the Supreme Court. The district court panel struck down the law as violating the First and Fifth Amendments.]

STEVENS, J., delivered the opinion of the Court, in which SCALIA, KENNEDY, SOUTER, THOMAS, GINSBURG, and BREYER, JJ., joined. O'CONNOR, J., filed an opinion concurring in the judgment in part and dissenting in part, in which REHNQUIST, C. J., joined.
 
At issue is the constitutionality of two statutory provisions enacted to protect minors from "indecent" and "patently offensive" communications on the Internet. Notwithstanding the legitimacy and importance of the congressional goal of protecting children from harmful materials, we agree with the three-judge District Court that the statute abridges "the freedom of speech" protected by the First Amendment.

The District Court made extensive findings of fact, most of which were based on a detailed stipulation prepared by the parties. The findings describe the character and the dimensions of the Internet, the availability of sexually explicit material in that medium, and the problems confronting age verification for recipients of Internet communications. Because those findings provide the underpinnings for the legal issues, we begin with a summary of the undisputed facts.

The Internet

The Internet is an international network of interconnected computers. It is the outgrowth of what began in 1969 as a military program called "ARPANET," which was designed to enable computers operated by the military, defense contractors, and universities conducting defense-related research to communicate with one another by redundant channels even if some portions of the network were damaged in a war. While the ARPANET no longer exists, it provided an example for the development of a number of civilian networks that, eventually linking with each other, now enable tens of millions of people to communicate with one another and to access vast amounts of information from around the world. The Internet is "a unique and wholly new medium of worldwide human communication."

The Internet has experienced "extraordinary growth." The number of "host" computers--those that store information and relay communications--increased from about 300 in 1981 to approximately 9,400,000 by the time of the trial in 1996. Roughly 60% of these hosts are located in the United States. About 40 million people used the Internet at the time of trial, a number that is expected to mushroom to 200 million by 1999.

Individuals can obtain access to the Internet from many different sources, generally hosts themselves or entities with a host affiliation. Most colleges and universities provide access for their students and faculty; many corporations provide their employees with access through an office network; many communities and local libraries provide free access; and an increasing number of storefront "computer coffee shops" provide access for a small hourly fee. Several major national "online services" such as America Online, CompuServe, the Microsoft Network, and Prodigy offer access to their own extensive proprietary networks as well as a link to the much larger resources of the Internet. These commercial online services had almost 12 million individual subscribers at the time of trial.

Anyone with access to the Internet may take advantage of a wide variety of communication and information retrieval methods. These methods are constantly evolving and difficult to categorize precisely. But, as presently constituted, those most relevant to this case are electronic mail ("e-mail"), automatic mailing list services ("mail exploders," sometimes referred to as "listservs"), "newsgroups," "chat rooms," and the "World Wide Web." All of these methods can be used to transmit text; most can transmit sound, pictures, and moving video images. Taken together, these tools constitute a unique medium--known to its users as "cyberspace"--located in no particular geographical location but available to anyone, anywhere in the world, with access to the Internet.

E-mail enables an individual to send an electronic message--generally akin to a note or letter--to another individual or to a group of addressees. The message is generally stored electronically, sometimes waiting for the recipient to check her "mailbox" and sometimes making its receipt known through some type of prompt. A mail exploder is a sort of e-mail group. Subscribers can send messages to a common e-mail address, which then forwards the message to the group's other subscribers. Newsgroups also serve groups of regular participants, but these postings may be read by others as well. There are thousands of such groups, each serving to foster an exchange of information or opinion on a particular topic running the gamut from, say, the music of Wagner to Balkan politics to AIDS prevention to the Chicago Bulls. About 100,000 new messages are posted every day. In most newsgroups, postings are automatically purged at regular intervals. In addition to posting a message that can be read later, two or more individuals wishing to communicate more immediately can enter a chat room to engage in real-time dialogue--in other words, by typing messages to one another that appear almost immediately on the others' computer screens. The District Court found that at any given time "tens of thousands of users are engaging in conversations on a huge range of subjects." It is "no exaggeration to conclude that the content on the Internet is as diverse as human thought."

The best known category of communication over the Internet is the World Wide Web, which allows users to search for and retrieve information stored in remote computers, as well as, in some cases, to communicate back to designated sites. In concrete terms, the Web consists of a vast number of documents stored in different computers all over the world. Some of these documents are simply files containing information. However, more elaborate documents, commonly known as Web "pages," are also prevalent. Each has its own address--"rather like a telephone number." Web pages frequently contain information and sometimes allow the viewer to communicate with the page's (or "site's") author. They generally also contain "links" to other documents created by that site's author or to other (generally) related sites. Typically, the links are either blue or underlined text--sometimes images.

Navigating the Web is relatively straightforward. A user may either type the address of a known page or enter one or more keywords into a commercial "search engine" in an effort to locate sites on a subject of interest. A particular Web page may contain the information sought by the "surfer," or, through its links, it may be an avenue to other documents located anywhere on the Internet. Users generally explore a given Web page, or move to another, by clicking a computer "mouse" on one of the page's icons or links. Access to most Web pages is freely available, but some allow access only to those who have purchased the right from a commercial provider. The Web is thus comparable, from the readers' viewpoint, to both a vast library including millions of readily available and indexed publications and a sprawling mall offering goods and services.

From the publishers' point of view, it constitutes a vast platform from which to address and hear from a world-wide audience of millions of readers, viewers, researchers, and buyers. Any person or organization with a computer connected to the Internet can "publish" information. Publishers include government agencies, educational institutions, commercial entities, advocacy groups, and individuals. Publishers may either make their material available to the entire pool of Internet users, or confine access to a selected group, such as those willing to pay for the privilege. "No single organization controls any membership in the Web, nor is there any centralized point from which individual Web sites or services can be blocked from the Web."

Sexually Explicit Material

Sexually explicit material on the Internet includes text, pictures, and chat and "extends from the modestly titillating to the hardest-core." These files are created, named, and posted in the same manner as material that is not sexually explicit, and may be accessed either deliberately or unintentionally during the course of an imprecise search. "Once a provider posts its content on the Internet, it cannot prevent that content from entering any community." Thus, for example,

"when the UCR/California Museum of Photography posts to its Web site nudes by Edward Weston and Robert Mapplethorpe to announce that its new exhibit will travel to Baltimore and New York City, those images are available not only in Los Angeles, Baltimore, and New York City, but also in Cincinnati, Mobile, or Beijing--wherever Internet users live. Similarly, the safer sex instructions that Critical Path posts to its Web site, written in street language so that the teenage receiver can understand them, are available not just in Philadelphia, but also in Provo and Prague."

Some of the communications over the Internet that originate in foreign countries are also sexually explicit.

Though such material is widely available, users seldom encounter such content accidentally. "A document's title or a description of the document will usually appear before the document itself . . . and in many cases the user will receive detailed information about a site's content before he or she need take the step to access the document. Almost all sexually explicit images are preceded by warnings as to the content." For that reason, the "odds are slim" that a user would enter a sexually explicit site by accident. Unlike communications received by radio or television, "the receipt of information on the Internet requires a series of affirmative steps more deliberate and directed than merely turning a dial. A child requires some sophistication and some ability to read to retrieve material and thereby to use the Internet unattended."

Systems have been developed to help parents control the material that may be available on a home computer with Internet access. A system may either limit a computer's access to an approved list of sources that have been identified as containing no adult material, it may block designated inappropriate sites, or it may attempt to block messages containing identifiable objectionable features. "Although parental control software currently can screen for certain suggestive words or for known sexually explicit sites, it cannot now screen for sexually explicit images." Nevertheless, the evidence indicates that "a reasonably effective method by which parents can prevent their children from accessing sexually explicit and other material which parents may believe is inappropriate for their children will soon be available."

Age Verification

The problem of age verification differs for different uses of the Internet. The District Court categorically determined that there "is no effective way to determine the identity or the age of a user who is accessing material through e-mail, mail exploders, newsgroups or chat rooms." The Government offered no evidence that there was a reliable way to screen recipients and participants in such fora for age. Moreover, even if it were technologically feasible to block minors' access to newsgroups and chat rooms containing discussions of art, politics or other subjects that potentially elicit "indecent" or "patently offensive" contributions, it would not be possible to block their access to that material and "still allow them access to the remaining content, even if the overwhelming majority of that content was not indecent."

Technology exists by which an operator of a Web site may condition access on the verification of requested information such as a credit card number or an adult password. Credit card verification is only feasible, however, either in connection with a commercial transaction in which the card is used, or by payment to a verification agency. Using credit card possession as a surrogate for proof of age would impose costs on non-commercial Web sites that would require many of them to shut down. For that reason, at the time of the trial, credit card verification was "effectively unavailable to a substantial number of Internet content providers." Moreover, the imposition of such a requirement "would completely bar adults who do not have a credit card and lack the resources to obtain one from accessing any blocked material."

Commercial pornographic sites that charge their users for access have assigned them passwords as a method of age verification. The record does not contain any evidence concerning the reliability of these technologies. Even if passwords are effective for commercial purveyors of indecent material, the District Court found that an adult password requirement would impose significant burdens on noncommercial sites, both because they would discourage users from accessing their sites and because the cost of creating and maintaining such screening systems would be "beyond their reach."

In sum, the District Court found:

"Even if credit card verification or adult password verification were implemented, the Government presented no testimony as to how such systems could ensure that the user of the password or credit card is in fact over 18. The burdens imposed by credit card verification and adult password verification systems make them effectively unavailable to a substantial number of Internet content providers."

The Telecommunications Act of 1996, Pub. L. 104-104, 110 Stat. 56, was an unusually important legislative enactment. As stated on the first of its 103 pages, its primary purpose was to reduce regulation and encourage "the rapid deployment of new telecommunications technologies." The major components of the statute have nothing to do with the Internet; they were designed to promote competition in the local telephone service market, the multichannel video market, and the market for over-the-air broadcasting. The Act includes seven Titles, six of which are the product of extensive committee hearings and the subject of discussion in Reports prepared by Committees of the Senate and the House of Representatives. By contrast, Title V--known as the "Communications Decency Act of 1996" (CDA)--contains provisions that were either added in executive committee after the hearings were concluded or as amendments offered during floor debate on the legislation. An amendment offered in the Senate was the source of the two statutory provisions challenged in this case. They are informally described  as the "indecent transmission" provision and the "patently offensive display" provision.

The first, 47 U.S.C.A. § 223(a) (Supp. 1997), prohibits the knowing transmission of obscene or indecent messages to any recipient under 18 years of age. It provides in pertinent part:
 

"(a) Whoever--
"(1) in interstate or foreign communications--
. . . . .
"(B) by means of a telecommunications device knowingly--
"(i) makes, creates, or solicits, and
"(ii) initiates the transmission of,
"any comment, request, suggestion, proposal, image, or other communication which is obscene or indecent, knowing that the recipient of the communication is under 18 years of age, regardless of whether the maker of such communication placed the call or initiated the communication;
. . . . .
"(2) knowingly permits any telecommunications facility under his control to be used for any activity prohibited by paragraph (1) with the intent that it be used for such activity,
"shall be fined under Title 18, or imprisoned not more than two years, or both."

The second provision, § 223(d), prohibits the knowing sending or displaying of patently offensive messages in a manner that is available to a person under 18 years of age. It provides:
"(d) Whoever--
"(1) in interstate or foreign communications knowingly--
"(A) uses an interactive computer service to send to a specific person or persons under 18 years of age, or
"(B) uses any interactive computer service to display in a manner available to a person under 18 years of age,
"any comment, request, suggestion, proposal, image, or other communication that, in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs, regardless of whether the user of such service placed the call or initiated the communication; or
"(2) knowingly permits any telecommunications facility under such person's control to be used for an activity prohibited by paragraph (1) with the intent that it be used for such activity,
"shall be fined under Title 18, or imprisoned not more than two years, or both."

The breadth of these prohibitions is qualified by two affirmative defenses. See § 223(e)(5). One covers those who take "good faith, reasonable, effective, and appropriate actions" to restrict access by minors to the prohibited communications. § 223(e)(5)(A). The other covers those who restrict access to covered material by requiring certain designated forms of age proof, such as a verified credit card or an adult identification number or code. § 223(e)(5)(B).

...
In arguing for reversal, the Government contends that the CDA is plainly constitutional under three of our prior decisions: (1) Ginsberg v. New York, 390 U.S. 629 (1968); (2) FCC v. Pacifica Foundation, 438 U.S. 726 (1978); and (3) Renton v. Playtime Theatres, Inc., 475 U.S. 41 (1986). A close look at these cases, however, raises--rather than relieves--doubts concerning the constitutionality of the CDA.

In Ginsberg, we upheld the constitutionality of a New York statute that prohibited selling to minors under 17 years of age material that was considered obscene as to them even if not obscene as to adults. We rejected the defendant's broad submission that "the scope of the constitutional freedom of expression secured to a citizen to read or see material concerned with sex cannot be made to depend on whether the citizen is an adult or a minor." In rejecting that contention, we relied not only on the State's independent interest in the well-being of its youth, but also on our consistent recognition of the principle that "the parents' claim to authority in their own household to direct the rearing of their children is basic in the structure of our society." In four important respects, the statute upheld in Ginsberg was narrower than the CDA. First, we noted in Ginsberg that "the prohibition against sales to minors does not bar parents who so desire from purchasing the magazines for their children." Under the CDA, by contrast, neither the parents' consent--nor even their participation--in the communication would avoid the application of the statute. Second, the New York statute applied only to commercial transactions, whereas the CDA contains no such limitation. Third, the New York statute cabined its definition of material that is harmful to minors with the requirement that it be "utterly without redeeming social importance for minors." The CDA fails to provide us with any definition of the term "indecent" as used in § 223(a)(1) and, importantly, omits any requirement that the "patently offensive" material covered by § 223(d) lack serious literary, artistic, political, or scientific value. Fourth, the New York statute defined a minor as a person under the age of 17, whereas the CDA, in applying to all those under 18 years, includes an additional year of those nearest majority.

In Pacifica, we upheld a declaratory order of the Federal Communications Commission, holding that the broadcast of a recording of a 12-minute monologue entitled "Filthy Words" that had previously been delivered to a live audience "could have been the subject of administrative sanctions." The Commission had found that the repetitive use of certain words referring to excretory or sexual activities or organs "in an afternoon broadcast when children are in the audience was patently offensive" and concluded that the monologue was indecent "as broadcast." The respondent did not quarrel with the finding that the afternoon broadcast was patently offensive, but contended that it was not "indecent" within the meaning of the relevant statutes because it contained no prurient appeal. After rejecting respondent's statutory arguments, we confronted its two constitutional arguments: (1) that the Commission's construction of its authority to ban indecent speech was so broad that its order had to be set aside even if the broadcast at issue was unprotected; and (2) that since the recording was not obscene, the First Amendment forbade any abridgement of the right to broadcast it on the radio.

In the portion of the lead opinion not joined by Justices Powell and Blackmun, the plurality stated that the First Amendment does not prohibit all governmental regulation that depends on the content of speech. Accordingly, the availability of constitutional protection for a vulgar and offensive monologue that was not obscene depended on the context of the broadcast. Relying on the premise that "of all forms of communication" broadcasting had received the most limited First Amendment protection, the Court concluded that the ease with which children may obtain access to broadcasts, "coupled with the concerns recognized in Ginsberg," justified special treatment of indecent broadcasting.

As with the New York statute at issue in Ginsberg, there are significant differences between the order upheld in Pacifica and the CDA. First, the order in Pacifica, issued by an agency that had been regulating radio stations for decades, targeted a specific broadcast that represented a rather dramatic departure from traditional program content in order to designate when--rather than whether--it would be permissible to air such a program in that particular medium. The CDA's broad categorical prohibitions are not limited to particular times and are not dependent on any evaluation by an agency familiar with the unique characteristics of the Internet. Second, unlike the CDA, the Commission's declaratory order was not punitive; we expressly refused to decide whether the indecent broadcast "would justify a criminal prosecution." Finally, the Commission's order applied to a medium which as a matter of history had "received the most limited First Amendment protection," in large part because warnings could not adequately protect the listener from unexpected program content. The Internet, however, has no comparable history. Moreover, the District Court found that the risk of encountering indecent material by accident is remote because a series of affirmative steps is required to access specific material.

In Renton, we upheld a zoning ordinance that kept adult movie theatres out of residential neighborhoods. The ordinance was aimed, not at the content of the films shown in the theaters, but rather at the "secondary effects"--such as crime and deteriorating property values--that these theaters fostered: "'It is the secondary effect which these zoning ordinances attempt to avoid, not the dissemination of "offensive" speech.'" According to the Government, the CDA is constitutional because it constitutes a sort of "cyberzoning" on the Internet. But the CDA applies broadly to the entire universe of cyberspace. And the purpose of the CDA is to protect children from the primary effects of "indecent" and "patently offensive" speech, rather than any "secondary" effect of such speech. Thus, the CDA is a content-based blanket restriction on speech, and, as such, cannot be "properly analyzed as a form of time, place, and manner regulation." See also Boos v. Barry, 485 U.S. 312, 321 (1988)  ("Regulations that focus on the direct impact of speech on its audience" are not properly analyzed under Renton); Forsyth County v. Nationalist Movement, 505 U.S. 123 (1992) ("Listeners' reaction to speech is not a content-neutral basis for regulation").

These precedents, then, surely do not require us to uphold the CDA and are fully consistent with the application of the most stringent review of its provisions.

In Southeastern Promotions, Ltd. v. Conrad, 420 U.S. 546 (1975), we observed that "each medium of expression . . . may present its own problems." Thus, some of our cases have recognized special justifications for regulation of the broadcast media that are not applicable to other speakers, see Red Lion Broadcasting Co. v. FCC, 395 U.S. 367 (1969); FCC v. Pacifica Foundation, 438 U.S. 726 (1978). In these cases, the Court relied on the history of extensive government regulation of the broadcast medium, see, e.g., Red Lion, 395 U.S. at 399-400; the scarcity of available frequencies at its inception, see, e.g., Turner Broadcasting System, Inc. v. FCC, 512 U.S. 622, 637-638 (1994); and its "invasive" nature, see Sable Communications of Cal., Inc. v. FCC, 492 U.S. 115, 128 (1989).

Those factors are not present in cyberspace. Neither before nor after the enactment of the CDA have the vast democratic fora of the Internet been subject to the type of government supervision and regulation that has attended the broadcast industry. Moreover, the Internet is not as "invasive" as radio or television. The District Court specifically found that "communications over the Internet do not 'invade' an individual's home or appear on one's computer screen unbidden. Users seldom encounter content 'by accident.'" It also found that "almost all sexually explicit images are preceded by warnings as to the content," and cited testimony that "'odds are slim' that a user would come across a sexually explicit sight by accident."

We distinguished Pacifica in Sable, on just this basis. In Sable, a company engaged in the business of offering sexually oriented prerecorded telephone messages (popularly known as "dial-a-porn") challenged the constitutionality of an amendment to the Communications Act that imposed a blanket prohibition on indecent as well as obscene interstate commercial telephone messages. We held that the statute was constitutional insofar as it applied to obscene messages but invalid as applied to indecent messages. In attempting to justify the complete ban and criminalization of indecent commercial telephone messages, the Government relied on Pacifica, arguing that the ban was necessary to prevent children from gaining access to such messages. We agreed that "there is a compelling interest in protecting the physical and psychological well-being of minors" which extended to shielding them from indecent messages that are not obscene by adult standards but distinguished our "emphatically narrow holding" in Pacifica because it did not involve a complete ban and because it involved a different medium of communication. We explained that "the dial-it medium requires the listener to take affirmative steps to receive the communication." "Placing a telephone call," we continued, "is not the same as turning on a radio and being taken by surprise by an indecent message."

Finally, unlike the conditions that prevailed when Congress first authorized regulation of the broadcast spectrum, the Internet can hardly be considered a "scarce" expressive commodity. It provides relatively unlimited, low-cost capacity for communication of all kinds. The Government estimates that "as many as 40 million people use the Internet today, and that figure is expected to grow to 200 million by 1999." This dynamic, multifaceted category of communication includes not only traditional print and news services, but also audio, video, and still images, as well as interactive, real-time dialogue. Through the use of chat rooms, any person with a phone line can become a town crier with a voice that resonates farther than it could from any soapbox. Through the use of Web pages, mail exploders, and newsgroups, the same individual can become a pamphleteer. As the District Court found, "the content on the Internet is as diverse as human thought." We agree with its conclusion that our cases provide no basis for qualifying the level of First Amendment scrutiny that should be applied to this medium.

Regardless of whether the CDA is so vague that it violates the Fifth Amendment, the many ambiguities concerning the scope of its coverage render it problematic for purposes of the First Amendment. For instance, each of the two parts of the CDA uses a different linguistic form. The first uses the word "indecent," 47 U.S.C.A. § 223(a) (Supp. 1997), while the second speaks of material that "in context, depicts or describes, in terms patently offensive as measured by contemporary community standards, sexual or excretory activities or organs," § 223(d). Given the absence of a definition of either term, this difference in language will provoke uncertainty among speakers about how the two standards relate to each other and just what they mean. Could a speaker confidently assume that a serious discussion about birth control practices, homosexuality, the First Amendment issues raised by the Appendix to our Pacifica opinion, or the consequences of prison rape would not violate the CDA? This uncertainty undermines the likelihood that the CDA has been carefully tailored to the congressional goal of protecting minors from potentially harmful materials.
 

The vagueness of the CDA is a matter of special concern for two reasons. First, the CDA is a content-based regulation of speech. The vagueness of such a regulation raises special First Amendment concerns because of its obvious chilling effect on free speech. Second, the CDA is a criminal statute. In addition to the opprobrium and stigma of a criminal conviction, the CDA threatens violators with penalties including up to two years in prison for each act of violation. The severity of criminal sanctions may well cause speakers to remain silent rather than communicate even arguably unlawful words, ideas, and images. As a practical matter, this increased deterrent effect, coupled with the "risk of discriminatory enforcement" of vague regulations, poses greater First Amendment concerns than those implicated by the civil regulation reviewed in Denver Area Ed. Telecommunications Consortium, Inc. v. FCC, 518 U.S. __ (1996).

The Government argues that the statute is no more vague than the obscenity standard this Court established in Miller v. California, 413 U.S. 15 (1973). But that is not so. In Miller, this Court reviewed a criminal conviction against a commercial vendor who mailed brochures containing pictures of sexually explicit activities to individuals who had not requested such materials. Having struggled for some time to establish a definition of obscenity, we set forth in Miller the test for obscenity that controls to this day:

"(a) whether the average person, applying contemporary community standards would find that the work, taken as a whole, appeals to the prurient interest; (b) whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and (c) whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value."

Because the CDA's "patently offensive" standard (and, we assume arguendo, its synonymous "indecent" standard) is one part of the three-prong Miller test, the Government reasons, it cannot be unconstitutionally vague.

The Government's assertion is incorrect as a matter of fact. The second prong of the Miller test--the purportedly analogous standard--contains a critical requirement that is omitted from the CDA: that the proscribed material be "specifically defined by the applicable state law." This requirement reduces the vagueness inherent in the open-ended term "patently offensive" as used in the CDA. Moreover, the Miller definition is limited to "sexual conduct," whereas the CDA extends also to include (1) "excretory activities" as well as (2) "organs" of both a sexual and excretory nature.
 
The Government's reasoning is also flawed. Just because a definition including three limitations is not vague, it does not follow that one of those limitations, standing by itself, is not vague. Each of Miller's additional two prongs--(1) that, taken as a whole, the material appeal to the "prurient" interest, and (2) that it "lack serious literary, artistic, political, or scientific value"--critically limits the uncertain sweep of the obscenity definition. The second requirement is particularly important because, unlike the "patently offensive" and "prurient interest" criteria, it is not judged by contemporary community standards. This "societal value" requirement, absent in the CDA, allows appellate courts to impose some limitations and regularity on the definition by setting, as a matter of law, a national floor for socially redeeming value. The Government's contention that courts will be able to give such legal limitations to the CDA's standards is belied by Miller's own rationale for having juries determine whether material is "patently offensive" according to community standards: that such questions are essentially ones of fact.*

* (Determinations of "what appeals to the 'prurient interest' or is 'patently offensive' . . . are essentially questions of fact, and our Nation is simply too big and too diverse for this Court to reasonably expect that such standards could be articulated for all 50 States in a single formulation, even assuming the prerequisite consensus exists"). The CDA, which implements the "contemporary community standards" language of Miller, thus conflicts with the Conferees' own assertion that the CDA was intended "to establish a uniform national standard of content regulation." S. Conf. Rep., at 191.
 
In contrast to Miller and our other previous cases, the CDA thus presents a greater threat of censoring speech that, in fact, falls outside the statute's scope. Given the vague contours of the coverage of the statute, it unquestionably silences some speakers whose messages would be entitled to constitutional protection. That danger provides further reason for insisting that the statute not be overly broad. The CDA's burden on protected speech cannot be justified if it could be avoided by a more carefully drafted statute.

VII
 
We are persuaded that the CDA lacks the precision that the First Amendment requires when a statute regulates the content of speech. In order to deny minors access to potentially harmful speech, the CDA effectively suppresses a large amount of speech that adults have a constitutional right to receive and to address to one another. That burden on adult speech is unacceptable if less restrictive alternatives would be at least as effective in achieving the legitimate purpose that the statute was enacted to serve.
 

 In evaluating the free speech rights of adults, we have made it perfectly clear that "sexual expression which is indecent but not obscene is protected by the First Amendment." Indeed, Pacifica itself admonished that "the fact that society may find speech offensive is not a sufficient reason for suppressing it."
 
It is true that we have repeatedly recognized the governmental interest in protecting children from harmful materials. But that interest does not justify an unnecessarily broad suppression of speech addressed to adults. As we have explained, the Government may not "reduce the adult population . . .  to . . . only what is fit for children." "Regardless of the strength of the government's interest" in protecting children, "the level of discourse reaching a mailbox simply cannot be limited to that which would be suitable for a sandbox." Bolger v. Youngs Drug Products Corp., 463 U.S. 60, 74-75 (1983).

The District Court was correct to conclude that the CDA effectively resembles the ban on "dial-a-porn" invalidated in Sable. In Sable, this Court rejected the argument that we should defer to the congressional judgment that nothing less than a total ban would be effective in preventing enterprising youngsters from gaining access to indecent communications. Sable thus made clear that the mere fact that a statutory regulation of speech was enacted for the important purpose of protecting children from exposure to sexually explicit material does not foreclose inquiry into its validity. As we pointed out last Term, that inquiry embodies an "over-arching commitment" to make sure that Congress has designed its statute to accomplish its purpose "without imposing an unnecessarily great restriction on speech."

In arguing that the CDA does not so diminish adult communication, the Government relies on the incorrect factual premise that prohibiting a transmission whenever it is known that one of its recipients is a minor would not interfere with adult-to-adult communication. The findings of the District Court make clear that this premise is untenable. Given the size of the potential audience for most messages, in the absence of a viable age verification process, the sender must be charged with knowing that one or more minors will likely view it. Knowledge that, for instance, one or more members of a 100-person chat group will be a minor--and therefore that it would be a crime to send the group an indecent message--would surely burden communication among adults.

The District Court found that at the time of trial existing technology did not include any effective method for a sender to prevent minors from obtaining access to its communications on the Internet without also denying access to adults. The Court found no effective way to determine the age of a user who is accessing material through e-mail, mail exploders, newsgroups, or chat rooms. As a practical matter, the Court also found that it would be prohibitively expensive for noncommercial--as well as some commercial--speakers who have Web sites to verify that their users are adults. These limitations must inevitably curtail a significant amount of adult communication on the Internet. By contrast, the District Court found that "despite its limitations, currently available user-based software suggests that a reasonably effective method by which parents can prevent their children from accessing sexually explicit and other material which parents may believe is inappropriate for their children will soon be widely available."

The breadth of the CDA's coverage is wholly unprecedented. Unlike the regulations upheld in Ginsberg and Pacifica, the scope of the CDA is not limited to commercial speech or commercial entities. Its open-ended prohibitions embrace all nonprofit entities and individuals posting indecent messages or displaying them on their own computers in the presence of minors. The general, undefined terms "indecent" and "patently offensive" cover large amounts of nonpornographic material with serious educational or other value. Moreover, the "community standards" criterion as applied to the Internet means that any communication available to a nation-wide audience will be judged by the standards of the community most likely to be offended by the message. The regulated subject matter includes any of the seven "dirty words" used in the Pacifica monologue, the use of which the Government's expert acknowledged could constitute a felony. It may also extend to discussions about prison rape or safe sexual practices, artistic images that include nude subjects, and arguably the card catalogue of the Carnegie Library.

For the purposes of our decision, we need neither accept nor reject the Government's submission that the First Amendment does not forbid a blanket prohibition on all "indecent" and "patently offensive" messages communicated to a 17-year old--no matter how much value the message may contain and regardless of parental approval. It is at least clear that the strength of the Government's interest in protecting minors is not equally strong throughout the coverage of this broad statute. Under the CDA, a parent allowing her 17-year-old to use the family computer to obtain information on the Internet that she, in her parental judgment, deems appropriate could face a lengthy prison term. See 47 U.S.C.A. § 223(a)(2) (Supp. 1997). Similarly, a parent who sent his 17-year-old college freshman information on birth control via e-mail could be incarcerated even though neither he, his child, nor anyone in their home community, found the material "indecent" or "patently offensive," if the college town's community thought otherwise.

The breadth of this content-based restriction of speech imposes an especially heavy burden on the Government to explain why a less restrictive provision would not be as effective as the CDA. It has not done so. The arguments in this Court have referred to possible alternatives such as requiring that indecent material be "tagged" in a way that facilitates parental control of material coming into their homes, making exceptions for messages with artistic or educational value, providing some tolerance for parental choice, and regulating some portions of the Internet--such as commercial web sites--differently than others, such as chat rooms. Particularly in the light of the absence of any detailed findings by the Congress, or even hearings addressing the special problems of the CDA, we are persuaded that the CDA is not narrowly tailored if that requirement has any meaning at all.

In an attempt to curtail the CDA's facial overbreadth, the Government advances three additional arguments for sustaining the Act's affirmative prohibitions: (1) that the CDA is constitutional because it leaves open ample "alternative channels" of communication; (2) that the plain meaning of the Act's "knowledge" and "specific person" requirement significantly restricts its permissible applications; and (3) that the Act's prohibitions are "almost always" limited to material lacking redeeming social value.
 
The Government first contends that, even though the CDA effectively censors discourse on many of the Internet's modalities--such as chat groups, newsgroups, and mail exploders--it is nonetheless constitutional because it provides a "reasonable opportunity" for speakers to engage in the restricted speech on the World Wide Web. This argument is unpersuasive because the CDA regulates speech on the basis of its content. A "time, place, and manner" analysis is therefore inapplicable. See Consolidated Edison Co. of N.Y. v. Public Serv. Comm'n of N.Y., 447 U.S. 530, 536 (1980). It is thus immaterial whether such speech would be feasible on the Web (which, as the Government's own expert acknowledged, would cost up to $10,000 if the speaker's interests were not accommodated by an existing Web site, not including costs for database management and age verification). The Government's position is equivalent to arguing that a statute could ban leaflets on certain subjects as long as individuals are free to publish books. In invalidating a number of laws that banned leafletting on the streets regardless of their content, we explained that "one is not to have the exercise of his liberty of expression in appropriate places abridged on the plea that it may be exercised in some other place." Schneider v. State (Town of Irvington), 308 U.S. 147, 163 (1939).
 
The Government also asserts that the "knowledge" requirement of both §§ 223(a) and (d), especially when coupled with the "specific child" element found in § 223(d), saves the CDA from overbreadth. Because both sections prohibit the dissemination of indecent messages only to persons known to be under 18, the Government argues, it does not require transmitters to "refrain from communicating indecent material to adults; they need only refrain from disseminating such materials to persons they know to be under 18." Brief for Appellants 24. This argument ignores the fact that most Internet fora--including chat rooms, newsgroups, mail exploders, and the Web--are open to all comers. The Government's assertion that the knowledge requirement somehow protects the communications of adults is therefore untenable. Even the strongest reading of the "specific person" requirement of § 223(d) cannot save the statute. It would confer broad powers of censorship, in the form of a "heckler's veto," upon any opponent of indecent speech who might simply log on and inform the would-be discoursers that his 17-year-old child--a "specific person . . . under 18 years of age," 47 U.S.C.A. § 223(d)(1)(A) (Supp. 1997)--would be present.
 
Finally, we find no textual support for the Government's submission that material having scientific, educational, or other redeeming social value will necessarily fall outside the CDA's "patently offensive" and "indecent" prohibitions.

The Government's three remaining arguments focus on the defenses provided in § 223(e)(5). First, relying on the "good faith, reasonable, effective, and appropriate actions" provision, the Government suggests that "tagging" provides a defense that saves the constitutionality of the Act. The suggestion assumes that transmitters may encode their indecent communications in a way that would indicate their contents, thus permitting recipients to block their reception with appropriate software. It is the requirement that the good faith action must be "effective" that makes this defense illusory. The Government recognizes that its proposed screening software does not currently exist. Even if it did, there is no way to know whether a potential recipient will actually block the encoded material. Without the impossible knowledge that every guardian in America is screening for the "tag," the transmitter could not reasonably rely on its action to be "effective."

For its second and third arguments concerning defenses--which we can consider together--the Government relies on the latter half of § 223(e)(5), which applies when the transmitter has restricted access by requiring use of a verified credit card or adult identification. Such verification is not only technologically available but actually is used by commercial providers of sexually explicit material. These providers, therefore, would be protected by the defense. Under the findings of the District Court, however, it is not economically feasible for most noncommercial speakers to employ such verification. Accordingly, this defense would not significantly narrow the statute's burden on noncommercial speech. Even with respect to the commercial pornographers that would be protected by the defense, the Government failed to adduce any evidence that these verification techniques actually preclude minors from posing as adults. Given that the risk of criminal sanctions "hovers over each content provider, like the proverbial sword of Damocles," the District Court correctly refused to rely on unproven future technology to save the statute. The Government thus failed to prove that the proffered defense would significantly reduce the heavy burden on adult speech produced by the prohibition on offensive displays.

We agree with the District Court's conclusion that the CDA places an unacceptably heavy burden on protected speech, and that the defenses do not constitute the sort of "narrow tailoring" that will save an otherwise patently invalid unconstitutional provision. In Sable, we remarked that the speech restriction at issue there amounted to "'burning the house to roast the pig.'" The CDA, casting a far darker shadow over free speech, threatens to torch a large segment of the Internet community.

In this Court, though not in the District Court, the Government asserts that--in addition to its interest in protecting children--its "equally significant" interest in fostering the growth of the Internet provides an independent basis for upholding the constitutionality of the CDA. The Government apparently assumes that the unregulated availability of "indecent" and "patently offensive" material on the Internet is driving countless citizens away from the medium because of the risk of exposing themselves or their children to harmful material.
 
We find this argument singularly unpersuasive. The dramatic expansion of this new marketplace of ideas contradicts the factual basis of this contention. The record demonstrates that the growth of the Internet has been and continues to be phenomenal. As a matter of constitutional tradition, in the absence of evidence to the contrary, we presume that governmental regulation of the content of speech is more likely to interfere with the free exchange of ideas than to encourage it. The interest in encouraging freedom of expression in a democratic society outweighs any theoretical but unproven benefit of censorship.

For the foregoing reasons, the judgment of the District Court is affirmed.

It is so ordered.

JUSTICE O'CONNOR, with whom THE CHIEF JUSTICE joins, concurring in the judgment in part and dissenting in part.

I write separately to explain why I view the Communications Decency Act of 1996 (CDA) as little more than an attempt by Congress to create "adult zones" on the Internet. Our precedent indicates that the creation of such zones can be constitutionally sound. Despite the soundness of its purpose, however, portions of the CDA are unconstitutional because they stray from the blueprint our prior cases have developed for constructing a "zoning law" that passes constitutional muster.

Appellees bring a facial challenge to three provisions of the CDA. The first, which the Court describes as the "indecency transmission" provision, makes it a crime to knowingly transmit an obscene or indecent message or image to a person the sender knows is under 18 years old. 47 U.S.C.A. § 223(a)(1)(B) (May 1996 Supp.). What the Court classifies as a single "'patently offensive display'" provision, see ante, at 11, is in reality two separate provisions. The first of these makes it a crime to knowingly send a patently offensive message or image to a specific person under the age of 18 ("specific person" provision). § 223(d)(1)(A). The second criminalizes the display of patently offensive messages or images "in any manner available" to minors ("display" provision). § 223(d)(1)(B). None of these provisions purports to keep indecent (or patently offensive) material away from adults, who have a First Amendment right to obtain this speech. Thus, the undeniable purpose of the CDA is to segregate indecent material on the Internet into certain areas that minors cannot access. See S. Conf. Rep. No. 104-230, p. 189 (1996) (CDA imposes "access restrictions . . . to protect minors from exposure to indecent material").

The creation of "adult zones" is by no means a novel concept. States have long denied minors access to certain establishments frequented by adults. States have also denied minors access to speech deemed to be "harmful to minors." The Court has previously sustained such zoning laws, but only if they respect the First Amendment rights of adults and minors. That is to say, a zoning law is valid if (i) it does not unduly restrict adult access to the material; and (ii) minors have no First Amendment right to read or view the banned material. As applied to the Internet as it exists in 1997, the "display" provision and some applications of the "indecency transmission" and "specific person" provisions fail to adhere to the first of these limiting principles by restricting adults' access to protected materials in certain circumstances. Unlike the Court, however, I would invalidate the provisions only in those circumstances.

Our cases make clear that a "zoning" law is valid only if adults are still able to obtain the regulated speech. If they cannot, the law does more than simply keep children away from speech they have no right to obtain--it interferes with the rights of adults to obtain constitutionally protected speech and effectively "reduces the adult population . . . to reading only what is fit for children." Butler v. Michigan, 352 U.S. 380, 383 (1957). The First Amendment does not tolerate such interference. See id., at 383 (striking down a Michigan criminal law banning sale of books--to minors or adults--that contained words or pictures that "'tended to . . . corrupt the morals of youth'"); Sable Communications, supra (invalidating federal law that made it a crime to transmit indecent, but nonobscene, commercial telephone messages to minors and adults); Bolger v. Youngs Drug Products Corp., 463 U.S. 60, 74 (1983) (striking down a federal law prohibiting the mailing of unsolicited advertisements for contraceptives). If the law does not unduly restrict adults' access to constitutionally protected speech, however, it may be valid. In Ginsberg v. New York, 390 U.S. 629, 634 (1968), for example, the Court sustained a New York law that barred store owners from selling pornographic magazines to minors in part because adults could still buy those magazines.

The Court in Ginsberg concluded that the New York law created a constitutionally adequate adult zone simply because, on its face, it denied access only to minors. The Court did not question--and therefore necessarily assumed--that an adult zone, once created, would succeed in preserving adults' access while denying minors' access to the regulated speech. Before today, there was no reason to question this assumption, for the Court has previously only considered laws that operated in the physical world, a world with two characteristics that make it possible to create "adult zones": geography and identity. See Lessig, Reading the Constitution in Cyberspace, 45 Emory L. J. 869, 886 (1996). A minor can see an adult dance show only if he enters an establishment that provides such entertainment. And should he attempt to do so, the minor will not be able to conceal completely his identity (or, consequently, his age). Thus, the twin characteristics of geography and identity enable the establishment's proprietor to prevent children from entering the establishment, but to let adults inside.

The electronic world is fundamentally different. Because it is no more than the interconnection of electronic pathways, cyberspace allows speakers and listeners to mask their identities. Cyberspace undeniably reflects some form of geography; chat rooms and Web sites, for example, exist at fixed "locations" on the Internet. Since users can transmit and receive messages on the Internet without revealing anything about their identities or ages, see Lessig, supra, at 901, however, it is not currently possible to exclude persons from accessing certain messages on the basis of their identity.

Cyberspace differs from the physical world in another basic way: Cyberspace is malleable. Thus, it is possible to construct barriers in cyberspace and use them to screen for identity, making cyberspace more like the physical world and, consequently, more amenable to zoning laws. This transformation of cyberspace is already underway. Lessig, supra, at 888-889. Id., at 887 (cyberspace "is moving . . . from a relatively unzoned place to a universe that is extraordinarily well zoned"). Internet speakers (users who post material on the Internet) have begun to zone cyberspace itself through the use of "gateway" technology. Such technology requires Internet users to enter information about themselves--perhaps an adult identification number or a credit card number--before they can access certain areas of cyberspace, much like a bouncer checks a person's driver's license before admitting him to a nightclub. Internet users who access information have not attempted to zone cyberspace itself, but have tried to limit their own power to access information in cyberspace, much as a parent controls what her children watch on television by installing a lock box. This user-based zoning is accomplished through the use of screening software (such as Cyber Patrol or SurfWatch) or browsers with screening capabilities, both of which search addresses and text for keywords that are associated with "adult" sites and, if the user wishes, block access to such sites. The Platform for Internet Content Selection (PICS) project is designed to facilitate user-based zoning by encouraging Internet speakers to rate the content of their speech using codes recognized by all screening programs.

Despite this progress, the transformation of cyberspace is not complete. Although gateway technology has been available on the World Wide Web for some time now, it is not available to all Web speakers, and is just now becoming technologically feasible for chat rooms and USENET newsgroups. Gateway technology is not ubiquitous in cyberspace, and because without it "there is no means of age verification," cyberspace still remains largely unzoned--and unzoneable. User-based zoning is also in its infancy. For it to be effective, (i) an agreed-upon code (or "tag") would have to exist; (ii) screening software or browsers with screening capabilities would have to be able to recognize the "tag"; and (iii) those programs would have to be widely available--and widely used--by Internet users. At present, none of these conditions is true. Screening software "is not in wide use today" and "only a handful of browsers have screening capabilities." There is, moreover, no agreed-upon "tag" for those programs to recognize.

Although the prospects for the eventual zoning of the Internet appear promising, I agree with the Court that we must evaluate the constitutionality of the CDA as it applies to the Internet as it exists today. Ante, at 36. Given the present state of cyberspace, I agree with the Court that the "display" provision cannot pass muster. Until gateway technology is available throughout cyberspace, and it is not in 1997, a speaker cannot be reasonably assured that the speech he displays will reach only adults because it is impossible to confine speech to an "adult zone." Thus, the only way for a speaker to avoid liability under the CDA is to refrain completely from using indecent speech. But this forced silence impinges on the First Amendment right of adults to make and obtain this speech and, for all intents and purposes, "reduces the adult population [on the Internet] to reading only what is fit for children." Butler, 352 U.S. at 383. As a result, the "display" provision cannot withstand scrutiny.

The "indecency transmission" and "specific person" provisions present a closer issue, for they are not unconstitutional in all of their applications. As discussed above, the "indecency transmission" provision makes it a crime to transmit knowingly an indecent message to a person the sender knows is under 18 years of age. 47 U.S.C.A. § 223(a)(1)(B) (May 1996 Supp.). The "specific person" provision proscribes the same conduct, although it does not as explicitly require the sender to know that the intended recipient of his indecent message is a minor. § 223(d)(1)(A). Appellant urges the Court to construe the provision to impose such a knowledge requirement, and I would do so.

So construed, both provisions are constitutional as applied to a conversation involving only an adult and one or more minors--e.g., when an adult speaker sends an e-mail knowing the addressee is a minor, or when an adult and minor converse by themselves or with other minors in a chat room. In this context, these provisions are no different from the law we sustained in Ginsberg. Restricting what the adult may say to the minors in no way restricts the adult's ability to communicate with other adults. He is not prevented from speaking indecently to other adults in a chat room (because there are no other adults participating in the conversation) and he remains free to send indecent e-mails to other adults. The relevant universe contains only one adult, and the adult in that universe has the power to refrain from using indecent speech and consequently to keep all such speech within the room in an "adult" zone.

The analogy to Ginsberg breaks down, however, when more than one adult is a party to the conversation. If a minor enters a chat room otherwise occupied by adults, the CDA effectively requires the adults in the room to stop using indecent speech. If they did not, they could be prosecuted under the "indecency transmission" and "specific person" provisions for any indecent statements they make to the group, since they would be transmitting an indecent message to specific persons, one of whom is a minor. The CDA is therefore akin to a law that makes it a crime for a bookstore owner to sell pornographic magazines to anyone once a minor enters his store. Even assuming such a law might be constitutional in the physical world as a reasonable alternative to excluding minors completely from the store, the absence of any means of excluding minors from chat rooms in cyberspace restricts the rights of adults to engage in indecent speech in those rooms. The "indecency transmission" and "specific person" provisions share this defect.

But these two provisions do not infringe on adults' speech in all situations. And as discussed below, I do not find that the provisions are overbroad in the sense that they restrict minors' access to a substantial amount of speech that minors have the right to read and view. Accordingly, the CDA can be applied constitutionally in some situations. Normally, this fact would require the Court to reject a direct facial challenge. Appellees' claim arises under the First Amendment, however, and they argue that the CDA is facially invalid because it is "substantially overbroad"--that is, it "sweeps too broadly . . . [and] penalizes a substantial amount of speech that is constitutionally protected." I agree with the Court that the provisions are overbroad in that they cover any and all communications between adults and minors, regardless of how many adults might be part of the audience to the communication.

This conclusion does not end the matter, however. Where, as here, "the parties challenging the statute are those who desire to engage in protected speech that the overbroad statute purports to punish . . . the statute may forthwith be declared invalid to the extent that it reaches too far, but otherwise left intact." There is no question that Congress intended to prohibit certain communications between one adult and one or more minors. See 47 U.S.C.A. § 223(a)(1)(B) (May 1996 Supp.) (punishing "whoever . . . initiates the transmission of [any indecent communication] knowing that the recipient of the communication is under 18 years of age"); § 223(d)(1)(A) (punishing "whoever . . . sends to a specific person or persons under 18 years of age [a patently offensive message]"). There is also no question that Congress would have enacted a narrower version of these provisions had it known a broader version would be declared unconstitutional. 47 U.S.C. § 608 ("If . . . the application [of any provision of the CDA] to any person or circumstance is held invalid, . . . the application of such provision to other persons or circumstances shall not be affected thereby"). I would therefore sustain the "indecency transmission" and "specific person" provisions to the extent they apply to the transmission of Internet communications where the party initiating the communication knows that all of the recipients are minors.

II
Whether the CDA substantially interferes with the First Amendment rights of minors, and thereby runs afoul of the second characteristic of valid zoning laws, presents a closer question. In Ginsberg, the New York law we sustained prohibited the sale to minors of magazines that were "harmful to minors." Under that law, a magazine was "harmful to minors" only if it was obscene as to minors. Noting that obscene speech is not protected by the First Amendment, and that New York was constitutionally free to adjust the definition of obscenity for minors, the Court concluded that the law did not "invade the area of freedom of expression constitutionally secured to minors." New York therefore did not infringe upon the First Amendment rights of minors.

The Court neither "accepts nor rejects" the argument that the CDA is facially overbroad because it substantially interferes with the First Amendment rights of minors. I would reject it. Ginsberg established that minors may constitutionally be denied access to material that is obscene as to minors. As Ginsberg explained, material is obscene as to minors if it (i) is "patently offensive to prevailing standards in the adult community as a whole with respect to what is suitable . . . for minors"; (ii) appeals to the prurient interest of minors; and (iii) is "utterly without redeeming social importance for minors." Because the CDA denies minors the right to obtain material that is "patently offensive"--even if it has some redeeming value for minors and even if it does not appeal to their prurient interests--Congress' rejection of the Ginsberg "harmful to minors" standard means that the CDA could ban some speech that is "indecent" (i.e., "patently offensive") but that is not obscene as to minors.

I do not deny this possibility, but to prevail in a facial challenge, it is not enough for a plaintiff to show "some" overbreadth. Our cases require a proof of "real" and "substantial" overbreadth, and appellees have not carried their burden in this case. In my view, the universe of speech constitutionally protected as to minors but banned by the CDA--i.e., the universe of material that is "patently offensive," but which nonetheless has some redeeming value for minors or does not appeal to their prurient interest--is a very small one. Appellees cite no examples of speech falling within this universe and do not attempt to explain why that universe is substantial "in relation to the statute's plainly legitimate sweep." That the CDA might deny minors the right to obtain material that has some "value" is largely beside the point. While discussions about prison rape or nude art may have some redeeming education value for adults, they do not necessarily have any such value for minors, and under Ginsberg, minors only have a First Amendment right to obtain patently offensive material that has "redeeming social importance for minors." There is also no evidence in the record to support the contention that "many [e]-mail transmissions from an adult to a minor are conversations between family members," and no support for the legal proposition that such speech is absolutely immune from regulation. Accordingly, in my view, the CDA does not burden a substantial amount of minors' constitutionally protected speech.

Thus, the constitutionality of the CDA as a zoning law hinges on the extent to which it substantially interferes with the First Amendment rights of adults. Because the rights of adults are infringed only by the "display" provision and by the "indecency transmission" and "specific person" provisions as applied to communications involving more than one adult, I would invalidate the CDA only to that extent. Insofar as the "indecency transmission" and "specific person" provisions prohibit the use of indecent speech in communications between an adult and one or more minors, however, they can and should be sustained. The Court reaches a contrary conclusion, and from that holding I respectfully dissent.

United States v. American Library Association, 539 U.S. 194 (2003)

Chief Justice Rehnquist announced the judgment of the Court and delivered an opinion, in which Justice O'Connor, Justice Scalia, and Justice Thomas joined.

To address the problems associated with the availability of Internet pornography in public libraries, Congress enacted the Children's Internet Protection Act (CIPA), 114 Stat. 2763A-335. Under CIPA, a public library may not receive federal assistance to provide Internet access unless it installs software to block images that constitute obscenity or child pornography, and to prevent minors from obtaining access to material that is harmful to them. The District Court held these provisions facially invalid on the ground that they induce public libraries to violate patrons' First Amendment rights. We now reverse.

To help public libraries provide their patrons with Internet access, Congress offers two forms of federal assistance. First, the E-rate program established by the Telecommunications Act of 1996 entitles qualifying libraries to buy Internet access at a discount. In the year ending June 30, 2002, libraries received $58.5 million in such discounts. Second, pursuant to the Library Services and Technology Act (LSTA), the Institute of Museum and Library Services makes grants to state library administrative agencies to "electronically lin[k] libraries with educational, social, or information services," "assis[t] libraries in accessing information through electronic networks," and "pa[y] costs for libraries to acquire or share computer systems and telecommunications technologies." In fiscal year 2002, Congress appropriated more than $149 million in LSTA grants. These programs have succeeded greatly in bringing Internet access to public libraries: By 2000, 95% of the Nation's libraries provided public Internet access.

By connecting to the Internet, public libraries provide patrons with a vast amount of valuable information. But there is also an enormous amount of pornography on the Internet, much of which is easily obtained. 201 F. Supp. 2d 401, 419 (ED Pa. 2002). The accessibility of this material has created serious problems for libraries, which have found that patrons of all ages, including minors, regularly search for online pornography. Id., at 406. Some patrons also expose others to pornographic images by leaving them displayed on Internet terminals or printed at library printers. Id., at 423.

Upon discovering these problems, Congress became concerned that the E-rate and LSTA programs were facilitating access to illegal and harmful pornography. S. Rep. No. 105-226, p. 5 (1998). Congress learned that adults "us[e] library computers to access pornography that is then exposed to staff, passersby, and children," and that "minors acces[s] child and adult pornography in libraries."

But Congress also learned that filtering software that blocks access to pornographic Web sites could provide a reasonably effective way to prevent such uses of library resources. Id., at 20-26. By 2000, before Congress enacted CIPA, almost 17% of public libraries used such software on at least some of their Internet terminals, and 7% had filters on all of them. A library can set such software to block categories of material, such as "Pornography" or "Violence." 201 F. Supp. 2d, at 428. When a patron tries to view a site that falls within such a category, a screen appears indicating that the site is blocked. Id., at 429. But a filter set to block pornography may sometimes block other sites that present neither obscene nor pornographic material, but that nevertheless trigger the filter. To minimize this problem, a library can set its software to prevent the blocking of material that falls into categories like "Education," "History," and "Medical." Id., at 428-429. A library may also add or delete specific sites from a blocking category, id., at 429, and anyone can ask companies that furnish filtering software to unblock particular sites, id., at 430.

Responding to this information, Congress enacted CIPA. It provides that a library may not receive E-rate or LSTA assistance unless it has "a policy of Internet safety for minors that includes the operation of a technology protection measure ... that protects against access" by all persons to "visual depictions" that constitute "obscen[ity]" or "child pornography," and that protects against access by minors to "visual depictions" that are "harmful to minors." 20 U. S. C. §§9134(f)(1)(A)(i) and (B)(i); 47 U. S. C. §§254(h)(6)(B)(i) and (C)(i). The statute defines a "[t]echnology protection measure" as "a specific technology that blocks or filters Internet access to material covered by" CIPA. §254(h)(7)(I). CIPA also permits the library to "disable" the filter "to enable access for bona fide research or other lawful purposes." 20 U. S. C. §9134(f)(3); 47 U. S. C. §254(h)(6)(D). Under the E-rate program, disabling is permitted "during use by an adult." §254(h)(6)(D). Under the LSTA program, disabling is permitted during use by any person. 20 U. S. C. §9134(f)(3).

Appellees are a group of libraries, library associations, library patrons, and Web site publishers, including the American Library Association (ALA) and the Multnomah County Public Library in Portland, Oregon (Multnomah). They sued the United States and the Government agencies and officials responsible for administering the E-rate and LSTA programs in District Court, challenging the constitutionality of CIPA's filtering provisions. A three-judge District Court convened pursuant to §1741(a) of CIPA, 114 Stat. 2763A-351, note following 20 U. S. C. §7001.

After a trial, the District Court ruled that CIPA was facially unconstitutional and enjoined the relevant agencies and officials from withholding federal assistance for failure to comply with CIPA. The District Court held that Congress had exceeded its authority under the Spending Clause, U. S. Const., Art. I, §8, cl. 1, because, in the court's view, "any public library that complies with CIPA's conditions will necessarily violate the First Amendment." 201 F. Supp. 2d, at 453. The court acknowledged that "generally the First Amendment subjects libraries' content-based decisions about which print materials to acquire for their collections to only rational [basis] review." Id., at 462. But it distinguished libraries' decisions to make certain Internet material inaccessible. "The central difference," the court stated, "is that by providing patrons with even filtered Internet access, the library permits patrons to receive speech on a virtually unlimited number of topics, from a virtually unlimited number of speakers, without attempting to restrict patrons' access to speech that the library, in the exercise of its professional judgment, determines to be particularly valuable." Ibid. Reasoning that "the provision of Internet access within a public library ... is for use by the public ... for expressive activity," the court analyzed such access as a "designated public forum." Id., at 457 (citation and internal quotation marks omitted). The District Court also likened Internet access in libraries to "traditional public fora ... such as sidewalks and parks" because it "promotes First Amendment values in an analogous manner." Id., at 466.

Based on both of these grounds, the court held that the filtering software contemplated by CIPA was a content-based restriction on access to a public forum, and was therefore subject to strict scrutiny. Ibid. Applying this standard, the District Court held that, although the Government has a compelling interest "in preventing the dissemination of obscenity, child pornography, or, in the case of minors, material harmful to minors," id., at 471, the use of software filters is not narrowly tailored to further those interests, id., at 479. We noted probable jurisdiction, 537 U. S. 1017 (2002), and now reverse.

Congress has wide latitude to attach conditions to the receipt of federal assistance in order to further its policy objectives. South Dakota v. Dole, 483 U. S. 203, 206 (1987). But Congress may not "induce" the recipient "to engage in activities that would themselves be unconstitutional." Id., at 210. To determine whether libraries would violate the First Amendment by employing the filtering software that CIPA requires, we must first examine the role of libraries in our society.

Public libraries pursue the worthy missions of facilitating learning and cultural enrichment. Appellee ALA's Library Bill of Rights states that libraries should provide "[b]ooks and other ... resources ... for the interest, information, and enlightenment of all people of the community the library serves." 201 F. Supp. 2d, at 420 (internal quotation marks omitted). To fulfill their traditional missions, public libraries must have broad discretion to decide what material to provide to their patrons. Although they seek to provide a wide array of information, their goal has never been to provide "universal coverage." Id., at 421. Instead, public libraries seek to provide materials "that would be of the greatest direct benefit or interest to the community." Ibid. To this end, libraries collect only those materials deemed to have "requisite and appropriate quality." Ibid. See W. Katz, Collection Development: The Selection of Materials for Libraries 6 (1980) ("The librarian's responsibility ... is to separate out the gold from the garbage, not to preserve everything"); F. Drury, Book Selection xi (1930) ("[I]t is the aim of the selector to give the public, not everything it wants, but the best that it will read or use to advantage"); App. 636 (Rebuttal Expert Report of Donald G. Davis, Jr.) ("A hypothetical collection of everything that has been produced is not only of dubious value, but actually detrimental to users trying to find what they want to find and really need").

We have held in two analogous contexts that the government has broad discretion to make content-based judgments in deciding what private speech to make available to the public. In Arkansas Ed. Television Comm'n v. Forbes, 523 U. S. 666, 672-673 (1998), we held that public forum principles do not generally apply to a public television station's editorial judgments regarding the private speech it presents to its viewers. "[B]road rights of access for outside speakers would be antithetical, as a general rule, to the discretion that stations and their editorial staff must exercise to fulfill their journalistic purpose and statutory obligations." Id., at 673. Recognizing a broad right of public access "would [also] risk implicating the courts in judgments that should be left to the exercise of journalistic discretion." Id., at 674.

Similarly, in National Endowment for Arts v. Finley, 524 U. S. 569 (1998), we upheld an art funding program that required the National Endowment for the Arts (NEA) to use content-based criteria in making funding decisions. We explained that "[a]ny content-based considerations that may be taken into account in the grant-making process are a consequence of the nature of arts funding." Id., at 585. In particular, "[t]he very assumption of the NEA is that grants will be awarded according to the 'artistic worth of competing applicants,' and absolute neutrality is simply inconceivable." Ibid. (some internal quotation marks omitted). We expressly declined to apply forum analysis, reasoning that it would conflict with "NEA's mandate ... to make esthetic judgments, and the inherently content-based 'excellence' threshold for NEA support." Id., at 586.

The principles underlying Forbes and Finley also apply to a public library's exercise of judgment in selecting the material it provides to its patrons. Just as forum analysis and heightened judicial scrutiny are incompatible with the role of public television stations and the role of the NEA, they are also incompatible with the discretion that public libraries must have to fulfill their traditional missions. Public library staffs necessarily consider content in making collection decisions and enjoy broad discretion in making them.

The public forum principles on which the District Court relied, 201 F. Supp. 2d, at 457-470, are out of place in the context of this case. Internet access in public libraries is neither a "traditional" nor a "designated" public forum. See Cornelius v. NAACP Legal Defense & Ed. Fund, Inc., 473 U. S. 788, 802 (1985) (describing types of forums). First, this resource--which did not exist until quite recently--has not "immemorially been held in trust for the use of the public and, time out of mind, ... been used for purposes of assembly, communication of thoughts between citizens, and discussing public questions." International Soc. for Krishna Consciousness, Inc. v. Lee, 505 U. S. 672, 679 (1992) (internal quotation marks omitted). We have "rejected the view that traditional public forum status extends beyond its historic confines." Forbes, supra, at 678. The doctrines surrounding traditional public forums may not be extended to situations where such history is lacking.

Nor does Internet access in a public library satisfy our definition of a "designated public forum." To create such a forum, the government must make an affirmative choice to open up its property for use as a public forum. Cornelius, supra, at 802-803; Perry Ed. Assn. v. Perry Local Educators' Assn., 460 U. S. 37, 45 (1983). "The government does not create a public forum by inaction or by permitting limited discourse, but only by intentionally opening a non-traditional forum for public discourse." Cornelius, supra, at 802. The District Court likened public libraries' Internet terminals to the forum at issue in Rosenberger v. Rector and Visitors of Univ. of Va., 515 U. S. 819 (1995). 201 F. Supp. 2d, at 465. In Rosenberger, we considered the "Student Activity Fund" established by the University of Virginia that subsidized all manner of student publications except those based on religion. We held that the fund had created a limited public forum by giving public money to student groups who wished to publish, and therefore could not discriminate on the basis of viewpoint.

The situation here is very different. A public library does not acquire Internet terminals in order to create a public forum for Web publishers to express themselves, any more than it collects books in order to provide a public forum for the authors of books to speak. It provides Internet access, not to "encourage a diversity of views from private speakers," Rosenberger, supra, at 834, but for the same reasons it offers other library resources: to facilitate research, learning, and recreational pursuits by furnishing materials of requisite and appropriate quality. See Cornelius, supra, at 805 (noting, in upholding limits on participation in the Combined Federal Campaign (CFC), that "[t]he Government did not create the CFC for purposes of providing a forum for expressive activity"). As Congress recognized, "[t]he Internet is simply another method for making information available in a school or library." S. Rep. No. 106-141, p. 7 (1999). It is "no more than a technological extension of the book stack." Ibid.

The District Court disagreed because, whereas a library reviews and affirmatively chooses to acquire every book in its collection, it does not review every Web site that it makes available. 201 F. Supp. 2d, at 462-463. Based on this distinction, the court reasoned that a public library enjoys less discretion in deciding which Internet materials to make available than in making book selections. Ibid. We do not find this distinction constitutionally relevant. A library's failure to make quality-based judgments about all the material it furnishes from the Web does not somehow taint the judgments it does make. A library's need to exercise judgment in making collection decisions depends on its traditional role in identifying suitable and worthwhile material; it is no less entitled to play that role when it collects material from the Internet than when it collects material from any other source. Most libraries already exclude pornography from their print collections because they deem it inappropriate for inclusion. We do not subject these decisions to heightened scrutiny; it would make little sense to treat libraries' judgments to block online pornography any differently, when these judgments are made for just the same reason.

Moreover, because of the vast quantity of material on the Internet and the rapid pace at which it changes, libraries cannot possibly segregate, item by item, all the Internet material that is appropriate for inclusion from all that is not. While a library could limit its Internet collection to just those sites it found worthwhile, it could do so only at the cost of excluding an enormous amount of valuable information that it lacks the capacity to review. Given that tradeoff, it is entirely reasonable for public libraries to reject that approach and instead exclude certain categories of content, without making individualized judgments that everything they do make available has requisite and appropriate quality.

Like the District Court, the dissents fault the tendency of filtering software to "overblock"--that is, to erroneously block access to constitutionally protected speech that falls outside the categories that software users intend to block. See post, at 1-3 (opinion of Stevens, J.); post, at 3-4 (opinion of Souter, J.). Due to the software's limitations, "[m]any erroneously blocked [Web] pages contain content that is completely innocuous for both adults and minors, and that no rational person could conclude matches the filtering companies' category definitions, such as 'pornography' or 'sex.' " 201 F. Supp. 2d, at 449. Assuming that such erroneous blocking presents constitutional difficulties, any such concerns are dispelled by the ease with which patrons may have the filtering software disabled. When a patron encounters a blocked site, he need only ask a librarian to unblock it or (at least in the case of adults) disable the filter. As the District Court found, libraries have the capacity to permanently unblock any erroneously blocked site, id., at 429, and the Solicitor General stated at oral argument that a "library may ... eliminate the filtering with respect to specific sites ... at the request of a patron." Tr. of Oral Arg. 4. With respect to adults, CIPA also expressly authorizes library officials to "disable" a filter altogether "to enable access for bona fide research or other lawful purposes." 20 U. S. C. §9134(f)(3) (disabling permitted for both adults and minors); 47 U. S. C. §254(h)(6)(D) (disabling permitted for adults). The Solicitor General confirmed that a "librarian can, in response to a request from a patron, unblock the filtering mechanism altogether," Tr. of Oral Arg. 11, and further explained that a patron would not "have to explain ... why he was asking a site to be unblocked or the filtering to be disabled," id., at 4. The District Court viewed unblocking and disabling as inadequate because some patrons may be too embarrassed to request them. 201 F. Supp. 2d, at 411. But the Constitution does not guarantee the right to acquire information at a public library without any risk of embarrassment.

Appellees urge us to affirm the District Court's judgment on the alternative ground that CIPA imposes an unconstitutional condition on the receipt of federal assistance. Under this doctrine, "the government 'may not deny a benefit to a person on a basis that infringes his constitutionally protected ... freedom of speech' even if he has no entitlement to that benefit." Board of Comm'rs, Wabaunsee Cty. v. Umbehr, 518 U. S. 668, 674 (1996) (quoting Perry v. Sindermann, 408 U. S. 593, 597 (1972)). Appellees argue that CIPA imposes an unconstitutional condition on libraries that receive E-rate and LSTA subsidies by requiring them, as a condition on their receipt of federal funds, to surrender their First Amendment right to provide the public with access to constitutionally protected speech. The Government counters that this claim fails because Government entities do not have First Amendment rights. See Columbia Broadcasting System, Inc. v. Democratic National Committee, 412 U. S. 94, 139 (1973) (Stewart, J., concurring) ("The First Amendment protects the press from governmental interference; it confers no analogous protection on the government"); id., at 139, n. 7 (" 'The purpose of the First Amendment is to protect private expression' " (quoting T. Emerson, The System of Freedom of Expression 700 (1970))).

We need not decide this question because, even assuming that appellees may assert an "unconstitutional conditions" claim, this claim would fail on the merits. Within broad limits, "when the Government appropriates public funds to establish a program it is entitled to define the limits of that program." Rust v. Sullivan, 500 U. S. 173, 194 (1991). In Rust, Congress had appropriated federal funding for family planning services and forbidden the use of such funds in programs that provided abortion counseling. Id., at 178. Recipients of these funds challenged this restriction, arguing that it impermissibly conditioned the receipt of a benefit on the relinquishment of their constitutional right to engage in abortion counseling. Id., at 196. We rejected that claim, recognizing that "the Government [was] not denying a benefit to anyone, but [was] instead simply insisting that public funds be spent for the purposes for which they were authorized." Ibid.

The same is true here. The E-rate and LSTA programs were intended to help public libraries fulfill their traditional role of obtaining material of requisite and appropriate quality for educational and informational purposes. Congress may certainly insist that these "public funds be spent for the purposes for which they were authorized." Ibid. Especially because public libraries have traditionally excluded pornographic material from their other collections, Congress could reasonably impose a parallel limitation on its Internet assistance programs. As the use of filtering software helps to carry out these programs, it is a permissible condition under Rust.

Justice Stevens asserts the premise that "[a] federal statute penalizing a library for failing to install filtering software on every one of its Internet-accessible computers would unquestionably violate [the First] Amendment." Post, at 8. See also post, at 12. But--assuming again that public libraries have First Amendment rights--CIPA does not "penalize" libraries that choose not to install such software, or deny them the right to provide their patrons with unfiltered Internet access. Rather, CIPA simply reflects Congress' decision not to subsidize their doing so. To the extent that libraries wish to offer unfiltered access, they are free to do so without federal assistance. " 'A refusal to fund protected activity, without more, cannot be equated with the imposition of a 'penalty' on that activity.' " Rust, supra, at 193.

Appellees mistakenly contend, in reliance on Legal Services Corporation v. Velazquez, 531 U. S. 533 (2001), that CIPA's filtering conditions "[d]istor[t] the [u]sual [f]unctioning of [p]ublic [l]ibraries." Brief for Appellees ALA et al. 40 (citing Velazquez, supra, at 543); Brief for Appellees Multnomah et al. 47-48 (same). In Velazquez, the Court concluded that a Government program of furnishing legal aid to the indigent differed from the program in Rust "[i]n th[e] vital respect" that the role of lawyers who represent clients in welfare disputes is to advocate against the Government, and there was thus an assumption that counsel would be free of state control. 531 U. S., at 542-543. The Court concluded that the restriction on advocacy in such welfare disputes would distort the usual functioning of the legal profession and the federal and state courts before which the lawyers appeared. Public libraries, by contrast, have no comparable role that pits them against the Government, and there is no comparable assumption that they must be free of any conditions that their benefactors might attach to the use of donated funds or other assistance.

Because public libraries' use of Internet filtering software does not violate their patrons' First Amendment rights, CIPA does not induce libraries to violate the Constitution, and is a valid exercise of Congress' spending power. Nor does CIPA impose an unconstitutional condition on public libraries. Therefore, the judgment of the District Court for the Eastern District of Pennsylvania is Reversed.

Justice Kennedy, concurring in the judgment.

If, on the request of an adult user, a librarian will unblock filtered material or disable the Internet software filter without significant delay, there is little to this case. The Government represents this is indeed the fact.

The District Court, in its "Preliminary Statement," did say that "the unblocking may take days, and may be unavailable, especially in branch libraries, which are often less well staffed than main libraries." 201 F. Supp. 2d 401, 411 (ED Pa. 2002). See also post, at 2 (Souter, J., dissenting). That statement, however, does not appear to be a specific finding. It was not the basis for the District Court's decision in any event, as the court assumed that "the disabling provisions permit public libraries to allow a patron access to any speech that is constitutionally protected with respect to that patron." Id., at 485-486.

If some libraries do not have the capacity to unblock specific Web sites or to disable the filter or if it is shown that an adult user's election to view constitutionally protected Internet material is burdened in some other substantial way, that would be the subject for an as-applied challenge, not the facial challenge made in this case. See post, at 5-6 (Breyer, J., concurring in judgment).

There are, of course, substantial Government interests at stake here. The interest in protecting young library users from material inappropriate for minors is legitimate, and even compelling, as all Members of the Court appear to agree. Given this interest, and the failure to show that the ability of adult library users to have access to the material is burdened in any significant degree, the statute is not unconstitutional on its face. For these reasons, I concur in the judgment of the Court.

Justice Breyer, concurring in the judgment.

…I would apply a form of heightened scrutiny, examining the statutory requirements in question with special care. The Act directly restricts the public's receipt of information. See Stanley v. Georgia, 394 U. S. 557, 564 (1969) ("[T]he Constitution protects the right to receive information and ideas"); Reno v. American Civil Liberties Union, 521 U. S. 844, 874 (1997). And it does so through limitations imposed by outside bodies (here Congress) upon two critically important sources of information--the Internet as accessed via public libraries. For that reason, we should not examine the statute's constitutionality as if it raised no special First Amendment concern--as if, like tax or economic regulation, the First Amendment demanded only a "rational basis" for imposing a restriction. Nor should we accept the Government's suggestion that a presumption in favor of the statute's constitutionality applies.

At the same time, in my view, the First Amendment does not here demand application of the most limiting constitutional approach--that of "strict scrutiny." The statutory restriction in question is, in essence, a kind of "selection" restriction (a kind of editing). It affects the kinds and amount of materials that the library can present to its patrons. See ante, at 6-7, 10-11 (plurality opinion). And libraries often properly engage in the selection of materials, either as a matter of necessity (i.e., due to the scarcity of resources) or by design (i.e., in accordance with collection development policies). See, e.g., 201 F. Supp. 2d, at 408-409, 421, 462; ante, at 6-7, 11 (plurality opinion). To apply "strict scrutiny" to the "selection" of a library's collection (whether carried out by public libraries themselves or by other community bodies with a traditional legal right to engage in that function) would unreasonably interfere with the discretion necessary to create, maintain, or select a library's "collection" (broadly defined to include all the information the library makes available). Cf. Miami Herald Publishing Co. v. Tornillo, 418 U. S. 241, 256-258 (1974) (protecting newspaper's exercise of editorial control and judgment). That is to say, "strict scrutiny" implies too limiting and rigid a test for me to believe that the First Amendment requires it in this context.

Instead, I would examine the constitutionality of the Act's restrictions here as the Court has examined speech-related restrictions in other contexts where circumstances call for heightened, but not "strict," scrutiny--where, for example, complex, competing constitutional interests are potentially at issue or speech-related harm is potentially justified by unusually strong governmental interests. Typically the key question in such instances is one of proper fit. See, e.g., Board of Trustees of State Univ. of N. Y. v. Fox, 492 U. S. 469 (1989); Denver Area Ed. Telecommunications Consortium, Inc. v. FCC, 518 U. S. 727, 740-747 (1996) (plurality opinion); Turner Broadcasting System, Inc. v. FCC, 520 U. S. 180, 227 (1997) (Breyer, J., concurring in part); Red Lion Broadcasting Co. v. FCC, 395 U. S. 367, 389-390 (1969).

In such cases the Court has asked whether the harm to speech-related interests is disproportionate in light of both the justifications and the potential alternatives. It has considered the legitimacy of the statute's objective, the extent to which the statute will tend to achieve that objective, whether there are other, less restrictive ways of achieving that objective, and ultimately whether the statute works speech-related harm that, in relation to that objective, is out of proportion. In Fox, supra, at 480, for example, the Court stated:

"What our decisions require is a 'fit' between the legislature's ends and the means chosen to accomplish those ends--a fit that is not necessarily perfect, but reasonable; that represents not necessarily the single best disposition but one whose scope is in proportion to the interest served; that employs not necessarily the least restrictive means but, as we have put it in the other contexts ..., a means narrowly tailored to achieve the desired objective." (Internal quotation marks and citations omitted.)

Cf., e.g., Central Hudson Gas & Elec. Corp. v. Public Serv. Comm'n of N. Y., 447 U. S. 557, 564 (1980); United States v. O'Brien, 391 U. S. 367, 377 (1968); Clark v. Community for Creative Non-Violence, 468 U. S. 288, 293 (1984). This approach does not substitute a form of "balancing" for less flexible, though more speech-protective, forms of "strict scrutiny." Rather, it supplements the latter with an approach that is more flexible but nonetheless provides the legislature with less than ordinary leeway in light of the fact that constitutionally protected expression is at issue. Cf. Fox, supra, at 480-481; Virginia Bd. of Pharmacy v. Virginia Citizens Consumer Council, Inc., 425 U. S. 748, 769-773 (1976).

The Act's restrictions satisfy these constitutional demands. The Act seeks to restrict access to obscenity, child pornography, and, in respect to access by minors, material that is comparably harmful. These objectives are "legitimate," and indeed often "compelling." See, e.g., Miller v. California, 413 U. S. 15, 18 (1973) (interest in prohibiting access to obscene material is "legitimate"); Reno, supra, at 869-870 (interest in "shielding" minors from exposure to indecent material is " 'compelling' "); New York v. Ferber, 458 U. S. 747, 756-757 (1982) (same). As the District Court found, software filters "provide a relatively cheap and effective" means of furthering these goals. 201 F. Supp. 2d, at 448. Due to present technological limitations, however, the software filters both "overblock," screening out some perfectly legitimate material, and "underblock," allowing some obscene material to escape detection by the filter. Id., at 448-449. See ante, at 11-12 (plurality opinion). But no one has presented any clearly superior or better fitting alternatives. See ante, at 10, n. 3 (plurality opinion).

At the same time, the Act contains an important exception that limits the speech-related harm that "overblocking" might cause. As the plurality points out, the Act allows libraries to permit any adult patron access to an "overblocked" Web site; the adult patron need only ask a librarian to unblock the specific Web site or, alternatively, ask the librarian, "Please disable the entire filter." See ante, at 12; 20 U. S. C. §9134(f)(3) (permitting library officials to "disable a technology protection measure ... to enable access for bona fide research or other lawful purposes"); 47 U. S. C. §254(h)(6)(D) (same).

The Act does impose upon the patron the burden of making this request. But it is difficult to see how that burden (or any delay associated with compliance) could prove more onerous than traditional library practices associated with segregating library materials in, say, closed stacks, or with interlibrary lending practices that require patrons to make requests that are not anonymous and to wait while the librarian obtains the desired materials from elsewhere. Perhaps local library rules or practices could further restrict the ability of patrons to obtain "overblocked" Internet material. See, e.g., In re Federal-State Joint Board on Universal Service: Children's Internet Protection Act, 16 FCC Rcd. 8182, 8183, ¶2, 8204, ¶53 (2001) (leaving determinations regarding the appropriateness of compliant Internet safety policies and their disabling to local communities). But we are not now considering any such local practices. We here consider only a facial challenge to the Act itself.

Given the comparatively small burden that the Act imposes upon the library patron seeking legitimate Internet materials, I cannot say that any speech-related harm that the Act may cause is disproportionate when considered in relation to the Act's legitimate objectives. I therefore agree with the plurality that the statute does not violate the First Amendment, and I concur in the judgment.

Justice Stevens, dissenting.

"To fulfill their traditional missions, public libraries must have broad discretion to decide what material to provide their patrons." Ante, at 6. Accordingly, I agree with the plurality that it is neither inappropriate nor unconstitutional for a local library to experiment with filtering software as a means of curtailing children's access to Internet Web sites displaying sexually explicit images. I also agree with the plurality that the 7% of public libraries that decided to use such software on all of their Internet terminals in 2000 did not act unlawfully. Ante, at 3. Whether it is constitutional for the Congress of the United States to impose that requirement on the other 93%, however, raises a vastly different question. Rather than allowing local decisionmakers to tailor their responses to local problems, the Children's Internet Protection Act (CIPA) operates as a blunt nationwide restraint on adult access to "an enormous amount of valuable information" that individual librarians cannot possibly review. Ante, at 11. Most of that information is constitutionally protected speech. In my view, this restraint is unconstitutional.

I

The unchallenged findings of fact made by the District Court reveal fundamental defects in the filtering software that is now available or that will be available in the foreseeable future. Because the software relies on key words or phrases to block undesirable sites, it does not have the capacity to exclude a precisely defined category of images. As the District Court explained:

"[T]he search engines that software companies use for harvesting are able to search text only, not images. This is of critical importance, because CIPA, by its own terms, covers only 'visual depictions.' 20 U. S. C. §9134(f)(1)(A)(i); 47 U. S. C. §254(h)(5)(B)(i). Image recognition technology is immature, ineffective, and unlikely to improve substantially in the near future. None of the filtering software companies deposed in this case employs image recognition technology when harvesting or categorizing URLs. Due to the reliance on automated text analysis and the absence of image recognition technology, a Web page with sexually explicit images and no text cannot be harvested using a search engine. This problem is complicated by the fact that Web site publishers may use image files rather than text to represent words, i.e., they may use a file that computers understand to be a picture, like a photograph of a printed word, rather than regular text, making automated review of their textual content impossible. For example, if the Playboy Web site displays its name using a logo rather than regular text, a search engine would not see or recognize the Playboy name in that logo." 201 F. Supp. 2d 401, 431-432 (ED Pa. 2002).

Given the quantity and ever-changing character of Web sites offering free sexually explicit material, it is inevitable that a substantial amount of such material will never be blocked. Because of this "underblocking," the statute will provide parents with a false sense of security without really solving the problem that motivated its enactment. Conversely, the software's reliance on words to identify undesirable sites necessarily results in the blocking of thousands of pages that "contain content that is completely innocuous for both adults and minors, and that no rational person could conclude matches the filtering companies' category definitions, such as 'pornography' or 'sex.' " Id., at 449. In my judgment, a statutory blunderbuss that mandates this vast amount of "overblocking" abridges the freedom of speech protected by the First Amendment.

The effect of the overblocking is the functional equivalent of a host of individual decisions excluding hundreds of thousands of individual constitutionally protected messages from Internet terminals located in public libraries throughout the Nation. Neither the interest in suppressing unlawful speech nor the interest in protecting children from access to harmful materials justifies this overly broad restriction on adult access to protected speech. "The Government may not suppress lawful speech as the means to suppress unlawful speech." Ashcroft v. Free Speech Coalition, 535 U. S. 234, 255 (2002).

Although CIPA does not permit any experimentation, the District Court expressly found that a variety of less restrictive alternatives are available at the local level:

"[L]ess restrictive alternatives exist that further the government's legitimate interest in preventing the dissemination of obscenity, child pornography, and material harmful to minors, and in preventing patrons from being unwillingly exposed to patently offensive, sexually explicit content. To prevent patrons from accessing visual depictions that are obscene and child pornography, public libraries may enforce Internet use policies that make clear to patrons that the library's Internet terminals may not be used to access illegal speech. Libraries may then impose penalties on patrons who violate these policies, ranging from a warning to notification of law enforcement, in the appropriate case. Less restrictive alternatives to filtering that further libraries' interest in preventing minors from exposure to visual depictions that are harmful to minors include requiring parental consent to or presence during unfiltered access, or restricting minors' unfiltered access to terminals within view of library staff. Finally, optional filtering, privacy screens, recessed monitors, and placement of unfiltered Internet terminals outside of sight-lines provide less restrictive alternatives for libraries to prevent patrons from being unwillingly exposed to sexually explicit content on the Internet." 201 F. Supp. 2d, at 410.

Those findings are consistent with scholarly comment on the issue arguing that local decisions tailored to local circumstances are more appropriate than a mandate from Congress. The plurality does not reject any of those findings. Instead, "[a]ssuming that such erroneous blocking presents constitutional difficulties," it relies on the Solicitor General's assurance that the statute permits individual librarians to disable filtering mechanisms whenever a patron so requests. Ante, at 12. In my judgment, that assurance does not cure the constitutional infirmity in the statute.

Until a blocked site or group of sites is unblocked, a patron is unlikely to know what is being hidden and therefore whether there is any point in asking for the filter to be removed. It is as though the statute required a significant part of every library's reading materials to be kept in unmarked, locked rooms or cabinets, which could be opened only in response to specific requests. Some curious readers would in time obtain access to the hidden materials, but many would not. Inevitably, the interest of the authors of those works in reaching the widest possible audience would be abridged. Moreover, because the procedures that different libraries are likely to adopt to respond to unblocking requests will no doubt vary, it is impossible to measure the aggregate effect of the statute on patrons' access to blocked sites. Unless we assume that the statute is a mere symbolic gesture, we must conclude that it will create a significant prior restraint on adult access to protected speech. A law that prohibits reading without official consent, like a law that prohibits speaking without consent, "constitutes a dramatic departure from our national heritage and constitutional tradition." Watchtower Bible & Tract Soc. of N. Y., Inc. v. Village of Stratton, 536 U. S. 150, 166 (2002).

Justice Souter, with whom Justice Ginsburg joins, dissenting.

I agree in the main with Justice Stevens, ante, at 6-12 (dissenting opinion), that the blocking requirements of the Children's Internet Protection Act, 20 U. S. C. §§9134(f)(1)(A)(i) and (B)(i); 47 U. S. C. §§254(h)(6)(B)(i) and (C)(i), impose an unconstitutional condition on the Government's subsidies to local libraries for providing access to the Internet. I also agree with the library appellees on a further reason to hold the blocking rule invalid in the exercise of the spending power under Article I, §8: the rule mandates action by recipient libraries that would violate the First Amendment's guarantee of free speech if the libraries took that action entirely on their own. I respectfully dissent on this further ground.

Like the other Members of the Court, I have no doubt about the legitimacy of governmental efforts to put a barrier between child patrons of public libraries and the raw offerings on the Internet otherwise available to them there, and if the only First Amendment interests raised here were those of children, I would uphold application of the Act. We have said that the governmental interest in "shielding" children from exposure to indecent material is "compelling," Reno v. American Civil Liberties Union, 521 U. S. 844, 869-870 (1997), and I do not think that the awkwardness a child might feel on asking for an unblocked terminal is any such burden as to affect constitutionality.

...

[However, we] have to examine the statute on the understanding that the restrictions on adult Internet access have no justification in the object of protecting children. Children could be restricted to blocked terminals, leaving other unblocked terminals in areas restricted to adults and screened from casual glances. And of course the statute could simply have provided for unblocking at adult request, with no questions asked. The statute could, in other words, have protected children without blocking access for adults or subjecting adults to anything more than minimal inconvenience, just the way (the record shows) many librarians had been dealing with obscenity and indecency before imposition of the federal conditions. See id., at 422-427. Instead, the Government's funding conditions engage in overkill to a degree illustrated by their refusal to trust even a library's staff with an unblocked terminal, one to which the adult public itself has no access. See id., at 413 (quoting 16 FCC Rcd., at 8196, ¶30).

The question for me, then, is whether a local library could itself constitutionally impose these restrictions on the content otherwise available to an adult patron through an Internet connection, at a library terminal provided for public use. The answer is no. A library that chose to block an adult's Internet access to material harmful to children (and whatever else the undiscriminating filter might interrupt) would be imposing a content-based restriction on communication of material in the library's control that an adult could otherwise lawfully see. This would simply be censorship. True, the censorship would not necessarily extend to every adult, for an intending Internet user might convince a librarian that he was a true researcher or had a "lawful purpose" to obtain everything the library's terminal could provide. But as to those who did not qualify for discretionary unblocking, the censorship would be complete and, like all censorship by an agency of the Government, presumptively invalid owing to strict scrutiny in implementing the Free Speech Clause of the First Amendment. "The policy of the First Amendment favors dissemination of information and opinion, and the guarantees of freedom of speech and press were not designed to prevent the censorship of the press merely, but any action of the government by means of which it might prevent such free and general discussion of public matters as seems absolutely essential." Bigelow v. Virginia, 421 U. S. 809, 829 (1975) (internal quotation marks and brackets omitted).