Internet Filtering: An Interpretation of the Library Bill of Rights

Many schools and libraries use online content filters for different reasons. The most common is compliance with the Children’s Internet Protection Act (CIPA), a condition of receiving federal funding or discounts through:

  • the Library Services and Technology Act (LSTA);
  • Title III of the Elementary and Secondary Education Act; or
  • the Universal Service discount program (E-rate).

Some libraries and schools also follow state filtering rules to qualify for state funding. Many believe that filtered access is better than no access at all. The American Library Association (ALA) does not recommend the use of internet filters. When filters are used, however, libraries should adopt policies and practices to reduce their negative impacts.

CIPA requires public libraries and schools that receive E-rate discounts to use content filters. These filters must block two categories of images that are not protected by the First Amendment: obscene images and child sexual abuse material. The Supreme Court has ruled that these types of content do not have constitutional protection.

CIPA also requires libraries and schools to block a third category: images that courts deem “harmful to minors.” These images are legal for adults but not for minors. CIPA does not require blocking any other constitutionally protected speech or images.

Research shows that filters often block too much or too little content. They frequently prevent access to a wide range of legally protected speech for multiple reasons. Filtering technology often struggles to interpret and categorize complex human communication, whether in text, images, or video. Filters can be ineffective in handling new types of content because they lag behind the constantly changing and growing internet.[i] In addition, administrators may configure filter settings in ways that go beyond the legal requirements and restrict access to legally protected speech.[ii]

Using filters shifts the control of important library and school resources to outside companies. These third parties can influence basic library functions without accountability.[iii] Also, library workers and educators often find that filters are unreliable and tech-savvy users can easily bypass them. Some filters allow local library workers to modify settings. Other filters require approval from higher-level administrators.

Most content filters are designed for larger markets beyond libraries and schools. They often block broad categories of legally protected speech, such as:

  • objectionable language;
  • violence;
  • unpopular or controversial opinions; and
  • entire online services like email and social media.

Many filters operate on an “opt-out” model, meaning they are turned on by default. These filters often default to the strictest settings, which can only be changed by an administrator.

The Supreme Court upheld CIPA’s requirements for public libraries on the grounds that adults could ask for the filters to be turned off.[iv] In practice, however, this often does not happen. Some libraries are unwilling or unable to unblock content when requested, especially when system administration is outside the library’s control or when staff are unsure of what content is protected for adults. This can lead to long delays in unblocking mistakenly filtered content.

The same issue occurs in schools. Delays in unblocking content function as a form of censorship, because most users cannot wait hours or days for access. The problem is compounded by the filtering industry’s secrecy about how content is categorized and blocked; these methods are often protected as trade secrets.

Filters also raise privacy concerns. Users must identify themselves and their interests when asking for specific websites to be unblocked. This can discourage both adults and minors from accessing information on personal or controversial topics. No one should have to give up their privacy to access information that should be freely available.

In schools, the CIPA rules are often misinterpreted. This leads to:

  • overly strict filtering that blocks legally protected content;
  • restrictions that prevent educators from using the internet for teaching; and
  • blocks that stop students from accessing content for assignments and personal interests.

Students may also be blocked from using interactive websites and social media for learning. This limits opportunities to create and share original content like documents, videos, and music.

These issues arise when library workers and educators are not involved in setting filtering policies and procedures. Minors, educators, and library workers should not be blocked from legally protected content just because some may find it objectionable. Anyone should be able to request that non-CIPA-required content be unblocked.

CIPA-mandated content filtering has several major impacts on schools and libraries:

  1. It widens the digital divide between those who can afford personal internet access and those who must rely on filtered, publicly funded access.
  2. Minority viewpoints, religions, or controversial topics are often labeled as objectionable or offensive and blocked.
  3. Filters can reinforce bias and discrimination by restricting access to certain content.

When filters block too much content, people without other internet access are left with only what the unreliable filters allow.

The negative effects of internet content filters are well documented. Because of this, the ALA cannot recommend using filters.[v] However, the ALA recognizes that many libraries and schools rely on federal or state funding for internet access and may be required to use them.

Libraries and schools that use filters should have policies to reduce their negative impact. They should make it easy for users to request that blocked content be unblocked. These requests should be fulfilled with minimal delay and with respect for user privacy.

For further guidance, see “Guidelines to Minimize the Negative Effects of Internet Content Filters on Intellectual Freedom.”[vi]

Notes

[i] Dhanaraj Thakur and Emma Llansó, “Do You See What I See? Capabilities and Limits of Automated Multimedia Content Analysis,” Center for Democracy & Technology, May 2021.

[ii] Elizabeth Laird, Madeleine Dwyer, and Hugh Grant-Chapman, “Off Task: EdTech Threats to Student Privacy and Equity in the Age of AI,” Center for Democracy & Technology, September 2023, https://cdt.org/wp-content/uploads/2023/09/091923-CDT-Off-Task-web.pdf.

[iii] Kristen Batch, “Fencing Out Knowledge: Impacts of the Children’s Internet Protection Act 10 Years Later,” OITP and OIE Policy Brief No. 5, American Library Association, June 2014, https://www.ala.org/sites/default/files/advocacy/content/intfreedom/censorshipfirstamendmentissues/FINALCIPA_Report.pdf.

[iv] United States v. American Library Association, Inc., 539 U.S. 194 (2003).

[v] “Resolution on the Use of Filtering Software in Libraries,” adopted 1997 by the ALA Council; and “Resolution on Opposition to Federally Mandated Internet Filtering,” adopted 2001 by the ALA Council.

[vi] “Guidelines to Minimize the Negative Effects of Internet Content Filters on Intellectual Freedom,” https://www.ala.org/advocacy/intfreedom/filtering/filtering_guidelines.


Adopted June 30, 2015, by the ALA Council; amended June 29, 2025.