When the Supreme Court ruled last month that public libraries must install anti-pornography filtering software on their computers as a condition of federal funding, child safety advocates called it a landmark decision for the rights of children. The decision also applies to public schools, although that portion of the law wasn’t challenged.
For Congress, it was a breakthrough decision after two previous attempts to protect children from online smut were rejected by the courts as unconstitutional.
For free speech advocates, it was a historic decision of another sort: the Supreme Court had never previously upheld an effort to regulate content on the Internet.
Even worse, according to the American Library Association (ALA) and the American Civil Liberties Union (ACLU), the decision was based on the faulty premise that filters work.
The Supreme Court ruling overturned a federal appeals decision that rejected the Children’s Internet Protection Act (CIPA) as a violation of the First Amendment. The lower court ruled that the use of filtering software in public libraries blocked access to Web sites that contained substantial amounts of protected speech.
In other words, filters don’t always work.
Although CIPA specifically stipulates that adults can ask a librarian to turn off the anti-porn filters, the lower court said library patrons might be too embarrassed to ask or unwilling to give up their anonymity.
The Supreme Court, though, ultimately ruled the government’s interest in protecting children from exposure to sexually inappropriate material outweighed the rights of adult library patrons.
The Court did agree with the lower court that filtering software, at best, is problematic.
“Findings of fact clearly show that filtering companies are not following legal definitions of ‘harmful to minors’ and ‘obscenity,’” the ALA said in a statement following the Supreme Court ruling.
The idea of filters sounds ideal in a Star Trek kind of way: Computer, block sexually inappropriate material. Unfortunately, humans have to program the computer to recognize what is, or isn’t, appropriate material. Most filters block pages by weeding out sites containing a predetermined set of words.
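How easily that goes wrong is visible even in a toy version. The sketch below is a hypothetical Python example, not any vendor’s actual code: it blocks a page whenever the page contains a word from a fixed list, and the blocklist and sample pages are invented for illustration.

```python
# A deliberately simplified sketch of the keyword-matching approach described
# above. The blocklist and sample pages are hypothetical; commercial filters
# use much larger lists plus extra heuristics, but the failure modes are the same.
BLOCKED_WORDS = {"sex", "porn", "nude", "breast"}

def should_block(page_text: str) -> bool:
    """Block a page if any word on the blocklist appears in its text."""
    words = {w.strip(".,;:!?\"'()").lower() for w in page_text.split()}
    return not BLOCKED_WORDS.isdisjoint(words)

print(should_block("Free nude photo galleries"))           # True: the intended block
print(should_block("Breast cancer screening guidelines"))  # True: a health page overblocked
print(should_block("Explicit material, none of the listed words"))  # False: underblocked
```

The same list that catches pornography also catches protected health and educational pages, while explicit pages that avoid the listed words pass straight through.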
Justice Potter Stewart once famously said he couldn’t define obscenity, but he knew it when he saw it. Since then, the courts have actually defined what obscenity is. Sort of.
For material to meet obscenity standards and lose its First Amendment protection, three tests have to be met: (1) the average person, applying contemporary community standards, would find that the work, taken as a whole, appeals to the prurient interest; (2) the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state law; and (3) the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.
When minors are the viewers, a less demanding standard may be applied.
Child pornography, which cannot be legally viewed by any person, is usually defined as depictions of children involved in “sexually explicit conduct.” Virtual child pornography — computer-generated images of children engaged in sexually explicit acts — is protected by the First Amendment.
All in all, that’s a lot to ask a piece of software to filter.
According to the Electronic Frontier Foundation, “the immense size and variability of the Internet raises concerns as to whether it is possible to limit Internet blocking only to Web pages containing legally ‘blockable’ content.”
A recent study by the EFF and the Online Policy Group examined the effects of N2H2’s and SurfControl’s filtering software, two popular products on the market. The study involved Internet searches of text taken directly from the state-mandated curricula of California, Massachusetts and North Carolina.
Testing nearly a million Web pages, the study found that for every page blocked as advertised, the software blocked one or more pages inappropriately either because the pages were miscategorized or because the pages, while correctly categorized, did not merit blocking.
In the case of the block codes used for compliance with CIPA, the blocking software miscategorized 78 to 85 percent of the sample.
The study concluded that blocking software either overblocks or underblocks. The software either blocks access to many pages protected by the First Amendment or does not block pages likely to be prohibited under CIPA.
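The overblock/underblock tally amounts to a simple comparison between what the filter blocked and what CIPA would arguably permit it to block. The sketch below is a hypothetical illustration of that bookkeeping; the sample data is invented, not taken from the study.

```python
# A rough sketch of an overblock/underblock tally. Each entry pairs what the
# filter did with whether the page was legally blockable under CIPA.
# The data below is invented for illustration only.
sample = [
    (True,  True),   # blocked and blockable: correct block
    (True,  False),  # blocked but protected speech: overblock
    (True,  False),  # overblock
    (False, True),   # not blocked but blockable: underblock
    (False, False),  # not blocked and not blockable: correct pass
]

correct_blocks = sum(1 for blocked, blockable in sample if blocked and blockable)
overblocks     = sum(1 for blocked, blockable in sample if blocked and not blockable)
underblocks    = sum(1 for blocked, blockable in sample if not blocked and blockable)

print(f"correct: {correct_blocks}, overblocked: {overblocks}, underblocked: {underblocks}")
# In this invented sample there are two overblocks for every correct block,
# echoing the study's finding of one or more wrongly blocked pages for every
# page blocked as advertised.
```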
A Kaiser Family Foundation study conducted last year said that Internet filters most frequently used by schools and libraries can effectively block pornography without significantly impeding access to online health information, but only if the filters are set at the lowest, least restrictive levels.
As filters are set at higher levels, they block access to a substantial amount of health information with only a minimal increase in blocked pornographic content, the report stated.
The ALA is dealing with the filtering issue by calling for full disclosure of what sites filtering companies are blocking, who is deciding what is filtered and what criteria are being used.
The group hopes to obtain this information, evaluate it and share the data with the libraries now being forced to forgo funds or choose faulty filters. The ALA believes library users must be able to see what sites are being blocked and, if needed, be able to request that the filter be disabled with the least intrusion into their privacy and the least burden on library service.
If filters are faulty, are there alternatives? Spyware (software or hardware that records keystrokes and the Web sites a user visits) is sometimes mentioned as an alternative to filters, but its potential for misuse has made librarians wary.
Libraries could certainly use spyware to determine who is viewing inappropriate or indecent material, in addition to tracking illegal activities such as cyberstalking, hacker attacks or the use of stolen credit cards. However, unlike filters, spyware is not pre-emptive. It only takes note of what site was visited after the fact.
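The distinction is easy to see in code. A monitoring tool is, at bottom, a log written after the page has already been served; the sketch below is a hypothetical illustration of that, not any real product’s behavior.

```python
# A minimal, hypothetical sketch of after-the-fact monitoring: nothing here
# runs before a request, so nothing can be prevented; a record simply
# appears once the site has already been viewed.
from datetime import datetime

visit_log: list[tuple[str, str, str]] = []

def record_visit(terminal_id: str, url: str) -> None:
    """Append a timestamped entry after the page has been delivered."""
    visit_log.append((datetime.now().isoformat(), terminal_id, url))

record_visit("terminal-07", "http://example.com/already-viewed-page")
print(visit_log)
```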
More than 90 percent of the public libraries in the U.S. offer Internet access. Last year, public libraries received almost $60 million in federal funding to buy Internet access and nearly $150 million in technology grants.
With many states facing budget deficits, few libraries are likely to give up federal funds in order to avoid filters. What is likely under CIPA, though, is that libraries will now offer a censored Internet that goes far beyond blocking pornography.