The gasp of disbelief across the Internet was almost audible Oct. 11, when Texas Gov. George W. Bush announced in the second presidential debate that the Web had caused the Columbine killings. In opposing restrictions on gun sales, Bush noted that today's culture allowed a child to "have their heart turned dark as a result of being on the Internet and walk in and decide to take somebody else's life." Vice President Al Gore '69, gloomy and brooding in his second-debate coma, failed to challenge this position, and both candidates later seemed to agree with a measure in Congress to place Internet filtering software on all federally-funded computers in public schools and libraries.
Although the tech community may recoil at such a proposal--and at Bush's remarkable denial of personal volition--the plan makes political sense. It takes what most people see as a technological dragon and promises that it will be slain by a technological St. George, filter software that can scan through the Internet and block access to anything objectionable--pornography, hate speech, bomb-making instructions, you name it. Since it's a technical issue, the computer experts will handle it, and the electorate can go back to sleep. Unfortunately, the problem isn't that simple: Filter software is beset by fundamental problems whose solution will require tough thinking, not by programmers, but by the public at large.
Despite the promises, the technology for filtering simply isn't there. All of the currently available Internet filtering programs block perfectly legitimate information. Error rates have ranged as high as 60 percent as filters prohibit access to information on breast cancer or sexually transmitted diseases. The American Family Association, a conservative religious group, was once blocked as promoting anti-gay hate speech, and the American Civil Liberties Union, CNN and Time magazine have all been blocked by various filters for discussing the issue of Internet pornography. Many programs have blocked the websites of their competitors. Sometimes the blocks seem completely incomprehensible, as in the case of the Latin text of St. Augustine's "Confessions"--which, although it describes Carthage as "a cauldron of unholy loves," could hardly be described as titillating.
Yet in addition to blocking too much, the filter programs also generally block too little. The Internet is a rapidly changing environment, in which new sites emerge and disappear on a daily basis. No team of investigators could possibly catalog all the objectionable material to be found there--and even if they could, they'd need to start over in a week. Furthermore, Internet gateways and proxy servers allow enterprising individuals to evade many filter programs. Until we can develop artificial intelligence more astute than Justice Potter Stewart (who couldn't define pornography, but said, "I know it when I see it"), the programs' inability to fulfill their claims will not represent mere technical problems to be solved in the next version.
Such problems might be mere annoyances if schoolteachers and librarians exercised actual control over what would be blocked. However, the only trade secret of a filter company is its list of banned sites; if that were public, any competitor could introduce an equally good product. As a result, the lists are fiercely guarded, meaning that there is no simple way for the purchasers of filter software to verify a manufacturer's claims--or for those whose sites are wrongly labeled as pornography or hate speech to find out and complain. Even if the programs offer some nominal degree of choice, users have no ability whatsoever to select the exact level of filtering. Filter software puts these decisions in the hands of the manufacturers, not those of the schools or the parents.
In light of these difficulties, filter software has the added disadvantage of being unnecessary. I am certain that no filter in the world would be a greater deterrent than the disapproval of Mrs. Beamer, the school librarian, when she sees a student with pornography on the screen. More generally, in today's society, those under 18 are exposed to a great number of influences outside the home; parents who have not prepared their children to face such influences without letting their hearts be "turned dark" cannot expect society to shield their virgin ears. The federal government does not need to interfere, especially because serious constitutional issues can arise when public libraries try to use such error-filled software to block materials that minors have the right to see--a practice a federal judge in Virginia compared to buying an encyclopedia and then blacking out everything deemed inappropriate. One need only remember how the American Family Association was categorized as hate speech to realize the implications of such government mandates.
But politically, none of this matters. Filter software has become the cause du jour because it lays to rest nagging questions of content and responsibility--it is just a cover for our actual confusion, a whited sepulcher concealing the ugly fact that society hasn't yet dealt with the Internet's implications. In the old days, a flawed culture could be healed by dragging a bunch of network executives up to Capitol Hill and giving them a stern talking-to; through the democratized medium of the Internet, content can be distributed worldwide after it has been reviewed by only one moderator in a newsgroup--or even no one at all. ("The horror! The horror!") To deal with this new development, our irrational fear of the Internet--and our irrational faith in the power of a technological quick fix--must both be discarded. When faced with such a powerful medium, society simply can't afford to let computers make all the decisions. Instead, we'll have to.
Stephen E. Sachs '02 is a history concentrator in Quincy House. His column appears on alternate Tuesdays.