Smut Hunters Fight to Protect Kids on Internet

FRAMINGHAM, Massachusetts -- Browsing the Internet at work, Cathy Gorgens encounters a sexually explicit web site. So she immediately asks her co-workers to take a look.

They all do.

Gorgens' colleagues aren't voyeurs; this is what they do for a living. They are smut hunters for Cyber Patrol, which makes one of the most popular programs for protecting youngsters from the raw, the violent, the hateful and the just plain unpleasant materials that proliferate online at Internet speed.

The market for these screening programs is hot, and getting hotter. A recent poll for the Annenberg Public Policy Center showed that 32 percent of American parents with household online connections now use programs such as Cyber Patrol, CyberSitter, SurfWatch, Web Chaperone and X-Stop.

In the months since the shootings in Littleton, Colorado, new concerns about harmful material on the Internet helped sales of Cyber Patrol jump by 50 percent; they should hit $9 million this year. Action in Washington could spark more growth: Last week the House of Representatives added an amendment to the juvenile justice bill that would require filters on computers in schools and libraries that receive federal Internet subsidies, and a Senate committee is considering a similar measure.

A day spent at the Cyber Patrol offices shows the fine line walked by the smut hunters - who call themselves "Internet researchers." In the "surf room" at Cyber Patrol, a half-dozen staffers rapidly click through screen after screen of hard-core pornography, bomb-making instructions and more. They work without a blink, blush, giggle or gasp, and speak about unspeakable things with the unflappably sardonic tone of those who have seen it all.

The surf room has big windows that look out into the spreading branches of a gorgeous maple; the overhead lights are turned off to avoid screen glare, but the cool sunlight filtering through the leaves suffuses the room with a gentle glow as lurid images flash onto the bank of computer screens.

The adult site Gorgens encountered - a page about genital enlargement techniques - is cause for group discussion. Is it sexually suggestive, or does it fall under the heading of medical information? Since Cyber Patrol allows parents to set different levels of filtering for different surfers (say, allowing partial nudity and sex education but blocking hate-speech sites for a 15-year-old, while blocking any questionable material for a 9-year-old), the discussions are anything but hypothetical.
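The per-surfer settings described above amount to a simple policy check: each site carries a set of reviewer-assigned categories, and each child's profile lists the categories that profile may see. A minimal sketch, assuming invented category names and profiles (these are illustrative, not Cyber Patrol's actual data or code):

```python
# Hypothetical sketch of per-profile category filtering, loosely modeled
# on the article's description. All site names, categories and profiles
# are invented for illustration.

# Categories a human reviewer might assign to each site.
SITE_CATEGORIES = {
    "example-health.test": {"sex_education"},
    "example-adult.test": {"partial_nudity", "sex_talk"},
    "example-hate.test": {"hate_speech"},
}

# Per-child profiles: the set of categories each surfer is allowed to view.
PROFILES = {
    "teen_15": {"partial_nudity", "sex_education"},  # hate speech still blocked
    "child_9": set(),                                # block anything questionable
}

def is_allowed(profile: str, site: str) -> bool:
    """A site is allowed only if every category assigned to it is
    permitted for this profile; unreviewed sites pass through."""
    categories = SITE_CATEGORIES.get(site, set())
    return categories <= PROFILES[profile]

print(is_allowed("teen_15", "example-health.test"))  # True
print(is_allowed("teen_15", "example-hate.test"))    # False
print(is_allowed("child_9", "example-adult.test"))   # False
```

The subset test is what makes the 15-year-old/9-year-old example work: the same reviewed list yields different results for different profiles.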

The categories are complex, and they are set out in a lengthy guide for the staffers. Foot-fetish sites that don't show nudity may be classified as sex talk but won't be grouped with depictions of sex. And after a group of nudists appealed to the company's advisory board, nudist sites were taken out of the sexual categories.

The site Gorgens found, however, leads to an almost Talmudic discussion. The subject has made its way into popular culture; a device for genital enlargement was a plot element in the first "Austin Powers" movie, and a host of plastic surgeons perform procedures that promise the same effect.

A legitimate medical site might pass the reviewers' muster. But this site, with a startlingly explicit photograph and sexually oriented commentary, leads Gorgens to put it under the category of "sex talk." And, as she does many times a day, Gorgens brings up a special window that adds this spot on the Web to a "Cyber-NOT" list of 100,000 sites, arranged by category. She prints out a copy of the web page for the company's files. The staffers add as many as 1,200 sites to that list each week, part of a Sisyphean undertaking that has taken the company through an estimated 10 million pages online.
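The "Cyber-NOT" list the article describes - blocked sites filed under categories, consulted against whatever categories a parent has switched on - can be pictured as a small data structure. A minimal sketch, with hypothetical class and site names (the real product's format is a guarded trade secret, as the article notes later):

```python
# Hypothetical sketch of a categorized block list in the style of the
# "Cyber-NOT" list described above; names and structure are illustrative.
from collections import defaultdict

class BlockList:
    def __init__(self):
        # category name -> set of blocked sites
        self._by_category = defaultdict(set)

    def add(self, site: str, category: str) -> None:
        """Record a reviewer's decision, e.g. filing a page under 'sex talk'."""
        self._by_category[category].add(site)

    def is_blocked(self, site: str, blocked_categories) -> bool:
        """Block a site if it appears in any category the parent turned on."""
        return any(site in self._by_category[c] for c in blocked_categories)

cybernot = BlockList()
cybernot.add("enlargement-page.test", "sex talk")

print(cybernot.is_blocked("enlargement-page.test", ["sex talk"]))  # True
print(cybernot.is_blocked("enlargement-page.test", ["violence"]))  # False
```

Keying the list by category rather than keeping one flat list is what lets the same database serve both the permissive and the strict profiles.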

Why hire people to look at this digital detritus, rather than rely on software to do the job? Companies like Cyber Patrol and SurfWatch employ live reviewers because they believe that only the human brain has the capability to recognize patterns and make distinctions - such as the difference between a medical site and a sex site - and to confer over the tough calls. Other companies, such as Web Chaperone and X-Stop, tout software-based solutions and say that the raw speed of the automated systems outstrips the abilities of flesh-and-blood surfers.

Cyber Patrol has always hired as its researchers parents and educators, people who most closely reflect the target audience. The workers, who say they enjoy the flexible hours and spirit of camaraderie, maintain that their daily dose of porn doesn't affect their private lives. "I will admit at first the material was 'interesting,'" says researcher John Borden, who adorns his desk with pictures of his four kids. "But it's my job, so I take more of a professional approach toward the topic. ... My job and my sex life are two different things."

Borden says his daughter went online looking for "teen furniture" one night and ended up with a list of about 20 sex sites - all blocked, he notes with pride, by Cyber Patrol. "Teen" is one of the words that can be counted on to bring up porn sites in an Internet search. So are "Barbie," "Disney," "Lewinsky" and, for obscure reasons, "puffy." And then there are the obviously prurient words; these are listed on a white board that the staffers check regularly in their searches for the pornography, "bomb-making guides" and more that they are trying to root out.

The staffers say that people who hear about the program's database of 100,000 restricted sites often ask them, "How can I get that list?" It's not for sale, says Susan Getgood, who runs Cyber Patrol for its corporate parent, The Learning Company (which is in turn owned by toymaker Mattel Inc.). "We're not in the business of producing directories of pornography," Getgood says.

Following one link that advertises "bomb-making instructions," Borden finds that the site is actually devoted to news about Lithuania; the bomb talk was just a come-on for the curious. The site is not blocked.

Another site, however, is full of instructions for making bombs using light bulbs, tennis balls, even animal blood. "Just the sort of thing to do when you have too much animal blood lying around," the site explains.

It's blocked.

The parade continues: an "online gun show" where visitors can view, buy and sell firearms; a porn page blaring "12 Live shows! 24 hours a day!! 1500 streaming videos!!!"
Cyber Patrol and its ilk are regarded with distrust by many civil libertarians. The quality of such programs varies widely - some block innocuous sites and important health information - and few buyers can tell the good from the bad. The core of what makes most of the programs work - the lists of prohibited sites - is closely guarded as a trade secret.

Web sites with names like Censorware and Peacefire decry the limitations imposed by the screening programs and give visitors tips on how to disable the software.

More mainstream civil liberties groups have no problem with software that helps parents impose limits on what their children find on the web, but they oppose laws that would mandate the use of filtering in federally supported schools and libraries. Congress is currently considering two such bills.

The officials at Cyber Patrol don't favor the concept of forcing schools and libraries to use their products, either, even though it would likely boost profits. "Why would one legislate what's already happening?" Getgood asks. "Cyber Patrol has always been about providing choices for parents and for schools about how they're going to use the Internet."

Nancy Willard, a University of Oregon researcher conducting a study of how high school students make decisions about legal and ethical activities online, says parents and institutions expect too much of filtering software. "The problem is we've got kids speeding up and down the information superhighway and parents who don't know how to turn the key." Parents have to become more savvy users of the Internet themselves, she says. She argues that the programs can help protect a younger child from stumbling on objectionable material but can't stop a teenager who wants to view it.

Getgood agrees.

"What we want to do is provide tools that supplement what people do and not try to replace" parental influence, she says.

Instead, Willard insists, filtering has to be considered part of a larger set of questions about the ethical use of computers - which includes discussions of hacking, plagiarism and the rest of the spectrum of evil. And that is a discussion that begins in childhood. Parents have taught these lessons before, Willard says, beginning with "teaching kids to cross the street."

"We know the process of teaching kids to make safe choices. ... What we have to do is translate that process into making choices about cyberspace," she says.