Most sex crimes against children are committed by people the children know, rather than strangers.
Sulake said it had kept 225 moderators and is still investigating what went wrong.
By some measures, Internet-related sex crimes against children have always been rare and are now falling (as are reports of assaults on minors that do not involve the Net).
Another pillar in Facebook's strategy is to limit how those under 18 can interact on the site and to make it harder for adults to find them.
Minors don't show up in public searches, only friends of friends can send them Facebook messages, and only friends can chat with them.
Users could be unnerved by the extent to which their conversations are reviewed, at least by computer programs.

"We've never wanted to set up an environment where we have employees looking at private communications, so it's really important that we use technology that has a very low false-positive rate," he said.
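One way such monitoring software keeps false positives low, sketched hypothetically below, is to escalate a conversation to human review only when an automated risk score clears a high threshold; the scores, threshold, and function names here are illustrative assumptions, not Facebook's actual system.

```python
# Hypothetical sketch: escalate chats to human reviewers only on
# high-confidence scores, trading some missed cases for a very low
# false-positive rate. Threshold value is made up for illustration.

FLAG_THRESHOLD = 0.95  # high bar: prefer missing cases over flagging innocent chats

def should_escalate(risk_score: float) -> bool:
    """Escalate to a human reviewer only for near-certain hits."""
    return risk_score >= FLAG_THRESHOLD

print(should_escalate(0.60))  # False: ambiguous chat stays private
print(should_escalate(0.97))  # True: only high-confidence cases get reviewed
```

Raising the threshold means fewer private conversations are ever seen by an employee, at the cost of letting some borderline cases through, which matches the trade-off described above.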
In addition, Facebook doesn't probe deeply into what it thinks are pre-existing relationships.

Under a 1998 law known as COPPA, the Children's Online Privacy Protection Act, sites directed at those 12 and under must have verified parental consent before collecting data on children. Some sites go much further: Disney's Club Penguin offers a choice of viewing either filtered chat that avoids blacklisted words or chat that contains only words the company has pre-approved.

Filters and moderators are essential for a clean experience, said Claire Quinn, safety chief at WeeWorld, a smaller site aimed at kids and young teens. But the programs and people cost money and can depress ad rates.

The looser the filters, the greater the need for sophisticated monitoring tools, like those employed at Facebook and those offered by independent companies such as the UK's Crisp Thinking, which works for Lego, Electronic Arts, and Sony Corp's online entertainment unit, among others.
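The two filtering modes described above can be sketched in a few lines; the word lists and function names below are invented for illustration and are not any site's actual data.

```python
# Hypothetical sketch of the two chat-filtering approaches described above:
# a blacklist filter that rejects messages containing banned words, and a
# stricter whitelist filter that allows only pre-approved words.

BLACKLIST = {"badword", "slur"}  # illustrative banned words
WHITELIST = {"hi", "hello", "wanna", "play", "cool", "bye"}  # illustrative approved words

def passes_blacklist(message: str) -> bool:
    """Allow the message unless it contains any blacklisted word."""
    return not any(word in BLACKLIST for word in message.lower().split())

def passes_whitelist(message: str) -> bool:
    """Allow the message only if every word is pre-approved."""
    return all(word in WHITELIST for word in message.lower().split())

print(passes_blacklist("wanna play"))             # True: no banned words
print(passes_whitelist("wanna play"))             # True: every word approved
print(passes_whitelist("wanna trade passwords"))  # False: contains unapproved words
```

The whitelist mode is far more restrictive, which is why it suits the youngest users: anything not explicitly approved is blocked, rather than relying on an ever-growing list of banned terms.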