A 17-year-old student, Theodore Graubard, took his own life in New York City last Wednesday. By the next afternoon, February 19, pranksters on Facebook had posted groups mocking the tragedy.
Even after a flood of reports from friends and fellow students flagging the inappropriate content, the groups remained online for a whole day. Only after a PCMag journalist contacted the company were the groups finally taken down.
None of the "Report Group" messages sent by students and others through Facebook itself had any effect (the school also tried to get in touch with Facebook). But a PR request got instant action. That tells us something: if you want Facebook to notice that someone is flagrantly violating its terms of service, start writing a story about it for a popular, well-respected Web site.
PCMag wrote to Facebook's press e-mail address at 4:25 PM on Friday, February 20. The two groups disappeared by 5:05 PM. Facebook representatives wouldn't specifically comment on whether the PR inquiry expedited the removal process, but relayed a statement:
"The two Groups you cited singled out and made fun of an individual, which is a clear violation of our policies. They have been removed."
"We investigate all reports we receive, and in cases where content is reported and found to violate the site's policies, Facebook will remove it," Facebook added. "Our team moves as expediently as possible given the volume of activity on our site."
When asked about the time it takes for reported groups to be taken down, the Facebook rep had this to say: "We take all reports we receive seriously, and generally review groups such as the ones you cited within a few days. As you aptly stated, it's hard to speculate exactly when these groups would have been removed, but we do our best to ensure our response time is as expedient as possible within a few days of receiving the initial report. It's also worth noting that reports about nudity, pornography, and harassing messages are reviewed within 24 hours of the report."
It's clear that, with more than 140 million users, Facebook must have a massive number of group reports to deal with. But it's daunting to learn that the family and friends of a suicide victim would normally have to wait "a few days" before groups mocking the death were shut down. If Facebook wants to be a part of society and expects to be a custodian of its users' personal information, it needs to do a better, more timely job of policing inappropriate content.