Red Hood Project fights to protect kids from predators on social media
Anonymity is a powerful protector in the online world, but it can also be a powerful weapon against the innocent.
Meanwhile, a deadly blow was struck against Freedom Hosting, a denizen of the Deep Web (those dark areas online not indexed by standard search engines) which harboured numerous child-porn sites, when pedophile Eric Eoin Marques was arrested in Ireland. Anonymous, itself no stranger to Guy Fawkesian anonymity, has long crusaded against Freedom Hosting via Operation Darknet.
Liability as medicine
Garossino argues that direct government micromanagement isn't even needed in social media.
"[There's already] a brilliant instrument for achieving these kinds of ends. It's called liability."
Today's Internet-technology industry simply doesn't speak the language of product liability, especially since, in social-media terms, the user herself is the product. Her eyeballs and social connections, farmed out to advertisers, are the driving economic engine of the machine.
Section 230 of the Communications Decency Act (part of the Telecommunications Act of 1996) has largely shielded web publishers from liability for content posted by their end users. The exact language is:
No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.
That was passed 17 years ago, when social media as we know it was not yet a glint in a developer's eye. The sheer volume of user-to-user interaction is far more than any law-enforcement body could ever hope to handle, even if it wanted to.
The Red Hood Project wants to kick the liability back to social media companies themselves precisely for the protection of privacy.
To reference an extreme case, a Foscam baby-monitoring camera was recently hacked by an online predator, who started shouting sexualized insults at a two-year-old girl in Houston, Texas. The onus fell not on the parents but on Foscam itself, whose weak security system exposed small children to harm.
As Facebook's action against rape pages showed, financial pressure was not just a powerful motivator, but the catalyst needed for the social media giant to prove that it could indeed wrangle its own content if it wanted to.
Solutions need to start now
Garossino applauds British Prime Minister David Cameron's actions regarding ISP filtering and pressuring Google to go harder on filtering out child-porn-related search results, despite the blowback coming from web-freedom circles on both philosophical and procedural points.
As Wikipedia co-founder and Cameron adviser Jimmy Wales told Channel 4,
"It's an absolutely ridiculous idea that won't work. [...] You realize all that Cameron's plans would do is require [pedophiles] to opt in and say, 'Yes, I'd like porn, please.' It does nothing to stop criminals."
But someone has to jump first, says Garossino.
"The essence of our argument is corporate responsibility to design safety for kids into online products such as apps, hardware, and social media sites that are accessible to them," she explains. "That could come in the form of liability for damages, or the CRTC stepping up its oversight of online products used by minors, or it could be industry self-regulation."