Censorship: OK to fight copyright infringement but not sex abuse?

Even child prostitution, it seems, isn’t a good enough reason to force internet service providers to monitor the content they publish. Citing conflicts with the Communications Decency Act, as well as various constitutional conflicts (including the First Amendment), a federal judge recently issued a preliminary injunction barring the state of Washington from enforcing a law that would force services like Backpage.com to personally verify the age of individuals offering sexual services in classified ads. The law would have made sites like Backpage.com, a notorious marketplace for sex sales, criminally responsible if their ads led to sexual abuse of a minor.

The decision is probably the right one given the language of the law, but this case should serve as a sign that something’s wrong in how we prioritize online content. Why does the entertainment industry get a stick with which to beat web sites while child prostitutes are left empty-handed?

Laws should encourage free speech online

Whether we’re talking about classified ads, obscene material or copyright, the arguments on both sides are generally the same. Those who propose such laws typically see a criminal enterprise and claim it’s time to regulate the platforms that enable these crimes to take place. Opponents say the bills will impose an undue burden on providers by forcing them to monitor every piece of content that hits their servers. Alternatively, they say, such bills will chill free speech by encouraging providers to drastically limit the types of content they host in order to avoid the burden of monitoring.

The last time a proposed law — the Stop Online Piracy Act — tried to force web sites and service providers to monitor content proactively, the companies and web users it would have affected reacted fiercely. They were so outraged they blacked out parts of the web and launched crowdsourced movements to write new internet constitutions and influence internet policy. Fair enough.

Due to the nature of how the internet and web operate, it’s easy to side with those wanting to protect providers and web sites. Unless they’re actively encouraging the criminal behavior that laws want to regulate, it’s difficult to hold sites and services accountable for the activity (and complaints) of potentially millions of users.

Hence Section 230 of the Communications Decency Act, which generally exempts service providers from liability for user-provided content, even when providers are notified that such content might be obscene or otherwise illegal. This is the statute on which the judge in the aforementioned case, brought by Backpage.com and the Internet Archive, centered his decision to suspend implementation of Washington’s anti-child-prostitution law. (For a good explanation of the extent of this immunity from liability, and a lengthy hypothetical application to Wikipedia content, check out this 2006 article from the Harvard Law Review.)

Unless we’re talking about copyright

However, as anyone even casually familiar with the Digital Millennium Copyright Act knows, not all content is created equal. That act’s widely cited “safe harbor” provision offers sites like YouTube far narrower protection against claims of copyright infringement than the CDA supplies for other content: immunity is conditional on responding to takedown notices. In fact, the CDA expressly excludes intellectual property law from the scope of its coverage.

Under the DMCA, when service providers receive notice of allegedly infringing content, they must either undertake the effort to determine whether it is actually infringing or simply take the content down unless and until the user who posted it rebuts the purported rights holder’s claim. This process can be terribly burdensome for service providers that don’t want to act as a rubber stamp for censorship by removing whatever content is contested. Indeed, as Google has shown time and time again, there are a lot of false, or at least questionable, claims filed under the DMCA.

If it works for copyright, why not prostitution?

It’s difficult to comprehend why it’s acceptable to impose burdens on service providers and potentially chill free speech in the name of preventing copyright infringement, but not in the name of preventing prostitution. Why should Facebook, for example, be forced to act upon a claim about someone posting a video without permission but not about someone trying to sell a minor for sex?

To be clear, the proposed law in Washington might be a bit extreme in all but requiring service providers to attempt to verify in person the ages of the advertised escorts. For a variety of reasons — including the global nature of the web and questions about jurisdiction — this is probably infeasible. The Washington law is also far too broad, potentially covering everyone from Backpage.com (the lead plaintiff in the case) to co-plaintiff and intervenor the Internet Archive.

It might not be infeasible, however, to require sites and service providers to somehow examine claims of child prostitution the way they do copyright claims. (I suspect many already do in some cases, and almost all web site terms of service grant them permission to remove objectionable content.) And if laws were rewritten to cover only the sections of sites that advertise “escort services” or other clear euphemisms for prostitution, that would certainly be less burdensome than imposing requirements on every piece of content on the web.

Regulating content on the internet is a complex issue, and attempts to do so in a meaningful manner often skirt the bounds of what’s constitutional. It’s unclear what methods for fighting a problem such as child prostitution would be both effective and legal. But it’s also debatable whether the DMCA is a fair or effective law. If Congress thinks it’s all right to suspend concerns about free speech when it comes to the background song in a YouTube video, maybe doing so for allegations of child abuse isn’t such a crazy idea.

Image courtesy of Shutterstock user Rugierro S.


