Power

Anti-Trafficking Bill Would Create More Problems Than It Would Solve

Chilling effects on free speech will hit already vulnerable populations especially hard.

[Photo: A person uses a smartphone while standing near a street at night.]
Deprived of access to information about their rights or the means of discovering and reaching out to organizations that exist to help them, trafficking victims will suffer too. Shutterstock

For 20 years, politicians in the United States have been saying that they want to do something about trafficking, and they have passed laws and continue to do so. Something must be done, goes the politician’s syllogism.

Bill S 1693, known as the Stop Enabling Sex Traffickers Act, or SESTA, looks very much like one of those “somethings.” However, the bill is more likely to hamper innovation by tech companies—the primary driver of recent U.S. economic growth—than to effectively address trafficking.

This is something, the syllogism continues. Therefore this must be done.

Human trafficking is an abhorrent crime. No politician wants to be seen opposing a measure that claims to combat it. So, inevitably, SESTA has gathered wide support among legislators; the House version of the bill has already passed. Yet the bill is problematic. Currently under consideration by the Senate, SESTA aims to hold social media companies and other online businesses accountable for activity related to trafficking taking place on their platforms. Despite its good intentions, it is more likely to harm than help victims of trafficking and other vulnerable people.

Human rights activists do want politicians to do something about trafficking. What they do not want is a law that hampers innovation and free speech as we live more of our lives online. And thus far, the drafters of SESTA have not explained what value it adds.

Today, no provider of an interactive platform or service is treated like a publisher or speaker of any user-generated content. Users are responsible for their content, not companies. SESTA would change this so that tech companies would become responsible for their users’ posts. In common-sense terms, if you and I both use Facebook and I post something hateful or illegal, you are in no way responsible for what I said. More to the point, neither is Facebook. The law today allows a provider to make good-faith efforts to remove material it considers objectionable at its own discretion, but does not require it. SESTA would mandate it.

But SESTA also introduces the possibility of “civil remedy” suits against providers by victims of trafficking, as well as civil actions by state attorneys general. Standards of proof in civil cases are lower than in criminal, and wealthy (or not-so-wealthy) tech companies make tempting targets for lawsuits seeking exemplary damages.

This is a game-changer for technology companies. Before, they were largely protected against penalties for the misbehavior of their users, through common-sense provisions. Now, SESTA threatens to paint a billion-dollar target on their backs.

So what’s a technology company to do? The answer is to self-censor, to throw anything off their platform that might conceivably expose them to liability. But their task is nearly impossible. There’s simply no way to decide whether any given piece of content in the torrent of material flowing through a social media site might or might not violate this new statute. What if that dating ad was placed by a trafficker? What if a strip club—a legal business—advertised or hosted on the platform turns out to be a venue for the exploitation of trafficking victims? Cases will hang on whether the provider “knew,” yet in my own interviews with trafficked persons in New York City I learned that some had been arrested up to ten times before they were finally identified as having been trafficked. If officers who have been trained to recognize trafficked persons can repeatedly overlook victims when they are right in front of them, what chance does a technology company have of getting it right?

Faced with this impossible task, technology companies will err on the side of extreme caution. Here’s an idea of what this might look like: When the U.S. Department of Justice implemented a program that isolated sex workers from financial services, PayPal seemed to shut down all accounts belonging to suspected sex workers. Internet trolls will have a field day: they will be able to shut down all the social media accounts of an outspoken feminist or a sex worker advocate simply by reporting her as a possible trafficker or trafficked person. What company would take the risk of keeping her accounts open?

Consequently, consenting adult sex workers, even those involved in legal work such as stripping or pornography, will be further marginalized as their online presence is closed down, depriving them of the ability to network or advocate for their rights. They will be put at increased risk: A UK study showed that sex workers who used online platforms were better able to protect themselves from violence than those who did not.

As another Rewire.News piece argued, when risk-averse platforms shut down any exchange of information that might constitute a potential liability, sex workers lose a key tool for protecting themselves against violent clients. Deprived of access to information about their rights or the means of discovering and reaching out to organizations that exist to help them, trafficking victims will suffer too. In fact, any group thought to have even the most tenuous connection to sex work, and thence to trafficking—including LGBTQ groups and online sex educators—runs the risk of being muzzled.

SESTA sets technology companies an impossible task, urging them to ever more extreme acts of self-censorship. The chilling effects on free speech will be felt most sharply by already vulnerable populations. Nor is there any reason to believe that it will be particularly effective in combating trafficking. But to politicians, something must be done, so they’re doing it. It’s up to us to try to stop them.