For 20 years, politicians in the United States have been saying that they want to do something about trafficking, and they have passed laws and continue to do so. Something must be done, goes the politician's syllogism.
Senate Bill 1693, known as the Stop Enabling Sex Traffickers Act, or SESTA, looks very much like one of those "somethings." However, the bill is more likely to hamper innovation by tech companies—the primary driver of recent U.S. economic growth—than to effectively address trafficking.
This is something, the syllogism continues. Therefore, this must be done.
Human rights activists do want politicians to do something about trafficking—but not to create laws that hamper innovation and free speech as we live more of our lives online. Thus far, the drafters of SESTA have not explained what value the bill adds.
Today, under Section 230 of the Communications Decency Act, no provider of an interactive platform or service is treated as the publisher or speaker of any user-generated content. Users are responsible for their content, not companies. SESTA would change this, making tech companies responsible for their users' posts. In common-sense terms: if you and I both use Facebook and I post something hateful or illegal, you are in no way responsible for what I said. More to the point, neither is Facebook. The law today allows a provider to make good-faith efforts to remove material it considers objectionable at its own discretion, but does not require it to. SESTA would mandate it.
But SESTA also introduces the possibility of “civil remedy” suits against providers by victims of trafficking, as well as civil actions by state attorneys general. Standards of proof in civil cases are lower than in criminal, and wealthy (or not-so-wealthy) tech companies make tempting targets for lawsuits seeking exemplary damages.
This is a game-changer for technology companies. Before, they were largely protected against penalties for the misbehavior of their users, through common-sense provisions. Now, SESTA threatens to paint a billion-dollar target on their backs.
So what's a technology company to do? The answer is to self-censor—to throw anything off its platform that might conceivably expose it to liability. But the task is nearly impossible. There is simply no way to decide whether any given piece of content in the torrent of material flowing through a social media site might or might not violate this new statute. What if that dating ad was placed by a trafficker? What if a strip club—a legal business—that advertised or hosted on the platform turns out to be a venue for the exploitation of trafficking victims? Cases will hang on whether the provider "knew," yet in my own interviews with trafficked persons in New York City I learned that some had been arrested up to ten times before they were finally identified as having been trafficked. If officers who have been trained to recognize trafficked persons can repeatedly overlook victims who are right in front of them, what chance does a technology company have of getting it right?
Faced with this impossible task, technology companies will err on the side of extreme caution. We have some idea of what this might look like: when the U.S. Department of Justice implemented a program that isolated sex workers from financial services, PayPal appeared to shut down all accounts belonging to suspected sex workers. Internet trolls will have a field day: by reporting an outspoken feminist or a sex worker advocate as a possible trafficker or trafficked person, they can get all of her social media accounts shut down. What company would take the risk of keeping her accounts open?
Consequently, consenting adult sex workers—even those involved in legal work such as stripping or pornography—will be further marginalized as their online presence is shut down, depriving them of the ability to network or advocate for their rights. They will also be put at increased risk: a UK study found that sex workers who used online platforms were better able to protect themselves from violence than those who did not.
SESTA sets technology companies an impossible task, urging them to ever more extreme acts of self-censorship. The chilling effects on free speech will be felt most sharply by already vulnerable populations. Nor is there any reason to believe that it will be particularly effective in combating trafficking. But to politicians, something must be done, so they’re doing it. It’s up to us to try to stop them.