In recent years, two threats have come to dominate the digital-privacy discourse.
The first: government surveillance. Thanks to Edward Snowden’s 2013 disclosures of the National Security Agency’s sweeping PRISM data-collection program, and last month’s Vault 7 leaks, which revealed the CIA’s mammoth capacity to do the same through such means as smart TVs and common operating systems, the U.S. public learned it was subject to indiscriminate monitoring by the federal government in the spurious name of “counterterrorism.”
The second: internet service providers’ (ISPs) data harvesting. Companies like AT&T, Verizon, and Comcast have access to their customers’ browsing history, granting them vast swaths of user data—including location and even credit or medical history—which they can sell to advertisers. (Though the FCC imposed privacy regulations on ISPs in October 2016, Congress voted in March to repeal these strictures, permitting ISPs to collect data with far fewer restrictions.)
Yet a third, often unheeded danger haunts throngs of citizens: their current and former partners.
This is especially true for survivors of domestic violence. Though the problem of intimate partner abuse has long existed independent of digital technology, mobile and internet-connected devices offer increasingly sophisticated avenues through which to perpetuate and exacerbate it, adding another dimension of exigency to a centuries-long crisis.
Those who’ve endured abusive relationships are especially vulnerable to online harassment and control, and the means by which abusers can deploy these tactics are legion. An abuser may flood their partner’s phone with texts or calls, for example, rotating through different unknown numbers to circumvent Apple’s and Android’s call-blocking systems. They might ask to view their partner’s computer or phone, seize it under the guise of intimacy or normality, or track their partner’s location through social media. In cases of greater technological literacy, they may install a keylogger (software that records keystrokes, revealing passwords, emails, and anything else typed into a computer), enabling them to stalk their partner.
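The structural weakness in per-number call blocking is easy to see in miniature. The sketch below is purely illustrative (it is not how any phone platform actually implements blocking): because the blocklist matches exact numbers, every freshly rotated number passes through.

```python
# Illustrative sketch, not any platform's real implementation: why
# per-number call blocking fails against an abuser who rotates numbers.

def is_blocked(incoming_number: str, blocklist: set) -> bool:
    """Per-number blocking, as on most phones: exact matches only."""
    return incoming_number in blocklist

blocklist = {"+15550100"}  # the abuser's one known number, blocked

# The abuser calls from the blocked number, then from two new ones.
calls = ["+15550100", "+15550199", "+15550123"]
reached = [n for n in calls if not is_blocked(n, blocklist)]

print(reached)  # the two rotated numbers still get through
```

Blocking is always one number behind; the survivor must update the list after each new call, which is exactly the reactive, user-shouldered burden the article goes on to describe.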
“Despite the political necessity of protecting against things like the NSA or the FBI and such, on a day-to-day basis, things like online harassment and abuse [are] just far more imminent for most people,” Noah Kelley, a software engineer and creator of the digital self-defense guide DIY Cybersecurity for Domestic Violence, told Rewire. “That creates tangible psychological trauma, economic harm, social harm, career harm.”
Urgency notwithstanding, few resources for this form of protection exist beyond Kelley’s. Though organizations such as Crash Override and Feminist Frequency proffer materials for defense from online harassment (and have been instrumental in publicizing the issue), the cybersecurity and domestic-violence support communities rarely intersect—a disconnect owed partly to the socially and demographically insular nature of the tech industry.
“I think there is a tendency for people who work in the privacy space to—it’s stereotyping in some ways, but—they tend to be middle-class, they tend to be white, they tend to be male. They tend to live in certain [affluent] parts of the United States. This all colors a certain perception they have of the world,” Sarah Jamie Lewis, a privacy and anonymity researcher who works with LGBTQ communities and domestic violence survivors, told Rewire.
“The people guiding these conversations about cybersecurity and technology are just not really treating this issue as it deserves,” Kelley added. “There are so many times, if there was just one survivor, just one [domestic violence survivor] advocate in the room” when most digital technologies are developed, “this would not be a problem. Just one who had a voice that could be listened to. But that just never happens.”
Additional factors complicate the issue. Domestic violence survivors most commonly turn to shelter staff, volunteers, and social workers, Kelley said, the majority of whom are not only deluged with cases to manage, but also are not professionally trained to advise sufferers on issues of cybersecurity. Many shelters provide advice and links to resources if survivors fear they’re being surveilled, he added, but they often lack the technological expertise to analyze the privacy risks incurred in visiting those sites. (While such organizations as Next Door and the National Network to End Domestic Violence provide more comprehensive instruction—say, how and why to adjust browser-tracking settings—they’re the exception, rather than the rule.)
Furthermore, most strategies recommended to domestic violence sufferers revolve around skirting the defaults of pre-existing systems, yielding partial, imperfect safeguards and requiring users to understand labyrinthine technical processes. Internet-connected devices ship with immense data-collection capabilities, placing survivors in peril from the moment they pick one up. Shelters may suggest countermeasures, such as clearing personal data stored on phones and computers or using public devices (e.g., computers at libraries or community centers). Similarly, most advice from cybersecurity specialists is reactive, focused on adjusting default settings on devices, browsers, social media, and search engines to minimize traceability.
“Right now, it’s kind of advice not to do things or to do things in certain ways, rather than, ‘Hey, if you use this particular tool, it will do these things, and it won’t do these things, and that could help you,’” Lewis said.
Ideally, she added, applications would instead be consciously designed to immunize vulnerable groups. “My goal in my research is to get us from that point where we’re saying, ‘Use Facebook, but be very careful. Don’t have a profile photo that’s like you. Use a name that sounds real for whatever version of Facebook classifies as real. Put as little personal information on there as possible. Don’t add anyone you used to know who is in a social network that you don’t want to discover you.’ That’s really restrictive, and it’s really hard to empower these communities to live their life while also protecting them from harms they face.”
This intimates one of the most significant underlying issues of online privacy: that the burden of protection is placed on the individual, and not on institutions. Users of the modern internet, a platform through which data is freely exchanged and commodified, are left to fend for themselves if they’re concerned about privacy. The devices, networks, apps, and websites (for instance, Google, YouTube, and Facebook, the three most popular websites in the United States) they use encourage public display of personal information—a function of the fact that, for private tech companies, user data is a product salable to the highest bidder. For domestic violence survivors, this framework renders the task of maintaining any semblance of digital security particularly daunting.
Compounding this is a disquieting reality: Domestic abuse is a profound, complex, and enduring social problem, and cybersecurity is a mere palliative. According to the National Coalition for Domestic Violence, intimate partner abuse affects one in three women and one in four men, and 95 percent of men who physically abuse their partners also psychologically abuse them. Restoring safety in survivors’ daily lives demands discussion, education, activism—far more than a technological Band-Aid.
“Cybersecurity does not solve domestic violence,” Kelley said, “and ultimately, there needs to be a concerted effort to fight abuse and specifically hold men [and abusers of other gender identities] accountable.”