We are all connected all the time.

An argument for privacy as mutual aid.

by MW.

Terms:

Social Network: I am not using this term to refer to a specific social networking platform (e.g., Facebook or Twitter). I am using it in its broader sense, to mean a network of people who have interpersonal relationships with one another, who talk to and interact with each other.

Mutual aid: Mutual aid means the cooperative and reciprocal distribution of services and resources for mutual benefit. In other words, people supporting each other at the same time. Mutual aid means we help each other instead of waiting for a charity, government or outside entity to help us. Mutual aid is based on sharing the abundance we have, and it directly counteracts the scarcity mindset which is fundamental to the maintenance of capitalism.

Privacy: this may seem like an obvious one, but I think it’s important to distinguish the specific way I am using the word privacy in this context. Privacy means the ability to keep information and data out of the reach of anybody who is not specifically granted access. Having privacy means that you get to decide who knows what about you.

Anonymity: Anonymity means a person’s identity is not disclosed or known. Anonymity does not guarantee privacy; in fact, anonymity is often used as a tool when privacy is not an option. For example, a whistleblower who is leaking secret documents might desire that they remain anonymous but would not want the documents to stay private, as that would defeat the purpose of leaking them; the point is that they enter the public sphere. Anonymity can be an important component of privacy and privacy can help us maintain anonymity, but they are distinct ideas. I will expand more on this in future writing.

Encrypted/Unencrypted: Encrypted means a file, message or other piece of data is locked so that it becomes unreadable without the correct decryption key. Both static files on a device and data in motion, like an email, web traffic or an instant message, can be encrypted. End-to-end encryption (E2EE) describes data in motion which is encrypted the whole time it is traveling; only the sender’s and recipient’s devices have unencrypted versions of that data, so no intermediary can read it. One of my upcoming posts this month will be a deeper dive into the nuances of encryption because it’s too big of a topic to discuss adequately here. Encrypted does not mean inherently secure or private: there are many different kinds of encryption, and the term is often misused by companies to represent a product as secure, private or anonymous when it is not.
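To make that concrete, here is a minimal sketch of symmetric encryption and decryption. It uses Python and the cryptography library purely as an illustration of my choosing; nothing here represents how any particular app or service actually works.

```python
# Minimal sketch: data "locked" with a key is unreadable to anyone without it.
# Uses the Python "cryptography" library as an illustrative assumption.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()                      # only the keyholder can read the data
ciphertext = Fernet(key).encrypt(b"meet at the bookshop at 7")

print(ciphertext)                                # scrambled bytes: safe to store or send
print(Fernet(key).decrypt(ciphertext))           # correct key: original message recovered

try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)   # wrong key
except InvalidToken:
    print("without the right key, the data stays unreadable")
```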

Privacy as Mutual Aid.

If there’s one thing that’s clear to me after spending countless hours since mid-June poring over the contents of recently leaked internal police documents, it’s that our digital networks link us so intrinsically that if one person’s privacy and security are compromised, they become a conduit for attacks on everyone else in their social network. Furthermore, the state is aware of this and is actively exploiting our social groups. We can no longer afford to ignore it; privacy practices must be a part of our digital etiquette. Corporate tech overlords, state surveillance apparatuses, and data harvesting companies all have their hands in our pockets literally all the time. In 2020, this means more than just inconvenience or uneasiness; it can mean the difference between life and death, freedom and detention (as seen here, here, here, here, here, and many more). The only way any one of us can begin to shake the looming threat of surveillance is if we do it together, because we are all connected all the time. Privacy can and should be viewed as a form of radical mutual aid. It’s not about you or me as individuals; privacy is about preserving our collective freedom: from each according to their abilities, to each according to their need.

It has been so encouraging to see so many people become interested in the concept of mutual aid as the pandemic has changed our relationships to one another. Prior to this year, the term mutual aid was rarely heard outside anarchist spaces; now it’s a topic of discussion in the mainstream media – that’s exciting. Mutual aid has clearly taken root as a core part of the growing anti-authoritarian movement in the US and across the globe. My hope in writing this is that those same gains made by applying mutual aid theory to things like grocery distribution and community care can be made in the realm of privacy and security culture.

Common Struggle.

One of the most important parts of mutual aid is the basis of common struggle. The fundamental understanding that we are in this together is often a guiding principle in anti-authoritarian work, and it gives us reason to coordinate mutual aid efforts. We certainly experience different intersections of oppression, and liberation is absolutely not uniform or homogeneous, but operating from the basis that our different kinds of struggle are intertwined with one another means we can work together to liberate each other simultaneously. If you’re not a member of the state or the wealthy, you are subject to their policing, oppression and control. If you’re subject to their policing, oppression and control, you’re subject to their surveillance. If you’re subject to their surveillance, we’re in this together. Simple as that. As long as we divide our own struggles, as long as we feel we aren’t responsible to one another, as long as we fail to see ourselves as connected to our communities and social networks, we are compromising our own ability to fight together.

The most common rationalization which prevents us from pursuing more private, anonymous and secure digital habits is the idea that a person who has nothing to hide should not be afraid of surveillance. This argument is widely employed by governments to justify state surveillance; the state tells us that invasion of our privacy is only a threat if we are Doing Something Wrong. I am using capitals here to make the distinction between actually doing something which is morally wrong, and Doing Something Wrong as defined by the state. The argument could be made for abolition of the moralistic stance that anything is morally wrong, but I’ll leave that argument for others to have. For almost ten years we’ve known that the NSA employs dragnet surveillance to collect every single conversation they can get their hands on, yet they tell us we shouldn’t be worried unless we’re Doing Something Wrong. We’re trained to willingly accept invasive surveillance, and further, to be excited at the prospect that said surveillance might lead to the capture of the bad guys (those who are Doing Something Wrong).

Who decides who the bad guys are, and how do we know we’re not them? What happens when we begin to understand that the bad guys are a moving target created by the state to preserve the legitimacy of the state? I am not advocating that we all feel that we’re actually always doing something wrong, but I am advocating that we acknowledge that in the eyes of the state everyone is potentially Doing Something Wrong. If we imagine ourselves as people who are not Doing Something Wrong, and thereby do not see digital surveillance as harmful, we endanger those in our networks who do find surveillance to be a risk, even if they make efforts to lock down their own privacy. In relation to privacy, common struggle means that we have to acknowledge surveillance is a risk to all of us, not just those who are Doing Something Wrong. We may experience different risks from lack of privacy, just as we experience different kinds of oppression, but we must acknowledge that we are all harmed, or made at risk of harm, by any one of us not having autonomous control over our data; therefore, privacy must be a mutual aid effort.

None of us can know all the laws.

The most glaring problem with the “nothing to hide” argument is that the state constantly creates, rewrites and alters the definition of Doing Something Wrong. There simply is no possible way for us to know if we are doing something wrong, because there are too many laws and the laws are in constant flux, not to mention the fact that most cases are largely left to the discretion of the individuals involved in the criminal (in)justice system. Lawyers, judges and even Supreme Court justices have acknowledged that there are far too many laws for any one person to be aware of. This is clear reason to believe that we can never be certain we aren’t Doing Something Wrong. Supreme Court Justice Stephen Breyer’s statement in a 1994 court case explains this well.

“First, the complexity of modern federal criminal law, codified in several thousand sections of the United States Code and the virtually infinite variety of factual circumstances that might trigger an investigation into a possible violation of the law, make it difficult for anyone to know, in advance, just when a particular set of statements might later appear (to a prosecutor) to be relevant to some such investigation.”

There are nearly 30,000 pages of federal law. It is highly likely, if not certain, that something in your email inbox, call logs, text messages or search history could be construed to be in violation of one of the more than ten thousand federal laws. The 1986 Computer Fraud and Abuse Act (CFAA), the primary statute upon which computer crime prosecutions are based, is so vague and riddled with loopholes that it allows a prosecutor to find many commonplace computer habits criminal. The point I'm getting to here is that even if only those who are Doing Something Wrong have something to hide, that still implicates pretty much all of us, because we can never be certain we aren’t Doing Something Wrong. This argument has been made repeatedly in favor of privacy practices, but somehow the “nothing to hide” myth remains central to the persistent justification of surveillance.

Having something to hide ≠ doing something wrong.

Another point of weakness in the “nothing to hide” myth is the idea that having something to hide is contingent upon Doing Something Wrong. Even if at this point you still feel you definitely are not Doing Something Wrong, you still have something to hide. Suppose, for example, you have credit card information saved in an account on an app. If that card information is stored unencrypted on the app’s servers (which it would likely be legally required to be if the proposed EARN IT Act is signed into law), those servers could be hacked and your card info could easily be leaked publicly. In the event your card information shows up in a list of breached card credentials, you find out very quickly that you do indeed have something to hide. Privacy, anonymity and encryption are necessary for everyone in everyday interactions. Just as escrow isn’t only for dark web markets, encryption isn’t reserved for secretive military comms; it’s pretty much everywhere in our digital world. You don’t have to be Doing Something Wrong to want privacy, and your desire not to publish your credit card info illustrates this perfectly.

The same argument could be made for any private information we store digitally. If you aren’t comfortable with all of your emails being published in your town’s newspaper for all to see, you have something to hide. If you don’t want to publicly post the contents of your phone’s gallery, you have something to hide. If you send a text or picture to an intimate partner that you wouldn’t send to your whole contact list, you have something to hide. If you are a businessperson and you don’t want audio recordings of all of your meetings mailed to your competitors, you have something to hide. The list could go on and on, but hopefully these examples make clear how unjust it is to equate having something to hide with Doing Something Wrong. It’s also important to note that having something to hide shouldn’t be a source of shame for us either; the myths which are used to justify surveillance and invasion of privacy often make us feel ashamed that we desire privacy. We all have something to hide, which means we deserve transparency from the companies we trust with our data, and I feel we owe it to ourselves and each other to invest in privacy practices.

We have to care, even if we don’t feel we’re the target.

Even if you feel you don’t have anything to hide, and you’ve done nothing wrong, your privacy is directly tied to that of everyone you interact with. Even if you are totally comfortable publishing your own credit card information, home address, browser history, nude photographs of yourself, your medical records, your texts and emails, etc., your network includes people who would not do the same. We owe it to each other to determine whether our privacy practices are increasing or reducing risk for ourselves and our networks. We have the opportunity to leverage our knowledge to reduce harm instead of creating it.

Imagine this scenario. Alex is a person who feels he has nothing to hide; Alex feels that even if someone were to get their hands on his data, it wouldn’t matter. He doesn’t see himself as a criminal, he doesn’t think he has broken any laws, and he doesn’t have anything he feels is really private on any of his digital devices.

Alex has a friend named Sarah who is seriously concerned about her privacy. Sarah is an organizer with a group that advocates for human rights. Her work is legal and should be uncontroversial, but fascist groups keep targeting Sarah’s organization with harassment, death threats and intimidation.

Alex communicates with Sarah using the stock text messaging app, as he does not want to download any encrypted chat platforms because it’s a hassle. He has her number and her email saved in his contacts under her full name. Alex uses a free contact management app from the app store which backs up his contacts to a cloud server, so that if he loses his phone he will still have all his contacts. That app stores the contacts on its servers in cleartext, meaning they are unencrypted. After a few months of using the app, Alex gets an email from the developers saying that they’ve been hacked and 23 million contacts were breached. Covve, a popular contact management app, had exactly that number of contacts leaked just three months ago. Sarah’s contact information was in that breach, and the breach data is publicly available online. Fascists who are trying to disrupt the organizing Sarah is involved in find her phone number and email online because of the breach. This information can now be used to conduct different kinds of attacks: credential stuffing, account exploitation, SIM swapping, blackmail, etc.
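To illustrate the difference Alex’s choice of app makes, here is a small hypothetical sketch, again in Python with the cryptography library and with invented names and numbers. Real contact apps differ in the details; the point is simply the contrast between what a breached server holds when backups are cleartext versus end-to-end encrypted.

```python
# Hypothetical illustration: what a breach of the backup server exposes,
# depending on how the contact app stores Alex's contacts.
import json
from cryptography.fernet import Fernet

contacts = [{"name": "Sarah Example",          # invented data
             "phone": "+1-555-0100",
             "email": "sarah@example.org"}]

# Cleartext backup: the server holds readable data, so a breach leaks it all.
cleartext_backup = json.dumps(contacts).encode()
print(cleartext_backup)

# End-to-end encrypted backup: the key never leaves Alex's phone, so a breach
# of the same server leaks only unreadable bytes.
key = Fernet.generate_key()                    # stays on Alex's device
encrypted_backup = Fernet(key).encrypt(cleartext_backup)
print(encrypted_backup)
```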

Despite the fact that Sarah uses only apps with zero knowledge end-to-end encryption, never publicly shares her email or phone number, and works tirelessly to make sure she is private and protected online, her information still wound up leaked. Alex’s disregard for his own privacy ultimately ended up endangering his friend. This hypothetical situation serves to illustrate one of the infinite ways our own practices are directly tied to the safety and security of everyone we digitally communicate with.

Solidarity, Not Charity.

Mutual aid is not charity. By recognizing our common struggle we can begin to counteract the paternalistic structures of charity. Mutual aid means we (those who are in the struggle) help each other rather than waiting for those in power, who impose harm and struggle upon us, to alleviate that harm through charity. The fundamental structure of charity is built to help those in power feel righteous and to validate their privilege. The idea of counteracting charity by building solidarity fits perfectly into the conversation about privacy as mutual aid; nearly all the infrastructure of the internet and just about every piece of digital technology was created collaboratively by open source developers or is based on Free and Open Source Software (FOSS). The world of big tech benefits from our lack of knowledge about our digital devices, and the iPhone is a beloved talking point for those arguing in favor of capitalism, but the reality is that we wouldn’t have most of the technology we do if it weren’t for collaboration, open source ideology and some pretty anti-capitalist practices in early computer development.

Most, if not all, privacy researchers and advocates would recommend always using FOSS applications. Open source means the code which makes up the software is published for anyone to read. While I may not be able to understand any of the code that makes up an app like Signal, for example, there are certainly researchers and companies which do independent audits of open source, privacy-oriented software to make sure there are no bugs, callbacks or flaws which would threaten the privacy of the user. If a developer or company publishes a closed source app and claims it is private, encrypted or secure, we can never really be sure their claims are true, because we do not know what’s happening in the code.

Using only FOSS applications also means that when there are inevitable bugs and flaws, they are caught and fixed quickly. This is also why it’s good digital etiquette to update apps as soon as new versions are released, as updates often contain patches for bugs in the code.

For me, journeying into the world of digital privacy has been incredibly empowering because it has given me more understanding of and control over my devices and digital presence. We don’t have to beg tech companies like Facebook and Google to make their platforms safer and more private, and they wouldn’t do it even if we did! We don’t have to trust closed source proprietary software with our sensitive data, asking that they pretty please keep our data safe. We have the power to make ourselves more secure, and to reduce harm for those in our networks in doing so; making our own digital practices more private and secure is an act of mutual aid.

This post was originally written by MW and published on the clearnet (warning) at mappingwatchtowers.substack.com
This post is found on tor at http://writeas7pm7rcdqg.onion/m-w/we-are-all-connected
This blog is found on tor at http://writeas7pm7rcdqg.onion/m-w/
This post is mirrored on the clearnet (warning) at https://write.as/m-w/we-are-all-connected
This blog is mirrored on the clearnet (warning) at https://write.as/m-w/
If you’d like to support further writing, subscribe via MW's paid substack, or make donations via BTC to 3PnjHL8kwGaTFbgYoBtKLUasKqv2khJq4R
With love and in solidarity,
MW