Encryption, Privacy, and Security

In conversations about student data privacy, the terms "encryption," "security," and "privacy" are often used interchangeably. While these terms are related, they are distinct concepts. In this post, we will break down where these terms overlap with each other, and where they differ.

But at the outset, I need to emphasize that this post will be incomplete - a comprehensive treatment of these terms and the distinctions between them would be a good subject for a book. Details will be left out. If you're not okay with that, feel free to stop reading now. I imagine the Kardashians are up to something curious or interesting - go check that out instead.

As is hopefully obvious by now, this post is not intended to be comprehensive. It is intended to provide a starting point for people looking to learn more about these concepts.

Privacy

Privacy is arguably the least technical element in this conversation. There are two facets to privacy we will highlight here:

  • It's possible to have great security and bad privacy practices; and
  • We often speak about "privacy" without clarifying "private from whom."

Great security and bad privacy

A vendor can go to extreme lengths to make sure that data can only be accessed by the vendor or the vendor's partners. However, if the vendor reserves the right to sell your data to whomever they want, whenever they want, that's not great for your privacy. The ways that vendors can use the data they acquire from you are generally spelled out in their terms of service - so, if a vendor reserves rights to share and reuse your data in their terms, and you agree to those terms, you have given the vendor both the data and permission to use it.

There are many vendors who have solid security paired with privacy policies and data usage practices that compromise user privacy.

Who is that private from, really?

Different people think of different things when they hear the word "private" - in most cases, when we think about privacy, we focus on things we don't want other people to know. When we are working with technology, though, the concept of "other people" gets abstract and impersonal pretty quickly.

When we use services that store a record of what we have done (and it's worth noting that "doing" covers what we have read, said, searched for, liked, shared, and moused over, and how long we have done any of these things), the "private" things we do are handed over to systems that have a perfect memory. This changes the nature of what "private" can mean. For the purposes of this post, we'll use four different categories of people who might be interested in us over time, and look at how each impacts our privacy.

  • Criminal - these are the folks people agree about the most: the people stealing data, perpetrating identity theft, and using a range of attacks to get unauthorized access to data with bad intent.
  • Personal - there is also broad agreement about personal privacy. We can all agree that we don't want Great Uncle Wilfred to know about our dating life, or to talk about it during Thanksgiving. The ability to control which of our acquaintances knows what is something we all want.
  • Corporate - there is less agreement here, as one person's desire for privacy often runs counter to a data broker's or a marketer's business plan. But, when using a service like Facebook, Instagram, Twitter, Snapchat, Pinterest, etc., the "privacy settings" provided by the vendor might offer a degree of personal privacy, but they do nothing to prevent the vendor from knowing, storing, and profiting from everything you do online. This often includes tracking you all over the web (via cookies and local shared objects), tracking you in real life (via location information collected by a mobile app), or buying additional data about you from a data broker.
  • State - there is also less agreement about what constitutes an appropriate level of protection or freedom from state-sponsored surveillance. While people have been aware throughout history of the state's inclination to violate privacy in the name of security and law enforcement, the Snowden leaks helped create specific clarity about what this looks like in the present day.

(As an aside, the data use practices within politics should possibly be included in this list.)

Many conversations about privacy don't move past considering issues related to criminal activity or personal compromises. However, both corporate and state-level data collection and use expose us to risk. As the Ashley Madison and OPM breaches recently illustrated, corporate data collection and state data collection also pose criminal and personal risk.

For people looking to learn more about the various factors at play in larger privacy conversations, I strongly recommend Frank Pasquale's recent book, The Black Box Society. The book itself is great, and the footnotes are an incredible source of information.

Security

In very general terms, security can be interpreted to mean how data is protected from unauthorized access and use. Encryption is a part of security, but far from the only part. If a systems administrator leaves their username and password on a post-it note stuck to their monitor, that undercuts the value of encrypting the servers. Human error can result in snafus like W-2s for a popular tech startup being emailed to a scammer.

If people email passwords to one another - or store passwords online in a Google Spreadsheet - a system with fantastic technical security can be compromised by a person with limited technical ability who happens to stumble onto the passwords. Phishing and social engineering attacks exploit human judgment to sidestep technical security measures. If a CSV file of user information is transferred via SpiderOak and then copied to an unencrypted USB key, the protection provided by secure file transfer is immediately undone by storing sensitive information in plain text on a portable device that is easy to lose. In short, security is the combination of technical and human factors that, taken together, decrease the risk of unauthorized access to or use of information.

Encryption is an element of security, but not the only element. It is, however, a big part of the foundation upon which security, and our hopes for privacy, rests.

Encryption

The term "encryption" is often used in general terms, as a monolithic construct, as in: "We need to fight to protect encryption" or "Only criminals need encryption."

However, the general conversation rarely gets into the different ways that information can be encrypted. Additionally, there are differences between encrypting a device (like a hard drive), data within an app, and data in transit between an app and a server or another user.

As an example, all of the following questions look at possible uses of encryption for a standard application:

  • Does the application encrypt data at rest on the device where the data is stored?
  • If the application pushes data to a remote server for storage, is the data encrypted while in transit to and from the remote location?
  • If the data is stored at the remote location, is the data encrypted while at the remote location?
  • If the remote location uses multiple servers to support the application, is communication between these servers encrypted?
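
To make the first of these questions concrete, here is a minimal sketch of encrypting data at rest before it is written to disk, using the Fernet recipe from the Python cryptography library. The file name and record are hypothetical, and the remaining questions (data in transit, server-to-server traffic) require separate measures such as TLS that this sketch does not cover.

    from cryptography.fernet import Fernet

    # Generate a key: whoever holds this key can decrypt the data.
    # (Here we hold it ourselves; if a vendor holds it, the vendor
    # can read the data - see the next paragraph.)
    key = Fernet.generate_key()
    f = Fernet(key)

    # Encrypt a hypothetical record before writing it to disk,
    # so the data is encrypted "at rest" on this device.
    record = b"student_id,grade\n1001,A-"
    with open("records.enc", "wb") as fh:
        fh.write(f.encrypt(record))

    # Reading the record back requires the same key.
    with open("records.enc", "rb") as fh:
        assert f.decrypt(fh.read()) == record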

If the answer to any of these questions is "no," then, arguably, the data is not getting the full benefits of encryption. To further complicate matters, if a vendor encrypts data at rest, encrypts data moving between servers, and encrypts data moving between servers and applications, but can still decrypt that data, then there is no guarantee that the benefits of encryption will protect an individual user. When vendors can decrypt the data on their hardware, the data is only as secure - and the information stored only as private - as the vendor is able or willing to protect that encryption.

True end-to-end encryption (where the data is encrypted before it leaves the application, is sent via an encrypted connection, and is only decrypted at its final destination) is the ideal, but often a vendor will function as a middleman - storing and archiving the data before sending it along to its intended recipient. This is one of many reasons that the encryption debate looks different for vendors that make hardware than for vendors that build software.
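
As a rough illustration of that end-to-end ideal, the sketch below uses the PyNaCl library: the sender encrypts with the recipient's public key, so a vendor relaying the ciphertext holds nothing it can decrypt. The parties and message are hypothetical, and real end-to-end systems layer key verification and forward secrecy on top of this basic exchange.

    from nacl.public import PrivateKey, Box

    # Each party generates a keypair; private keys never leave their devices.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice encrypts a message that only Bob's private key can open.
    ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at noon")

    # A vendor relaying `ciphertext` sees only opaque bytes - it has no key.

    # Bob decrypts using his private key and Alice's public key.
    plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
    assert plaintext == b"meet at noon"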

In very general terms, hardware manufacturers fighting for encryption are protecting user data, and it's in their best interest to do so: if hardware vendors fail to protect user data, they lose user trust, and then people won't buy their products.

In equally general terms, many application vendors fighting for encryption have a more complicated position. A small number of vendors have been vocal supporters of encryption for years - these are the vendors who offer true end-to-end encryption, or who implement encryption where the user, not the vendor, retains control of the keys. However, the ongoing legal battle between Apple and the FBI over encryption has elicited broad support from within the tech community, including companies that use data to power advertising and user profiling. For companies whose business is predicated on access to and use of large datasets of sensitive user information, strong encryption is essential to their business interests.

In their external communications, these companies can get a public relations win by advancing the position that they are defending people's right to privacy. Internally, however, encryption protects the biggest assets these companies possess: the datasets they have collected, and the communications they have about their work. This is where the paradox of strong security paired with questionable privacy practices comes into play: why should encryption give large companies an additional tool to protect the means by which they compromise the privacy of individuals?

And the answer is that, without encryption available to individuals and small companies, none of us has a chance to enjoy even limited privacy. If we - people with less access to technical and financial resources than the wealthy or well connected - want to have a chance at maintaining our privacy, encryption is one of the tools we must have at our disposal. The fact that it's also useful to companies that make a living by mining our information and - arguably - violating our privacy doesn't change the reality that encryption is essential for the rest of us too.

NOTE: I'd like to thank Jeff Graham for critical feedback on drafts of this piece.