
Privacy, Glitter and Unicorns – SecAdvent Day 2

December 2, 2020

Kim Wuyts

Privacy. Privacy by design. Privacy engineering.

[Photo: glitter particles on a floor]

Terms that, unfortunately, are still sometimes received as if I were telling fairy tales.

Who needs privacy? I have nothing to hide! 

But privacy is important, especially in the digital world. And it goes far beyond encryption and newsletter consent.

So, let me debunk some myths about privacy. With glitter and unicorns on the side.

 

It’s all about the data

Your financial information. Your health history. Your daily routines.

These are all personal data that, in the offline world, you typically only want to share with close friends or family. And even then, you will most likely not share every detail with all of them: what you eat every day, what you watch on TV, what you like and dislike, what you buy, and so on.

As an individual, you want to control who knows what about you. This often depends on context; you will, for instance, share different information with your colleagues than with your friends.

Personal data is the new glitter

The same holds in the digital world. Apparently, though, different norms are being applied. Data can be considered the new glitter. Because it is so easy to get hold of and has a certain shininess that draws people in, companies insist on gathering it – even if they do not really need it. And they often do not think through the consequences. Just as glitter is nearly impossible to get rid of, the collection and processing of personal data is hard to undo, even when that processing was never permitted in the first place.

Personal data should only be collected, processed, or shared when there are legitimate grounds to do so, or when the individuals involved provide their explicit and informed consent.

Privacy is not a synonym for security

We all want certain information not to be (easily) traced back to us. For example, we want to be able to deny having performed certain actions. That means there should be no (direct) link between the data and the individual.

This link to individuals is, however, established not only through obvious identifiers, such as a social security number or the combination of full name and address or date of birth. In fact, almost any attribute can act as a quasi-identifier. Whenever a combination of attributes is sufficiently unique to single out an individual, the corresponding data can be linked back to that person.

Let me give a fairy tale example. Assume I tell you a secret about an ‘anonymous’ girl who likes apples, sleeps a lot, and has short friends. It is quite obvious that I am talking about Snow White, right?
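In code, the same singling-out looks like this. Here is a minimal Python sketch with made-up records (all attribute names and values are invented for illustration): no record contains a name, yet any combination of attributes that occurs only once still identifies someone.

    from collections import Counter

    # Toy records without any 'obvious' identifiers (all values invented).
    records = [
        {"likes": "apples",   "sleep": "a lot",   "friends": "short"},
        {"likes": "porridge", "sleep": "average", "friends": "tall"},
        {"likes": "porridge", "sleep": "average", "friends": "tall"},
        {"likes": "apples",   "sleep": "average", "friends": "tall"},
    ]
    quasi_identifiers = ("likes", "sleep", "friends")

    # Count how often each combination of quasi-identifier values occurs.
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)

    for combo, count in combos.items():
        if count == 1:
            # A combination occurring only once singles out one individual:
            # anyone with background knowledge can re-identify this record.
            print("Re-identification risk:", combo)

Swap the fairy-tale attributes for a zip code, a birth date, and a gender, and you have the classic real-world re-identification recipe.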

So, privacy is clearly not a synonym for confidentiality. Privacy is much broader than that. Even a perfectly secured system (if such a thing existed) can still violate the privacy of its users.

The core of privacy is actually captured by three main privacy protection goals: unlinkability, intervenability, and transparency. These complement security’s CIA triad (i.e., confidentiality, integrity and availability).

The Snow White example above illustrates the unlinkability goal. Linkability should be avoided, especially across domains. Unlinkability is obviously closely related to data minimization: the less data you collect and retain, the less there is to link back to an individual.

The intervenability goal captures the individuals’ right to control what happens to their data. Think for instance of the privacy settings on social media that allow users to decide who gets to see their posts. The right to be forgotten also falls under this category.

Transparency is about the individuals’ right to know what is happening to their information – why it is being collected, with which third parties it will be shared, and so on.

Privacy is not security’s enemy

Security people often consider privacy a burden. They are convinced that privacy conflicts with their security requirements.

Let’s take ‘plausible deniability’ as an example. It is a desirable property for privacy, but a threat to security, which often requires the opposite: non-repudiation. This can indeed seem like a conflict, but it need not be. Even in a system where both plausible deniability and non-repudiation are needed, these requirements apply to different parts or functionalities of the system.

An online voting system, for instance, requires plausible deniability about the content of the actual vote (e.g. it should not be possible to prove for whom someone voted). At the same time, there can be a non-repudiation requirement about the act of voting (e.g. proving someone has voted). Plausible deniability and non-repudiation can thus both be achieved simultaneously without conflicting.
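As a toy illustration – emphatically not how real voting systems are built, which rely on dedicated cryptography such as blind signatures and mix networks – the two requirements can live in separate parts of the code. All names below are hypothetical:

    import hmac, hashlib, secrets

    # Hypothetical authority key; in practice this would be properly managed.
    AUTHORITY_KEY = secrets.token_bytes(32)

    def participation_receipt(voter_id):
        # Non-repudiation of the *act* of voting: the authority can later
        # prove that this voter cast a ballot.
        return hmac.new(AUTHORITY_KEY, voter_id.encode(), hashlib.sha256).hexdigest()

    def cast_ballot(choice):
        # Plausible deniability of the *content*: the ballot is stored under
        # a random token that carries no link to the voter's identity.
        return secrets.token_hex(16), choice

    receipt = participation_receipt("alice")    # proves Alice voted
    token, ballot = cast_ballot("candidate A")  # not traceable back to Alice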

Similarly, a privacy requirement for anonymous use of a system and a security requirement for authentication do not need to conflict. Several privacy-enhancing mechanisms, such as anonymous and pseudonymous credentials, allow users to authenticate without revealing their identity.
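Real anonymous-credential schemes rely on zero-knowledge proofs and are well beyond a blog snippet, but a much simpler HMAC-based pseudonym sketch shows the underlying idea: a service can recognize a returning user without ever learning who they are, and pseudonyms at different services cannot be linked. Everything below is invented for illustration:

    import hmac, hashlib

    # Hypothetical secret that stays on the user's side only.
    USER_SECRET = b"user-held-secret"

    def pseudonym(service):
        # Stable per-service identifier derived from the user's secret.
        # The same user gets the same pseudonym at the same service, but
        # pseudonyms at different services cannot be linked to each other.
        return hmac.new(USER_SECRET, service.encode(), hashlib.sha256).hexdigest()[:16]

    print(pseudonym("forum.example"))  # consistent on every visit
    print(pseudonym("shop.example"))   # different, unlinkable value elsewhere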


Privacy should not be an afterthought

The key to avoiding conflicts between privacy and security is to integrate and align both early in the development cycle. Obviously, if you treat privacy as an add-on afterwards, you will be in for plenty of overhead and conflict. This can easily be avoided.

How can you integrate privacy into your development practice? Preferably early, frequently, and in a structured and systematic way. Threat modeling is a well-known practice for improving the security and privacy of a software system.

Threat modeling to the rescue

As defined in the Threat Modeling Manifesto, threat modeling means analyzing representations of a system to highlight concerns about security and privacy characteristics. You think about what can go wrong, in a systematic way, by analyzing a model representation of the system. This way, you identify and fix security and privacy issues before they actually happen.
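As a minimal sketch of what ‘systematic’ can mean in practice: represent the system as a list of elements and, for each one, walk through a fixed set of threat categories. The model elements below are invented; the categories are LINDDUN’s privacy threat types.

    # Invented model elements; the categories are LINDDUN's threat types.
    LINDDUN = ["Linkability", "Identifiability", "Non-repudiation",
               "Detectability", "Disclosure of information",
               "Unawareness", "Non-compliance"]

    system_model = [
        "user -> app login (data flow)",
        "user profile database (data store)",
        "third-party analytics (external entity)",
    ]

    # In a real analysis you would consult the LINDDUN threat trees or the
    # LINDDUN GO cards for each combination; this sketch only enumerates the
    # questions a systematic pass forces you to ask.
    for element in system_model:
        for category in LINDDUN:
            print(f"Consider {category} at: {element}")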

The outcome of the threat model informs decisions for all subsequent design, development, testing and post-deployment phases. It therefore has great value and impact throughout the system’s development.

So, which threat modeling method should you use? Is there a mythical tool or technique that will effortlessly identify and mitigate all privacy and security issues?

Wouldn’t such a fairy-tale solution be great! Imagine a magical unicorn that comes flying in and straightens out all the glitter. I hate to kill the fantasy, but unicorns don’t exist, and neither do one-size-fits-all solutions. The method or tool that is perfect for one person (or company) might not work for you. It all depends on your requirements, your preferences, and your level of expertise.

Start with the basics and get your threat modeling foundation straight. Use the Threat Modeling Manifesto as a guide to identify and refine the methodology that fits your needs. As one of the manifesto’s patterns states, use successfully field-tested techniques that are aligned to local needs and informed by the latest thinking on the benefits and limits of those techniques.

If you are looking for a pointer to get started with privacy threat modeling, I can recommend LINDDUN and its lightweight variant, LINDDUN GO (but of course, I am biased).

Privacy, glitter & unicorns – a summary

  • Handle personal data with care, just as you would glitter.
  • Take privacy into account early, frequently, and in a structured way. Threat modeling can be instrumental.
  • Don’t expect a magical unicorn to straighten out all the glitter. There is no magical tool or technique that fits everyone’s needs. Apply the key threat modeling values and principles, and use them to find and refine the method that fits your needs.

 


“Photo of Glitter Particles” by Inkwana. Available at https://commons.wikimedia.org/wiki/File:Glitter_close_up.jpg


About Kim Wuyts

Kim Wuyts is an academic privacy researcher at imec-DistriNet, KU Leuven in Belgium. She is one of the driving forces behind the LINDDUN threat modeling framework.

Tags:

Data Protection
Privacy
SecAdvent
Security
Threat Modeling