By Daniel J. Solove
“We’re building privacy into the architecture from the ground up,” various companies and government entities often say when designing products, programs, and services.
Sometimes, they’ll use a cooking analogy: “We’re baking privacy into the cupcake, and it’s going to be delicious!”
These are statements about what is now referred to as “Privacy by Design,” which involves the embedding of privacy into the architecture of various products, programs, or services.
The term “Privacy by Design” (abbreviated PbD) was coined by Ann Cavoukian, the former Information and Privacy Commissioner of Ontario, Canada, and now Executive Director of the Privacy and Big Data Institute at Ryerson University, where she has created a Privacy by Design Certification.
Nearly two decades ago, Professor Joel Reidenberg (Fordham Law School), wrote an article called Lex Informatica: The Formulation of Information Policy Rules through Technology, 76 Tex. L. Rev. 553 (1997) in which he argued that “Technological capabilities and system design choices impose rules on participants.”
The key ideas here are that technological design involves choices and that design matters.
Several others contributed to the development of Privacy by Design. It is a powerful and important notion, one that today has become commonplace in the field of information privacy. A Google search for “Privacy by Design” yields about 200,000 results.
I have a few points to add to the discussion, points that I hope will emphasize the importance of understanding Privacy by Design as well as respecting how challenging it can be.
1. Design choices are not fixed.
Products, programs, or services that implicate privacy do so because of design choices. Privacy by Design is the architecting of things with privacy in mind. Many things get built without much thought about privacy, so recognizing that design choices affect privacy is a good first step. Design choices should not be taken as a given.
2. Most design choices implicating privacy are not value neutral and should be understood in moral terms.
There is an oft-repeated maxim that “technology is value neutral.” In the abstract, this might have some validity. But particular technologies are not value neutral — moral choices are made in their design, and privacy is one such choice. When design choices about privacy are deliberately made, they have a moral valence. Even when choices implicating privacy are not deliberately made, there can be a moral valence, because the failure to consider privacy implications can be due to carelessness or thoughtlessness. Of course, there are unanticipated consequences that are not reasonably foreseeable, and we can forgive these. But in many cases, insufficient attention goes into thinking about the privacy implications of design choices, and this should be understood in moral terms.
3. Privacy by Design is only as good as the underlying conception of privacy.
Far too often, I see decisions being made about privacy without much thought about the underlying conception of privacy. I see this in judicial decisions, statutes, and self-regulatory policies.
All decisions regarding privacy depend upon a conception of privacy. If the conception of privacy is poor or incomplete, the decisions will be bad. In many instances, the conception of privacy is not explicitly stated – probably because very little thought was given to it. But there is at least an implicit conception of privacy at play in these instances. Just look at what is protected and how, as well as what is not protected, and you can start to see the contours of the conception of privacy involved.
But so many times when I hear about Privacy by Design, I hear little about the conception of privacy. I hear statements such as “we’re designing for privacy” or “we’re going to bake privacy in.” But what is the “privacy” that is being designed or baked? Without a conception of privacy, Privacy by Design is like designing a building without a blueprint or baking a cupcake without a recipe.
In many cases, the “privacy” being designed or baked in is missing some dimensions. The “privacy” the designers have in mind might be so focused on one particular dimension of privacy that it might overlook many other dimensions. Or designers think they can just toss in a few Fair Information Practice Principles and call it a privacy cake.
4. Privacy by Design is difficult.
Privacy by Design is not easy. It is more than just throwing an ingredient or two into the cake. It involves a thorough consideration of what privacy is and how it is implicated in design choices. Privacy by Design is hard because privacy is complicated. Instead of a single binary choice, privacy involves an array of different things that must be considered. So doing Privacy by Design well is a challenge . . . basically, it takes a Zen master.
Even if incomplete or flawed, good faith attempts at Privacy by Design can still be a huge step forward. But Privacy by Design should be approached with humility and with an appreciation for its profound importance, its moral dimension, and its significant difficulty.
If you are interested in more about these issues, I have written extensively about how to understand privacy.
I also have developed a Privacy by Design training course.
This post was authored by Professor Daniel J. Solove, who through TeachPrivacy develops computer-based privacy training, data security training, HIPAA training, and many other forms of awareness training on privacy and security topics. This post was originally posted on his blog at LinkedIn, where Solove is a “LinkedIn Influencer.” His blog has more than 900,000 followers.
Professor Solove is the organizer, along with Paul Schwartz of the Privacy + Security Forum (Oct. 21-23 in Washington, DC), an event that aims to bridge the silos between privacy and security.